
10GB vram enough for the 3080? Discuss..

Not spending money just to test whether I can fix 10GB being enough in FC6.

Think at this point you'd fling the kitchen sink at it so as not to use more than 10GB in any game; only a matter of time before more games break 10GB... :p

Fair enough, your money after all, but given all the evidence I threw your way about how the RAM usage jumped up by 4+GB after that patch which fixed the texture rendering/loading issue (as noted by several others too, and even Matt IIRC noted the same), it's pretty obvious Ubisoft are passing some of the workload onto system RAM, hence why you are encountering issues when I and others aren't. I have noticed RAM usage going up in quite a few games now, and overall the game feels smoother with 32GB over 16GB, so it wouldn't just be FC6 based on my experience...
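For anyone who wants to check the "it's the RAM" theory with numbers rather than eyeballing Afterburner, a quick Python sketch along these lines will log system RAM and total VRAM to a CSV while you play. It's only a rough illustration; the filename, the 5-second interval and GPU index 0 are my own choices rather than anything from the game, and it needs the psutil and nvidia-ml-py (pynvml) packages:

```python
# Rough sketch: log system RAM and GPU VRAM usage while a game is running.
# Nothing game-specific here - just the same totals Afterburner shows, written to a CSV.
import time
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU, e.g. the 3080

with open("memory_log.csv", "w") as log:  # hypothetical filename
    log.write("time_s,system_ram_used_gb,vram_used_gb\n")
    start = time.time()
    try:
        while True:
            ram = psutil.virtual_memory()               # whole-system RAM
            vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)  # whole-GPU VRAM
            log.write(f"{time.time() - start:.0f},"
                      f"{ram.used / 1024**3:.2f},"
                      f"{vram.used / 1024**3:.2f}\n")
            log.flush()
            time.sleep(5)  # sample every 5 seconds while you play
    except KeyboardInterrupt:
        pass

pynvml.nvmlShutdown()
```

Leave it running before and after a patch and you can see straight away whether a few GB really did move from VRAM over to system RAM.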

It was me. I wasn't restarting the game after running out of VRAM; it all works properly with a restart. Forgot all about that, good shout.

Again, not an issue I ever encountered at 3440x1440, even without FSR. Telling you, it's the RAM..... ;) :p
 
I can't remember if it was Tommy or Gerard but one of them said they were even having FPS drops and crashes when NOT using the HD texture pack and at 1440P even with FSR.......

Can't recall any crashes, though I never really played it much; enemies respawning in front of you just made it seem "meh".
 
I have yet to notice any real "pop-in" or "texture loading" issues because of VRAM. What I do notice is poor LOD draw distance in pretty much every game, especially Ubisoft open-world games. Obviously, if we could dump everything straight into VRAM, that would solve those LOD issues, but sadly not even 16GB, let alone 24GB, would be enough for this, not to mention it is an incredibly lazy and inefficient method.

This is why DirectStorage is going to be a must-have in the future. We have already seen how it benefits this kind of thing in Ratchet & Clank on the PS5, but given it is rather new and the majority of PC gamers won't be on the cutting-edge tech needed to support it, we sadly won't see it widely used anytime soon. Although I am hopeful Nvidia will sponsor some games for their version of it, i.e. RTX IO, to get the ball rolling.

https://www.nvidia.com/en-gb/geforce/news/rtx-io-gpu-accelerated-storage-technology/


:cry:

I read an article earlier saying Nvidia have found a way to get 2x extra ray tracing performance, so looking forward to the 4070/4080 :D

LOD is more about draw calls, extra physics, AI and graphics that have to be rendered, and less about VRAM. Draw calls on PC, especially under DX11 or lower, are a pain. Also, you can't have many shadow-casting lights, as that also kills performance.
 
My 1440p benchmark result with custom Ultra/Psycho settings (SS reflections and RT lighting), an undervolted/clocked 3080 and an undervolted 12900K with PL2 at 125W:

[screenshot: benchmark result]

The game is installed on a SATA drive. I think the most VRAM-demanding scene was:

[screenshot: most VRAM-demanding scene]

Sorry about the untidy Afterburner stats. I'm really impressed with Alder Lake; both CPU and GPU are air cooled and the system, which is under my desk, remained silent.
 
Here is a good video reviewing the Radeon VII 16GB, showing AMD's review guide and how to make cards with 8GB or less struggle, like the 8GB RTX 2080 used in the review. If you watch the video, AMD clearly knew which games would use more VRAM if allowed and which games would struggle on cards with 8GB or less. Anyone remember Far Cry 6 and similar shenanigans? This shows it has been going on for a good while.

The video also shows that even in 2016-2019, 8GB was really not enough for 4K in some games, and we know what will happen to 10GB, and maybe even 12GB, cards soon.

Remember, the Radeon VII came out at the start of 2019.


From 8 minutes 40 secs

Quick link to the time stamp.

https://youtu.be/7j2PFKkYrVg?t=520

From AMD's review guide for the Radeon VII:

[screenshots: pages from AMD's Radeon VII review guide]


Watch the video in full to get a good idea of what is going on and why Nvidia plays the low-VRAM game on some cards: it is basically planned obsolescence to make you upgrade sooner if you want high resolutions like 4K with higher in-game settings. You can see even in the screenshots from the review guide that some games used way more than 8GB when they came out in 2016. Rise of the Tomb Raider is the example in the screenshot, using 9.8GB at its highest settings mode; that's basically 3080 10GB levels back in 2016. Watch Rise of the Tomb Raider being tested in the video and how it makes the RTX 2080 struggle, even though the 2080 was doing better than the Radeon VII in most other games. Also in the test is a 1080 Ti with 11GB, and it was almost twice as fast (playable, no stutters) in the same test thanks to its 11GB of VRAM, compared to the 8GB RTX 2080, which was unplayable and stuttered like crazy. The RTX 2080 and 1080 Ti are basically the same performance cards, just with different amounts of VRAM installed.

Now the question is: was it a game issue, or an Nvidia driver issue at the time of testing? I wonder if anyone with Rise of the Tomb Raider, an RTX 2080 and a 4K screen can test it in 2022, with all the game updates and latest drivers, on the same part of the game where he shows the problems. That would show whether it was the game that needed patching, the Nvidia drivers at the time, or simply not enough VRAM or VRAM bandwidth, given that the Vega 64 with 8GB of HBM2 had no problems in the test.


Good stuff
 
I'd much rather have a 3080 10GB over a Radeon VII 16GB for gaming, unless I'm missing something.

That extra 6GB of VRAM won't help when the performance just isn't there, and it will be the same with the 3090 24GB once next gen is out and games + RT are pushed further.

The only difference is that those of us who bought the 3080 10GB at MSRP will have almost a grand extra in our pockets for the next upgrade compared to buyers of the 3090.
 
Good stuff

It is a "interesting" read/watch, however, people shouldn't be looking at it as a case of "it uses more VRAM if available therefore more vram is a must have"...... Instead they should be looking at as a case of "is the lack of vram causing issues?". Throughout this thread we have already had people claiming there are issues with lesser vram cards such as the 3080 just because people with high vram cards 16/24GB see more than 8/10GB vram being used and automatically assume "oh dear, said 8/10GB cards must be struggling".... yet when further looked into, turns out, just because more vram is used does not necessarily mean that it is going to cause problems, especially when we have seen in quite a few games where nvidia gpus "generally" use less vram.

HUB's recent video summed it up well IMO: "still yet to see more VRAM pay off"...


And given that no one has been able to post any clear-cut evidence showing issues with 10GB of VRAM still... that kind of says it all. In fact, I think I am the only one to have posted any kind of evidence showing what an actual VRAM issue looks like, with Cyberpunk and several 4K & 8K texture packs :cry:


The "planned obsolescence" comments are rather amusing though as per my original reasoning on that:

Of course, it is perfectly valid to say it is planned obsolescence. Heck, we could say the same for any product on the market (look at the article about Apple and their products; didn't they even admit it?). After all, you're talking about companies worth billions; they are all in it to make as much money as possible and none of them are your friend, despite what some think :o We could say exactly the same about AMD and their lack of RT performance and of a DLSS competitor (FSR is not even in the same league, and AMD themselves have said it shouldn't be compared to DLSS due to the differences in how they work). How many RDNA 2 users will be upgrading for those two things alone, assuming RDNA 3 is better? The same way Turing owners upgraded to Ampere just for the better RT performance.

EDIT:

I'd much rather have a 3080 10GB over a Radeon VII 16GB for gaming, unless I'm missing something.

That extra 6GB of VRAM won't help when the performance just isn't there, and it will be the same with the 3090 24GB once next gen is out and games + RT are pushed further.

The only difference is that those of us who bought the 3080 10GB at MSRP will have almost a grand extra in our pockets for the next upgrade compared to buyers of the 3090.

Again stop with the logic!!!! :mad:

:cry:
 
I don't know if this is expected behaviour, but I am able to repeatedly replicate a permanent FPS drop over the course of a session. When it happens, my GPU usage goes down by 2% but my CPU is not maxed out at all. Screenshots below, plus a rough way to measure it at the end of this post.

43-46 fps after an hour of gameplay

[screenshot: overlay after an hour of gameplay]


52 fps on a fresh save

[screenshot: overlay on a fresh save]
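One way to put numbers on a drop like that is to capture frametimes for the whole session (PresentMon, CapFrameX and similar tools can export them) and compare the average FPS at the start of the run with the end. A minimal sketch, assuming a PresentMon-style CSV with an MsBetweenPresents column; the filename and the 60-second window are just placeholders:

```python
# Rough sketch: spot a gradual or permanent FPS drop over a long session.
# Assumes a PresentMon-style capture CSV with an "MsBetweenPresents" column (milliseconds).
import csv

WINDOW_S = 60.0  # average FPS over 60-second chunks

times, frametimes = [], []
elapsed = 0.0
with open("capture.csv", newline="") as f:  # hypothetical capture file
    for row in csv.DictReader(f):
        ft = float(row["MsBetweenPresents"])
        elapsed += ft / 1000.0
        times.append(elapsed)
        frametimes.append(ft)

# Print the average FPS per window so the start of the session can be compared to the end.
window_start, count, accum_ms = 0.0, 0, 0.0
for t, ft in zip(times, frametimes):
    count += 1
    accum_ms += ft
    if t - window_start >= WINDOW_S:
        fps = 1000.0 * count / accum_ms
        print(f"{window_start / 60:5.1f}-{t / 60:5.1f} min: {fps:6.1f} fps avg")
        window_start, count, accum_ms = t, 0, 0.0
```

If the per-window averages step down after an hour and never recover, that's the permanent drop in black and white rather than a feeling.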
 
I will post my opinion again: I think almost the entire review industry is currently not assessing VRAM usage properly. Whether that's down to a lack of understanding, laziness about changing review procedures or not wanting to **** off Nvidia, I don't know.

DF tweeted shaming reviewers who were relying on Afterburner FPS reports to conclude FF7R had no stuttering, even though it's clearly visible (the monitoring tools just fail to pick it up). There's also the lack of analysis of texture quality, pop-in, texture streaming etc. Everyone is just obsessed with FPS measurements as if that's all that matters.

Also, how many games are being tested with proper 4K-quality textures?

Things will get even worse in the future as dynamic resolution becomes more and more of a thing, with reviewers failing to detect the resolution dropping in games as well (DF were the only ones to detect it in FF7R).

Another problem is that low-VRAM issues may not show up immediately and need prolonged gameplay. I feel like reviewers like DF, who do extensive reviews of a single game, will rise above the rest in the coming years, versus the likes of HUB who test 20+ games at a time.
 
I will post my opinion again: I think almost the entire review industry is currently not assessing VRAM usage properly. Whether that's down to a lack of understanding, laziness about changing review procedures or not wanting to **** off Nvidia, I don't know.

DF tweeted shaming reviewers who were relying on Afterburner FPS reports to conclude FF7R had no stuttering, even though it's clearly visible (the monitoring tools just fail to pick it up). There's also the lack of analysis of texture quality, pop-in, texture streaming etc. Everyone is just obsessed with FPS measurements as if that's all that matters.

Also, how many games are being tested with proper 4K-quality textures?

Things will get even worse in the future as dynamic resolution becomes more and more of a thing, with reviewers failing to detect the resolution dropping in games as well (DF were the only ones to detect it in FF7R).

Another problem is that low-VRAM issues may not show up immediately and need prolonged gameplay. I feel like reviewers like DF, who do extensive reviews of a single game, will rise above the rest in the coming years, versus the likes of HUB who test 20+ games at a time.

I partially agree that testing could be better. As we have seen time and time again, many people blame game issues on a lack of VRAM or something else, and it turns out that once the game gets patched and/or a driver update, the issue caused by "VRAM limitations" suddenly disappears...

DF are very good and probably the best (they even picked up on Deathloop's texture loading/rendering issues during fast movement and camera angle changes on lower-VRAM cards, something no one else spotted), but sadly a lot of people have written them off as "Nvidia shills" due to their love of ray tracing and DLSS. The same people have also written off HUB, GN etc. as "Nvidia shills"; I think TechPowerUp are on that list now too :p :cry:

It's pointless for sites like PCGH and TechPowerUp to just show how much VRAM is used, as per my comment above; it means nothing, as we have witnessed. Frame latency is the main thing I would like to see more of when it comes to VRAM bottlenecks, as it is always the first thing to suffer and the first indicator of a VRAM issue/bottleneck, and FPS average/min/max bar charts don't show it well.
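To illustrate the kind of frame latency analysis I mean (not how PCGH or TechPowerUp actually do their testing, just a sketch), here's a small Python example that takes a plain list of frametimes in milliseconds, one per line, and reports the 99th percentile and the number of big spikes; that's where a VRAM bottleneck shows up long before the average FPS moves. The filename and the 3x-average spike threshold are my own picks:

```python
# Rough sketch: frametime percentiles and spike counts from a capture,
# rather than plain average FPS. One frametime in milliseconds per line.

def percentile(sorted_vals, p):
    # simple nearest-rank percentile, good enough for a forum-post sketch
    idx = min(len(sorted_vals) - 1, round(p / 100.0 * (len(sorted_vals) - 1)))
    return sorted_vals[idx]

with open("frametimes.txt") as f:  # hypothetical capture file
    ft = [float(line) for line in f if line.strip()]

ft_sorted = sorted(ft)
avg = sum(ft) / len(ft)
p99 = percentile(ft_sorted, 99)              # "1% low" territory
hitches = sum(1 for x in ft if x > 3 * avg)  # frames taking 3x the average read as visible hitches

print(f"avg frametime  : {avg:.2f} ms ({1000 / avg:.1f} fps)")
print(f"99th percentile: {p99:.2f} ms ({1000 / p99:.1f} fps)")
print(f"hitches (>3x avg): {hitches} of {len(ft)} frames")
```

Two cards can post near-identical average FPS, but the one running out of VRAM will show it in the 99th percentile and the hitch count first.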
 
The fact that it's barely 10% faster than a 3080 in most situations, while costing over twice the price, is more of a shortcoming than the 3080's VRAM size.

I was just thinking that you really couldn't tell if that was a 3080 or 3090 without the stats, well at least until you get the power bill :eek:
 