10GB VRAM enough for the 3080? Discuss...

As someone with a 3080 on pre-order for £650, I would not want to pay over £750 for a 20GB card, and we all know it will be closer to £900 imo. So many of the people who say "oh, I'll wait for the 20GB card then" don't realise the cost increase, I think.

Indeed. Mine ended up costing less than £600. The money saved, plus the money I get from selling this card, should easily pay for a 4080 Hopper, which I'd imagine will naturally have 16GB at the very least given AMD going to 16GB :D

We'll see what's what when the time comes though. I may be going back to AMD if I have my OLED monitor by then.
 

One of the problems we've had in this thread is the accusation that Nvidia are somehow greedy, or that they're skimping on vRAM, with the implication that they could in theory have produced a 20GB variant for the same cost to the consumer. That belief, mixed with bad expectations about vRAM, has caused a lot of this mess. The end result is that a 20GB variant, if such a thing comes to exist, will be a lot more expensive, probably at least £150 more on MSRP, as that's likely the cost of the raw components themselves. Once that happens we'll get to see whether people are willing to put their money where their mouth is. I did with my 3080 and bought the 10GB variant, but it remains to be seen, given all the info here and a large added premium on a 20GB variant, whether people would actually buy it. Time will tell.

No one has bothered to find out what the performance penalty is if the 10GB of VRAM is fully utilised. It could be as little as 1 FPS; you won't know until you test it.

The simple answer is: if you can still get a minimum of 60 FPS at the desired resolution, exceeding the VRAM limit is irrelevant.
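
For anyone who does want to test it, average FPS alone can hide the penalty; VRAM-related stutter shows up in the slowest frames. A minimal sketch of how one might judge that from a frame-time log (the file name is hypothetical; any frame-time capture tool such as PresentMon or CapFrameX would produce equivalent data):

```python
# Minimal sketch: judge a possible VRAM penalty from frame times, not average FPS.
# Assumes a log with one frame time in milliseconds per line; the file name
# below is hypothetical.

import statistics

def summarise(path: str) -> None:
    with open(path) as f:
        frame_ms = sorted(float(line) for line in f if line.strip())

    avg_fps = 1000.0 / statistics.mean(frame_ms)
    # "1% low": the slowest 1% of frames. VRAM thrashing shows up here
    # as spikes, even when the average barely moves.
    worst = frame_ms[int(len(frame_ms) * 0.99):] or frame_ms[-1:]
    low_fps = 1000.0 / statistics.mean(worst)

    print(f"average: {avg_fps:.1f} FPS, 1% low: {low_fps:.1f} FPS")

summarise("frametimes_3080_10gb.csv")
```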

It's difficult to test because there are really no games that genuinely exceed 10GB. You can apparently do it with 1000+ mods on Skyrim, but I was unable to replicate that claim, because even getting such a setup to work from a pure compatibility standpoint is a nightmare. And the few games that get anywhere near close are already running so slowly that it's a moot point anyway.

In the old days, running out of vRAM would cause a cache miss and force the card to fetch the asset from RAM or disk, which resulted in huge pauses/stuttering and thus a drop in average frame rate. In more recent engines that do texture streaming the impact is pretty much gone: they simply display a lower-resolution texture on the surface, so the penalty is a visual one, not a performance one. That fact, especially considered alongside the ability to push file streaming speeds to 7-8GB/sec as early as next year, means that for many gamers fetching high-res assets from disk is not going to be a problem.
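
To make that fallback concrete, here's a conceptual sketch of the idea (all names and numbers are illustrative; no real engine's API is implied): when the budget can't fit the requested mip, the engine serves a smaller one instead of stalling.

```python
# Conceptual sketch of texture-streaming fallback: if the full-resolution mip
# doesn't fit the remaining VRAM budget, serve a lower mip instead of stalling.
# Names and numbers are illustrative, not any real engine's API.

def pick_mip(requested: int, mip_sizes: list[int], budget_left: int) -> int:
    """Return the highest-resolution mip level that fits the remaining budget.

    mip_sizes[0] is the full-resolution mip; each later entry is smaller.
    """
    for mip in range(requested, len(mip_sizes)):
        if mip_sizes[mip] <= budget_left:
            return mip          # fits: quality drops a notch, frame rate doesn't
    return len(mip_sizes) - 1   # smallest mip is assumed always resident

# A texture chain, bytes per mip (roughly quartering per level):
sizes = [64 << 20, 16 << 20, 4 << 20, 1 << 20]
print(pick_mip(0, sizes, budget_left=20 << 20))  # -> 1 (one level below full res)
```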
 
Did that person you contacted get back to you about testing for the actual VRAM used instead of what was allocated?

Which one? I've reached out to a good 5-6 reviewers at this point, about 10 YouTubers and several other people. The person testing Skyrim mods wasn't someone I'd messaged; it was a person in the original ResetEra forum thread on the same topic who made the claim. But they never backed it up with details: whether that usage was real memory used, what mods they were using, etc.
 
W1zzard at TPU responded to me via email but wasn't very interested in the topic. He confirmed they use GPU-Z to test games' vRAM usage, that it reports the same thing as Afterburner's default vRAM measurement (memory allocated), and he dismissed memory used as a better/more useful metric. I didn't push the point with him. I do have the emails saved between us, but sadly I don't think we're going to get anywhere appealing to him; I don't think he has the technical understanding of how this all works. I think there are articles by him saying that Control uses like 16GB of vRAM when maxed out, which is outrageous, but... whatever. You can't make people see sense.
 

Oh well, we'll do our own testing using peer feedback; we just need to find a transparent enough software tool that indicates actual usage and push it for testing.
 

It isn't always that simple. VRAM can be allocated but only freed just-in-time and/or in a dynamically resource-managed manner based on availability, which to any external tool is confusing to say the least: you can potentially see that an asset isn't in current use, but have no idea whether it's essentially marked as disposable or being cached for anticipated use, etc.

The best way to test is to have a range of GPUs from the same brand at a similar performance level but with different VRAM amounts.
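
As a toy illustration of that ambiguity (everything below is hypothetical; no real engine or monitoring tool is being modelled), an engine can keep "freed" assets resident as an evictable cache, and an external tool that only sees the total allocation can't tell the two apart:

```python
# Sketch of the allocated-vs-used ambiguity: an engine can keep assets resident
# as a disposable cache, so a tool that only sees the total allocation can't
# separate required memory from opportunistic caching. Illustrative only.

class VramPool:
    def __init__(self) -> None:
        self.in_use: dict[str, int] = {}   # bound to the current frame (MB)
        self.cached: dict[str, int] = {}   # resident but evictable on demand

    def allocated(self) -> int:
        # This total is all a default allocation counter can report.
        return sum(self.in_use.values()) + sum(self.cached.values())

    def release(self, name: str) -> None:
        # "Freed" only logically: kept warm in case it's needed again.
        self.cached[name] = self.in_use.pop(name)

pool = VramPool()
pool.in_use = {"level_geometry": 3000, "hero_textures": 2500}
pool.release("hero_textures")
print(pool.allocated())  # still 5500 MB allocated, though only 3000 MB is in use
```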
 
I don't think he has the technical understanding of how this all works.

Oh, he most certainly does. He's the guy responsible for the original ATI 9500 non-Pro softmod drivers. He's not daft.
 
In more recent engines that do texture streaming the impact is pretty much gone... the penalty is a visual one, not a performance one.

Not a 10GB but an 8GB issue: Borderlands 3, ultrawide 1440p resolution, 10-bit and HDR on, full texture detail on, DX12 on. GPU: 2070 Super. I can constantly see vRAM usage in GPU-Z very close to 8GB, and once in a while stuttering happens where the whole game freezes for a moment as the system moves stuff to/from vRAM. It's really bad, as the freeze can take up to 10 seconds! - though thankfully it's rare. That should answer your question about what happens when you run out of vRAM: really bad things. You NEVER want that to happen, and it definitely isn't a 1 FPS difference. It's a complete freeze of the game for a few seconds in the worst case, and horrible stuttering for a while in the best case. And yes, lowering texture settings helps - the stuttering and freezes are gone, and vRAM usage drops.
 
Keep in mind that it also matters whether you have the game on an SSD, and whether the game allows you to set up higher streaming budgets. UE4 in particular is notoriously bad with streaming tech, so stutters are common across almost all UE4 games of any slightly wider scope. This can sometimes be alleviated; e.g. I can use the Unreal console unlocker to set custom streaming parameters in Jedi Fallen Order and that helps a lot, though I imagine in BL3 it might not be allowed due to its MP/online nature. It could also be a CPU issue, because the CPU gets hit hard and fast when it's asked to decompress, so if it's not a fast, high-core-count CPU you're going to get those stutters; memory matters too, of course. Again, it ties in with how poor the UE4 streaming setup is, and that's why you'll see the near-freezing stutters that can happen in those games. And I haven't even gone into DX12 and the issues that can cause! :D

So my point would be: even 16GB wouldn't necessarily solve your problem, because I have 16GB and I sometimes have similar issues if I can't push the game to use it properly (which, as you can imagine, is sadly common). We'll see how it evolves, but it's definitely not going to be a vRAM-light future. Heck, barely any games even have proper 4K textures yet, and even so that 8GB threshold is being met or breached.
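
(For reference, the streaming-parameter tweak mentioned above usually means raising UE4's texture streaming pool. A hedged example as an Engine.ini override: the cvar names are standard UE4 console variables, but the value is purely illustrative, and whether a given game honours it varies.)

```ini
; Illustrative UE4 Engine.ini override raising the texture streaming pool.
; r.Streaming.PoolSize is in MB; 3000 is an example value, not a recommendation.
[SystemSettings]
r.Streaming.PoolSize=3000
r.Streaming.LimitPoolSizeToVRAM=1
```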
 

Fast M.2 NVMe SSD (not PCIe 4.0, but maxing out PCIe 3.0), so it's not an SSD issue for sure. CPU is a Ryzen 7 3700X (well cooled) with 32GB of fast RAM, so that's not an issue either. Whatever the case, the point stands: letting games run out of vRAM is bad. I remember the same happening back in the day with 2GB and 4GB cards - the same horrible stuttering. It's nothing new; look back in time and you get all the same answers, and the new cards haven't changed anything in this regard (DirectStorage isn't alive yet). And agreed, 8GB is already tight, as I've witnessed myself; 10GB will follow shortly (I give it a year at best).

Though I don't think we'll need anything more than 11-12GB in the foreseeable future (a few years). By that time any current-gen GPU will be too slow for full details anyway and we'll upgrade. Hence I'd say Nvidia gave us a bit too little vRAM and AMD definitely too much (increasing the price). Though both had little choice, considering the bus width requires specific numbers of vRAM modules.
 
Sounds like video memory saturation. I've seen similar behaviour before in several games, although the stuttering didn't sound as severe as yours when I was testing with 8GB GPUs. I've used 8GB/16GB GPUs side by side and only the 8GB GPU would show this behaviour in certain games. The GPUs I used were a Vega 64 8GB/Vega Frontier 16GB and a 5700 XT 8GB/Radeon VII 16GB. What you described is exactly the behaviour I saw.

Would you mind running MSI Afterburner and enabling per-process video memory monitoring to see what the actual video memory usage is in MB?

My theory is that the VRAM allocation metric (touted as the holy grail) will never show more usage (in MB) than the physical video memory available on your GPU - making the metric almost redundant for determining how much video memory is required for an optimal experience.

Instructions on how to set it up are here: PSA: MSI Afterburner can now display per process VRAM! : nvidia (reddit.com)
 

This value is not set based on what is optimal, though; that's the problem. We can see in the worst-case scenarios that developers literally have a 90% rule, and that's how you end up with crazy outcomes where some games on a 3090 have ~20GB allocated when they clearly do not need that, or anything remotely close to it. And there are cases like CoD Cold War, where they literally put an option in the menu to let the user pick the percentage.

You're right that the amount allocated cannot go above what the card physically has, and if a game allocated say 10GB and used 10GB, you might suspect it could use more if more were available - a very reasonable assumption that would be worth testing. But in pretty much all of these cases, usage is less than allocated, typically by quite a wide margin. If a game isn't making good use of the memory you do have, it's a bit of a stretch to think that adding more would help.

I actually saw that with my FC5 experience: it used about 6GB at peak on my 1080, and moving to a 3080 with 2GB more didn't increase my usage at all.
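
(To illustrate the 90% rule in the crudest terms - the numbers and function below are purely hypothetical, not any engine's actual code - a budget derived from physical VRAM scales with the card, not with the scene:)

```python
# Hypothetical sketch of the "90% rule": the engine sizes its VRAM budget as a
# fixed share of whatever the card reports, independent of what the scene needs.

def vram_budget_mb(total_vram_mb: int, fraction: float = 0.9) -> int:
    # Some games (e.g. CoD Cold War) expose 'fraction' as a menu slider.
    return int(total_vram_mb * fraction)

for card_mb in (8192, 10240, 24576):   # e.g. 2070 Super, 3080, 3090
    print(f"{card_mb} MB card -> {vram_budget_mb(card_mb)} MB allocated")
# The 3090 'allocates' ~22 GB this way even if the scene only touches 8 GB.
```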

Keep in mind that it also matters whether you have the game on an SSD, and whether the game allows you to set up higher streaming budgets. UE4 in particular is notoriously bad with streaming tech, so stutters are common across almost all UE4 games of any slightly wider scope.

This is true; in fact Epic confirmed they had to rewrite the IO systems in Unreal Engine to cope with the PS5's faster storage, so sometimes it's literally just an engine constraint.
 