
Why - "Is 'x' amount of VRAM enough" seems like a small problem in the long term for most gamers

Big performance difference with an RTX 3070 modified to 16GB of VRAM, in Watch Dogs Legion:

0.1% low framerate - 24 FPS with 8GB VRAM.
0.1% low framerate - 45 FPS with 16GB VRAM.

Also, much reduced freezing and stuttering. Info found here:
https://videocardz.com/newz/nvidia-geforce-rtx-3070-with-16gb-memory-tested

The game uses around 8332MB on High or Ultra settings, a few hundred MB more than the unmodified RTX 3070 actually has available (8GB is 8192MB, less whatever the OS reserves).
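
A rough sketch of that arithmetic (the ~300MB OS reservation below is an assumption for illustration, not a figure from the test):

```python
# Back-of-the-envelope check of the overspill described above. The 8332 MB
# figure comes from the linked test; the ~300 MB OS/desktop reservation is
# an assumption and varies by system.
total_vram_mb = 8 * 1024      # unmodified RTX 3070: 8192 MB
os_reserved_mb = 300          # assumed reservation for desktop composition
usable_mb = total_vram_mb - os_reserved_mb

game_usage_mb = 8332          # reported at High/Ultra in WD: Legion
overspill_mb = game_usage_mb - usable_mb
print(f"~{overspill_mb} MB over budget, spilling into system RAM")  # ~440 MB
```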

You can avoid these problems by setting the texture resolution to 'Medium' at 4K, if you have a graphics card with 8GB of VRAM or less.
 
@Grim5 - Obviously not true. Performance at 4K depends on the game. Most DLSS-capable games play very well (60+ FPS) at 4K in DLSS Performance mode, including WD: Legion (with textures on Medium).

The main problem seems to be that not all game developers bother to optimise VRAM consumption in their games. This could be better handled with more texture streaming settings and engine optimisation. For example, Ubisoft appears to have 'optimised' their games at 4K for the Series X and PS5, which have between 10 and 16GB of memory available for graphics.

How many can actually get their hands on a GPU with 10GB or more of VRAM these days? The main candidates are the RTX 3080, RX 6800 and RX 6800 XT (you can't buy an RTX 2080 Ti new anymore). And how many can afford these GPUs, likely priced at over £1,000 until next year?

The only graphics cards with more than 8GB of VRAM that can be had for less than £1,000 this year are the RX 6700 XT (12GB) and the RTX 3060 (also 12GB). Both can be bought for around £600-£700 new (or used), which is not great considering both are less powerful than the RTX 2080 Ti with its 11GB of VRAM.
 
All graphics cards are created equally, at least that's what I'm led to believe: the ones with high VRAM have a lower CUDA core count, mid-range cards get mid-range bus speeds, and the top range is high everything. Stick to your budget and see where it gets you.
 
The way I see it, if you are running at 4K or plan to in the near future, then maybe a sub-£500 MSRP card like the 3070 isn't the right call. I mean, when have cards in that price bracket ever been able to properly handle the most demanding games at 4K, regardless of VRAM? It's only a conversation because the 3070 is pretty beefy in terms of horsepower, but to me it is a good 1440p card.

And yeah, in line with what I've always said, this example is an incredibly easy problem to solve: just drop the texture setting and leave the other bells and whistles enabled. Job done.

It's a shame there aren't texture comparison screenshots between medium textures and whatever they are using for the tests.
 
Big performance difference with an RTX 3070 modified to 16GB of VRAM, in Watch Dogs Legion:

0.1% low framerate - 24 FPS with 8GB VRAM.
0.1% low framerate - 45 FPS with 16GB VRAM.

Also, much reduced freezing and stuttering. Info found here:
https://videocardz.com/newz/nvidia-geforce-rtx-3070-with-16gb-memory-tested

The game uses around 8332MB on High or Ultra settings, a few hundred MB more than the unmodified RTX 3070 actually has available (8GB is 8192MB, less whatever the OS reserves).

You can avoid these problems by setting the texture resolution to 'Medium' at 4K, if you have a graphics card with 8GB of VRAM or less.
What about running the game with DLSS? That uses a lower internal resolution and upscales, so would VRAM still be an issue in that case? Let's face it, 45 FPS isn't too great either.
 
Those kinds of 1% lows happen regardless of what you throw at the game:

https://youtu.be/qw8BU0hdlwA?t=331

Even at 1080p in DLSS Performance mode, a 3080/5800X simply can't keep the 1% lows high and the gameplay smooth:

https://youtu.be/qw8BU0hdlwA?t=648
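
For anyone unsure what these figures mean: a rough sketch of how "1% low" / "0.1% low" numbers are typically derived from a frame-time capture (tools differ in the details, and the capture below is made up for illustration):

```python
# One common variant: average the slowest N% of frames and convert to FPS.
def percent_low_fps(frame_times_ms, percent):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * percent / 100))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Hypothetical capture: mostly ~16.7 ms (60 FPS) with a few long stalls.
capture = [16.7] * 990 + [40.0] * 9 + [90.0]
print(round(percent_low_fps(capture, 1), 1))    # 1% low   -> 22.2 FPS
print(round(percent_low_fps(capture, 0.1), 1))  # 0.1% low -> 11.1 FPS
```

A single 90 ms stall out of a thousand frames is enough to tank the 0.1% figure, which is why these numbers swing so hard on stutter.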

That's not true; plenty of tests have been done showing that 8GB is insufficient for that game (max detail textures/LODs), and that has been the case since the game launched. It's well proven that the lack of VRAM hits the 3070 and the like. It's also the Nvidia drivers that put extra demand on the CPU, so in a ray-traced open-world game like WD:L, which also has extra CPU-demanding settings besides ray tracing, performance will look weak unless you have a 5800X or better plus tight memory. When you combine those two factors, well... the 3070 will look particularly weak.

https://www.pcgameshardware.de/Rayt...acing-CPU-Cost-Benchmarks-Frametimes-1371787/
 
That's not true; plenty of tests have been done showing that 8GB is insufficient for that game (max detail textures/LODs), and that has been the case since the game launched. It's well proven that the lack of VRAM hits the 3070 and the like. It's also the Nvidia drivers that put extra demand on the CPU, so in a ray-traced open-world game like WD:L, which also has extra CPU-demanding settings besides ray tracing, performance will look weak unless you have a 5800X or better plus tight memory. When you combine those two factors, well... the 3070 will look particularly weak.

https://www.pcgameshardware.de/Rayt...acing-CPU-Cost-Benchmarks-Frametimes-1371787/
Surely this is at 1080p and super-high refresh rates? If you play at 2K/4K and 60 FPS then 8GB is fine for a 3070 in most games.
 
That's not true; plenty of tests have been done showing that 8GB is insufficient for that game (max detail textures/LODs), and that has been the case since the game launched. It's well proven that the lack of VRAM hits the 3070 and the like. It's also the Nvidia drivers that put extra demand on the CPU, so in a ray-traced open-world game like WD:L, which also has extra CPU-demanding settings besides ray tracing, performance will look weak unless you have a 5800X or better plus tight memory. When you combine those two factors, well... the 3070 will look particularly weak.

https://www.pcgameshardware.de/Rayt...acing-CPU-Cost-Benchmarks-Frametimes-1371787/
You'd probably want to turn settings down anyway at 4K, since you're not getting 60 FPS.

The 3070 is OK for what it is, a mid-range card. I doubt most people who play at 4K would buy it, though, as they know they will need more raster horsepower to run the latest games above 60 FPS.

Nvidia should probably have gone with 10GB and a 320-bit bus for the 3070, as this would have meant performance closer to the 6800, but Nvidia don't like giving extra performance away if they can help it. The irony is that if it wasn't for AMD this generation, the 3080 would have got the 3070's specs, complete with 8GB of VRAM.
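
For what it's worth, capacity and bus width go hand in hand on GDDR6 cards, since each memory chip has a 32-bit interface. A rough sketch (the 14 Gbps data rate matches the 3070's actual memory; the 320-bit configuration is the hypothetical one above):

```python
# Capacity and bandwidth both fall out of the bus width on GDDR6:
# chip count = bus width / 32, capacity = chips x chip density.
def gddr6_config(bus_width_bits, chip_gb, gbps):
    chips = bus_width_bits // 32
    capacity_gb = chips * chip_gb              # GB of VRAM
    bandwidth_gbs = bus_width_bits * gbps / 8  # GB/s of bandwidth
    return capacity_gb, bandwidth_gbs

print(gddr6_config(256, 1, 14))  # RTX 3070 as shipped: (8, 448.0)
print(gddr6_config(320, 1, 14))  # hypothetical 10GB variant: (10, 560.0)
```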

Nvidia will need to up their game come the 4000 series, as I think all the cards will need more VRAM, and with AMD looking to have made some sizeable performance gains with their first MCM design, Nvidia will have to deliver strong performance improvements to keep pace.
 
Surely this is at 1080p and super-high refresh rates? If you play at 2K/4K and 60 FPS then 8GB is fine for a 3070 in most games.
VRAM usage does not increase with refresh rate (FPS), so that part is irrelevant.

You'd probably want to turn settings down anyway at 4K, since you're not getting 60 FPS.

It has nothing to do with 4K; in fact, VRAM demand doesn't jump significantly as you increase resolution. It is the texture quality and streaming settings which dictate that. For example, I was messing around with WD:L earlier, and even at 360p I could see the "usage" go past 10GB easily (that figure is in-use plus allocation, but the allocation is also very important for fluidity, since it means less swapping in from slow storage). Same for The Division 2 (https://i.imgur.com/Mvg42Ul.jpg), Cyberpunk 2077, etc. It's the engine and how all of that is set up that determines VRAM usage most of all, not the resolution. Granted, with UE5 and the like it seems that VRAM requirements are going down, at least for that demo, but who knows how it plays out in actual games. Guess we'll have to wait and see, but in the games that do exist, 8GB is very weak if you care about texture quality and streaming distance/models.
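
Some rough numbers behind that (RGBA8 render targets assumed; real engines use several targets and fancier formats, but the orders of magnitude hold):

```python
# Render targets scale with pixel count; texture pools don't.
def render_target_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 1024**2

for w, h, label in [(640, 360, "360p"), (1920, 1080, "1080p"), (3840, 2160, "4K")]:
    print(f"{label}: ~{render_target_mb(w, h):.0f} MB per RGBA8 target")

# Even a dozen 4K targets is only a few hundred MB; a high-detail texture
# pool in an open-world game can be many GB at any output resolution.
```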

Found a new cool article on vram tests for 3080 vs 3080 Ti vs 3090:
https://www.pcgameshardware.de/Nvid...peichertest-Benchmarks-Reichen-10-GB-1373623/
 
The problem the OP is missing is that a VRAM limit does not hit every game at once, and the majority buy a GPU to last more than one or two years. VRAM issues are a gradual thing: you initially find them in the odd game where you have to lower settings even though the GPU grunt seems to be there. It will be a rare event at first, some random modded game or niche sim, but the list of games you have issues with grows.

For example, the 2GB GTX 680 ran into VRAM limits long before the competing 3GB AMD HD 7970. A 7970 could still get playable FPS using medium/high settings at 1080p 3-5 years after release, while a GTX 680 was having to use low/medium in most newer games due to VRAM limits. That may not sound like much, but it is like running a 2060 at medium against a 2080 at high and declaring them competing GPUs. So when you hear "you run out of GPU grunt long before VRAM", it is usually from someone who upgrades every year or two.

So when some enthusiast on this forum says "x amount of VRAM is perfectly fine", it is worth asking whether they upgrade their GPU every year, and then reminding them that the majority buy a GPU to last them 3-5 years.

8GB on the RTX 3070 is going to be perfectly adequate for the majority of games and the majority of owners for a while yet, especially if it is paired with a 1080p monitor. Most will never even encounter any issues, as they will move on in a year or two at most. Yet for the majority of gamers who buy a GPU to last for years, this will become a problem before GPU grunt does.

This. I'm still seeing threads/posts along the lines of "you're idiots, you don't need more than 8 or 16GB of RAM to game", and I'm now hitting 24GB of usage on average on my 32GB system, playing *older* games. :P So yeah, you can use it, when you know how to or have special use cases. Not got to the VRAM yet, as there's nothing to buy GPU-wise, but I'm hoping to get 8 or really 12GB+ of VRAM for playing with AI models...

I'd never recommend a 3090 to someone buying a PC for internet, email and the occasional game of Rocket League, but at the same time, why poo-poo those who do game in VR etc. and need the 3090? The human race seems to fail to notice that not everyone is a clone...
 
Surely this is at 1080p and super-high refresh rates? If you play at 2K/4K and 60 FPS then 8GB is fine for a 3070 in most games.
No, it's kind of the opposite, actually. If you're playing at 4K and insisting on high textures, then a 3070 might hit a VRAM limit in some games. The 3070 is more suited to the 1080p scenario, as it is fast enough to give decent framerates at that resolution and less likely to hit a VRAM wall.
 
That's not true; plenty of tests have been done showing that 8GB is insufficient for that game (max detail textures/LODs), and that has been the case since the game launched. It's well proven that the lack of VRAM hits the 3070 and the like. It's also the Nvidia drivers that put extra demand on the CPU, so in a ray-traced open-world game like WD:L, which also has extra CPU-demanding settings besides ray tracing, performance will look weak unless you have a 5800X or better plus tight memory. When you combine those two factors, well... the 3070 will look particularly weak.

https://www.pcgameshardware.de/Rayt...acing-CPU-Cost-Benchmarks-Frametimes-1371787/


The video I gave you has a 5800X + 3080 in tow.

No need to prance around with words: a 5800X + 3080 can't provide smooth performance, with or without RTX, in that game. Argue all you want.

It was you people who said "Zen 3 is bereft of Nvidia overhead"; no need to bring that up now. Oh, I see, it's already obsolete, isn't it? You will need "muh" Zen 4. Keep shelling out.

Low 1% lows without ray tracing, on a $400 8-core/16-thread top-of-the-line CPU and a $700 (actually $1,000+) top-of-the-line GPU, at 1080p with DLSS Balanced (internal 626p):

(benchmark screenshot)
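
As an aside, the "internal 626p" figure just falls out of DLSS Balanced's render scale; a quick sketch (58% per axis is the published DLSS 2 Balanced factor):

```python
# DLSS Balanced renders at 58% of output resolution per axis, then upscales.
out_w, out_h = 1920, 1080
scale = 0.58                                   # per-axis render scale
print(int(out_w * scale), int(out_h * scale))  # -> 1113 626
```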


Oooh... but you need tight memory, huh? Even at 1080p DLSS Balanced with RTX off, on a $400 8-core/16-thread Zen 3 CPU? Nice, I guess.


Even at the lowest possible settings, you will keep getting low 1% lows. This is the nature of the Windows 10 OS and badly optimised ports; there's no other way around this.
 
The 0.1% lows mean very little performance-wise; the actual averages are about the same. This is basically what you'd expect from a game with badly optimised asset streaming: if you exceed the VRAM budget and swapping occurs too late, you get micro-stutter. This is why you tend to get that sort of thing happening in WDL when you're speeding through the city; the kind of zoning the engine uses is not aggressive enough to handle players passing through different areas quickly. Most notably, at 3:17 in the video, the benchmark (which I believe is also WDL, using the built-in benchmark) shows basically the same 0.1% lows for the 8GB and 16GB variants, and again no surprise there, because the scene is slowly panning through the same small area and is unlikely to be triggering much, if any, asset swapping.

The reason I think that's a fair standard is that all modern AAA games are doing this. It doesn't matter how much VRAM you have: they all have a library of assets which, when uncompressed into VRAM, would far exceed even a 3090's 24GB. That's why streaming assets was invented in the first place, to allow in-game memory budgets to exceed VRAM limitations. This lets the overall variety of assets increase while only some fraction of them is on screen at any one time (a toy sketch of the idea follows below). If you skip through the same video and look for instances where VRAM usage is at the ceiling on the 8GB variant but significantly beyond it on the 16GB variant, such as the snowy scene at 5:34, you can see an example where the lows are basically the same, and so are the averages. This means the assets needed for the scene exceed the stock 3070's 8GB of VRAM, but whatever dynamic texture/asset swapping is going on is being done well and doesn't affect the lows.
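
The toy sketch mentioned above: a fixed VRAM budget with least-recently-used eviction. The asset names and sizes are made up, and real engines are far more sophisticated, but a miss during a fast fly-through is exactly what shows up as a stutter:

```python
from collections import OrderedDict

class StreamingPool:
    """Toy asset-streaming pool with a fixed VRAM budget and LRU eviction."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB

    def request(self, name, size_mb):
        if name in self.resident:                # already in VRAM: cheap
            self.resident.move_to_end(name)
            return "hit"
        while sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)    # evict least-recently-used
        self.resident[name] = size_mb            # upload from disk/RAM: the stall
        return "miss (potential stutter)"

pool = StreamingPool(budget_mb=8192)                 # stock-3070-sized budget
print(pool.request("city_block_17_textures", 900))   # miss (potential stutter)
print(pool.request("city_block_17_textures", 900))   # hit
```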

It's also worth taking all this with something of a grain of salt, because the 0.1% lows are actually higher for the stock 3070 than for the 16GB variant in some instances in the same video, such as at 6:45.

VRAM usage does not increase with refresh rate (FPS), so that part is irrelevant.

This is true in one sense: if you tested different GPUs, each with a different amount of horsepower, and got different frame rates at the same settings, then you'd expect VRAM usage to be approximately the same in each case. But on any given GPU, if you're targeting high frame rates by lowering your visual settings, then indirectly, through those lower settings, you get lower VRAM usage. This is why I've advocated thinking about VRAM as more a function of your GPU than a function of how asset-heavy games are. Far more VRAM today is spent on visual effects than on holding assets, and each of those visual effects costs GPU cycles. A far greater share of the VRAM usage increase in future games, compared to games today, will come from visual effects rather than assets, and as such will have a greater impact on the GPU. It's why, in almost all cases, GPU grunt runs out before VRAM does.
 
Big performance difference with an RTX 3070 modified to 16GB of VRAM, in Watch Dogs Legion:

0.1% low framerate - 24 FPS with 8GB VRAM.
0.1% low framerate - 45 FPS with 16GB VRAM.

Also, much reduced freezing and stuttering. Info found here:
https://videocardz.com/newz/nvidia-geforce-rtx-3070-with-16gb-memory-tested

The game uses around 8332MB on High or Ultra settings, a few hundred MB more than the unmodified RTX 3070 actually has available (8GB is 8192MB, less whatever the OS reserves).

You can avoid these problems by setting the texture resolution to 'Medium' at 4K, if you have a graphics card with 8GB of VRAM or less.

'Big performance difference'

....

0.1%

...

 
'Big performance difference'

....

0.1%

...

As you can see, when a 3070 or any other CPU/GPU fails to deliver high 0.1% lows, the blame is on them being mid-range or low-end hardware;

but when a 3080 + 5800X fails to deliver high 0.1% lows, the blame is on Nvidia + Ubisoft + Windows, or on RTX being heavy, or just on the Ultra settings being heavy (so they weren't heavy on the 3070, huh?).
 