
Why - "Is 'x' amount of VRAM enough" seems like a small problem in the long term for most gamers

I'd be more concerned with a graphics card's total memory bandwidth, as this will affect every game you play to some extent (regardless of VRAM usage).
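If you want to compare cards on this, the usual back-of-the-envelope formula is bus width (in bits) divided by 8, multiplied by the effective data rate. A quick sketch below; the bus widths and data rates are the commonly quoted specs for these cards, so treat them as assumptions rather than gospel.

```python
# Rough sketch: theoretical memory bandwidth = (bus width in bits / 8) * effective data rate (Gbps).
# The bus widths and data rates below are the commonly quoted specs (assumed, not measured).
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

cards = {
    "RTX 3070 (256-bit GDDR6 @ 14 Gbps)": (256, 14.0),
    "RTX 3070 Ti (256-bit GDDR6X @ 19 Gbps)": (256, 19.0),
    "RTX 3080 (320-bit GDDR6X @ 19 Gbps)": (320, 19.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: ~{bandwidth_gb_s(bus, rate):.0f} GB/s")  # ~448, ~608, ~760 GB/s respectively
```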

The reason I think it's not the most relevant question to ask regarding performance (especially if you managed to buy a Founders Edition / reference graphics card) is that, if you bought a graphics card with 8GB of VRAM in 2021, you should be able to upgrade next year (or in 2023).

Sure, you can buy a high end Nvidia graphics card with 10-12GB of VRAM this year, but you will pay through the nose if you do.

Many of these cards should still be quite valuable next year, perhaps even selling for the same as / more than you bought them for. That's because the prices aren't just driven by Ethereum GPU mining (a craze that is now on the decline), but also by the massive shortfall in semiconductor and VRAM manufacturing capacity this year, which isn't likely to disappear any time soon.

You will likely be able to upgrade to a graphics card with 12/16GB next year or in 2023: sell your current-gen GPU and move up at little to no extra cost. This is one good thing AMD has accomplished: the whole RX 6000 series has 12/16GB of VRAM, at least 4GB more than the RX 5000 series, which puts a lot of pressure on Nvidia to include more VRAM on future graphics cards (which will likely all use GDDR6X, except at the low end).
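To put the "little to no extra cost" idea into numbers, here's a toy calculation; every price in it is a made-up placeholder, not a prediction.

```python
# Toy net-cost calculation for the "sell and upgrade" approach.
# All figures are hypothetical placeholders in GBP, not price predictions.
purchase_price = 470   # what you paid for an 8GB card at reference/FE pricing (example)
resale_price = 450     # what it might fetch second-hand next year (example)
next_gen_price = 520   # a hypothetical next-gen card with more VRAM (example)

net_upgrade_cost = next_gen_price - resale_price
print(f"Net cost of the upgrade: £{net_upgrade_cost}")
print(f"Total spent across both cards: £{purchase_price + net_upgrade_cost}")
```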

Also, on most 8GB cards like the RTX 3060 Ti/3070, you don't tend to get VRAM-related performance issues at 1440p. At 4K you can also use the various DLSS modes (and soon FSR too), which will reduce VRAM usage as well.
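For reference, DLSS and FSR render internally at a lower resolution and upscale the result, which is where the saving on render targets comes from. A rough sketch of the internal resolutions at 4K; the per-axis scale factors are the commonly documented DLSS ones, so treat them as approximate.

```python
# Internal render resolution per upscaling mode at a 3840x2160 output.
# Per-axis scale factors are the commonly documented DLSS ones (approximate).
modes = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

out_w, out_h = 3840, 2160
for mode, scale in modes.items():
    print(f"{mode}: renders at {round(out_w * scale)}x{round(out_h * scale)}")
```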
 
It's a non-issue created by people who can't get hold of a certain card and end up trying to convince themselves that they don't want it anyway.

I don't want a card (or any PC component) that can't be produced at scale and that hardly anyone can actually buy at the reference model price. It's a very good graphics card, but I can get good enough performance from an RTX 3070 for now.
 
TL;DR

What I was getting at is that people don't need to worry much about buying a graphics card with 8GB of VRAM this year; you can just upgrade next year or in 2023, likely at little to no extra cost (the caveat here is the price you bought your graphics card for, which this year has been almost entirely decided by whether you got an AIB or reference model).

Maybe this is slightly optimistic, and I know not everyone likes to upgrade their GPU every year or so. Before getting an RTX 3070 FE, I was using an R9 390 bought in 2015. So, I waited 5-6 years and got a card with the same VRAM capacity lol. Would I have bought an RTX 3070 with 12/16GB of VRAM for an extra £50? Probably, but not for much more than that.

But the resale price of graphics cards is very high, so why not just upgrade every year? Generally, the more frequently you do this, the higher the resale price. The rules for buying reference / FE models seem to be one per generation per household at the moment.

Upgrading a graphics card is a piece of p*ss (if you have a half decent power supply). I think production capacity will improve a bit next year too; it will be interesting to see how the switch to 6/5nm GPU dies affects this.
 
4K texture packs like the one in WD: Legion can cause a lot of problems (gameplay freezing for 20-30 seconds) when combined with a 4K display resolution on a graphics card with 8GB or less of VRAM. The textures are a little sharper with the texture resolution on Ultra, but the settings below that look the same, and VRAM usage is much higher with the pack enabled (8-9GB with it enabled on High texture resolution, versus about 5GB with it disabled, still on High). Personally, I don't mind turning it off; it only makes a small visual difference in gameplay (depending on how much of the built-in sharpening filter is applied).

The game itself incorrectly estimates how much VRAM is used either way.

So, it's probably a good rule of thumb not to use Ultra/4K texture packs with 8GB of VRAM.
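Since the in-game estimate can't be trusted, it's worth checking actual usage yourself while testing. A minimal way to log it on an Nvidia card, assuming nvidia-smi is installed and on your PATH:

```python
# Minimal VRAM usage logger via nvidia-smi (assumes an Nvidia card with nvidia-smi on PATH).
# Run it alongside the game and watch whether usage approaches the card's 8GB limit.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().splitlines()[0].strip())

while True:
    print(f"VRAM used: {vram_used_mib()} MiB")
    time.sleep(5)
```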

In 2022/2023, Unreal Engine 5 (and likely other engines too) will be using photogrammetry to capture detail from photographs of real objects; I think this will result in much higher texture detail and correspondingly higher VRAM usage on higher settings. The new consoles will need that extra VRAM.
 
Yeah but again their very own engine tech demo/videos literally cover how worlds are zoned into regions and, as you pass in/out of those regions, it calculates what you can see and essentially adjusts what is in memory. The trick here is that very high quality models and textures are only needed right up close to your point of view; as objects appear more distant you simply cannot resolve all that detail, so you can swap it out for lower quality variants of the textures. Mixed together with LOD systems and the zoning of worlds into sections, along with some clever predictive code, you only hold in memory the highest quality assets for when they can be appreciated, which is right up close to your viewport.

I think that's all quite theoretical and depends on game developers knowing how to get the most out of the UE5 engine (e.g. for Level of Detail) and on the type of game they are making, e.g. strategy, FPS/third person, open world sandbox etc. I think for other game engines (proprietary or free to use), many developers will want to use similar photogrammetry + 3D imaging techniques, but I doubt they will be well optimized for texture streaming, which often seems to suffer in open world games like GTA V and Assassin's Creed titles.
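As a very rough illustration of the general idea (not how UE5 or any specific engine actually implements it), distance-based selection of texture quality might look something like the toy sketch below; the thresholds and names are made up for illustration.

```python
# Toy sketch of distance-based texture quality (mip level) selection, loosely
# illustrating the streaming idea discussed above. Real engines are far more
# sophisticated; all thresholds and names here are invented for illustration.
import math

MIP_DISTANCE_THRESHOLDS = [
    (10.0, 0),   # within 10 m of the camera: keep the full-resolution mip resident
    (30.0, 1),   # within 30 m: half resolution is enough
    (80.0, 2),   # within 80 m: quarter resolution
]
FALLBACK_MIP = 3  # anything further away only needs a low-res mip in VRAM

def pick_mip(camera_pos, object_pos) -> int:
    distance = math.dist(camera_pos, object_pos)
    for max_dist, mip in MIP_DISTANCE_THRESHOLDS:
        if distance <= max_dist:
            return mip
    return FALLBACK_MIP

# Example: an object 25 m away only needs the half-resolution mip resident in VRAM.
print(pick_mip((0, 0, 0), (0, 25, 0)))  # -> 1
```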

Of course, there are some science fiction games like StarCraft II where photogrammetry isn't particularly relevant, except perhaps for creating more realistic terrain.

Another poorly optimized game (with slow texture streaming and low quality textures at long distances) was Kingdom Come: Deliverance, which used CryEngine. I think the developers picked this engine because they liked the realistic foliage and were already familiar with it; unfortunately, they want to use the same engine again for the sequel, despite having had a lot of issues working with it and no doubt having to make compromises. There was a strong focus on realism and on modelling real locations accurately, which makes me wonder if the team will be using photogrammetry in the sequel.
 
This is kind of interesting: in WD: Legion, the average framerate of the RTX 3070 Ti (8GB GDDR6X) is only about 1-2 FPS higher than the RTX 3070 (8GB GDDR6) at 4K resolution. I suppose you need the extra compute units and ray tracing hardware (that the RTX 3080/3090 have) to significantly improve FPS in games like this. Link here:
https://tpucdn.com/review/nvidia-ge...dition/images/watch-dogs-legion-3840-2160.png
 
Big performance difference with a modified RTX 3070 with 16GB of VRAM, in Watch Dogs Legion:

0.1% low framerate - 24 FPS with 8GB of VRAM.
0.1% low framerate - 45 FPS with 16GB of VRAM.

Also, much reduced freezing and stuttering. Info found here:
https://videocardz.com/newz/nvidia-geforce-rtx-3070-with-16gb-memory-tested

The game uses around 8332MB on High or Ultra settings, a few hundred MB more than the unmodified RTX 3070 actually has available.

If you have a graphics card with 8GB or less of VRAM, you can avoid these problems at 4K by setting the texture resolution to 'Medium'.
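For anyone unsure what those 0.1% low figures actually measure: they're typically derived from the slowest frames in a captured run, roughly as in the sketch below (benchmarking tools have their own exact methods, so this is only an approximation of the idea).

```python
# Simplified sketch of deriving 1% / 0.1% low FPS from captured frame times (ms).
# Real benchmarking tools differ in the exact method; this just averages the
# slowest 1% / 0.1% of frames and converts that to FPS.
def percentile_low_fps(frame_times_ms, fraction):
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * fraction))         # e.g. the worst 1% of frames
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                   # convert milliseconds to FPS

# Mostly ~60 FPS frames, with a handful of big spikes (made-up example data).
frame_times = [16.7] * 990 + [30.0] * 9 + [50.0]
print(f"1% low:   {percentile_low_fps(frame_times, 0.01):.1f} FPS")
print(f"0.1% low: {percentile_low_fps(frame_times, 0.001):.1f} FPS")
```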
 
@Grim5 - Obviously not true. Performance at 4K depends on the game. Most DLSS-capable games play very well (60+ FPS) at 4K in DLSS Performance mode, including WD: Legion (with textures on Medium).

The main problem seems to be that not all game developers bother to optimise VRAM consumption in their games. This could be better handled with more texture streaming settings and engine optimisation. For example, Ubisoft appears to have 'optimised' their games at 4K for the Series X and PS5, which have 10-16GB of VRAM.

How many can actually get their hands on a GPU with 10GB or more of VRAM these days? The main candidates are the RTX 3080, RX 6800 and RX 6800 XT (you can't buy an RTX 2080 Ti any more). And how many can afford these GPUs, likely priced at over £1,000 until next year?

The only graphics cards with more than 8GB of VRAM that can be had for less than £1,000 this year are the RX 6700 XT (12GB of VRAM) and the RTX 3060 (also 12GB). Both of these can be bought for around £600-£700 new (or used), which isn't great considering both GPUs are less powerful than the RTX 2080 Ti with its 11GB of VRAM.
 
The main thing is you are never going to notice that 0.1% dip until you turn on an FPS monitor / graph.

I agree completely (I can barely perceive 0.1% low dips). But, in the case of Watch Dogs: Legion, the 1% lows were 35 FPS with an RTX 3070 with 8GB of VRAM and 50 FPS with the modified 16GB version. There are also problems with freezing and stuttering, which is the bigger issue as it completely interrupts gameplay.

With only a little more optimization, they could reduce the VRAM required for this game. In 2021, very few people have a GPU with more than 8GB of VRAM, so perhaps the developers will do just that.
 