Is 16GB GDDR7 enough for 4K gaming from 2025 onwards?

Soldato | Joined: 1 Sep 2003 | Posts: 6,239 | Location: Redcar
Seems to be a key point of the 5080 and below: only Nvidia's most premium graphics card (which will likely cost around £2,000) will have more this gen.

Is 16GB GDDR7 enough, do you think? Does the fact that it's the latest type of VRAM give it any advantages over GDDR6?

I want to say it's not enough, having seen Indiana Jones go past that recently, but on the other hand it would be madness for Nvidia to gimp its entire product stack apart from its premium card, which they say is for "professionals" anyway, not gamers... if you believe the marketing hype.

Any early thoughts on this?
 
OP now thinking: damn, I sold my 4090 :cry:

I think I'd rather a 5080 and £700 in my pocket if it comes to it lol.

Nvidia always seem to give people less than they want, but so far they have always been on the money in terms of VRAM allocation.

Isn't allocated-but-not-used VRAM a thing as well? Also, what kind of performance penalty do you get when it's maxed out?

You'd think Nvidia would have done loads of tests on the amount, and I doubt they'd be afraid to pass any extra cost on to the consumer if it was needed!
 
I mean, they know we're between a rock and a hard place. Which is better?

Less performance, but lots of VRAM.

or

More performance, but little VRAM.

If above-16GB VRAM is only for the 1%, surely most devs won't plan their game to go past it?
 
Not necessarily, certainly not at current pricing; you can buy a 34" ultrawide for a similar price to a regular 27" 1440p screen at times.

I'm running a Philips Evnia 34" OLED; you can pick them up for under £500, which is a bloody bargain for an OLED and quite competitive even against 27" 1440p OLED monitors.

Yeah, it depends a lot on your setup and gaming position as well. For couch gamers like me it's probably better to always get the biggest screen you can and increase the FOV where necessary.
 
Faster memory is always welcome, but if an application demands 20GB, for example, then it won't care if you have 16GB of fast memory; it'll still need 20GB.
Is there a difference between a game allocating a large amount of VRAM because you have it, rather than strictly needing it? Or am I thinking of normal RAM?
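For what it's worth, you can watch the "allocated" number yourself. A minimal sketch using the nvidia-ml-py (pynvml) bindings; note that NVML reports how much VRAM is allocated on the card, not how much a game strictly needs:

    import time
    import pynvml  # pip install nvidia-ml-py

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

    try:
        while True:
            info = pynvml.nvmlDeviceGetMemoryInfo(handle)
            # 'used' is VRAM allocated on the device, which is not the same
            # thing as VRAM the game is actively touching every frame.
            print(f"used {info.used / 2**30:.1f} GiB of {info.total / 2**30:.1f} GiB")
            time.sleep(2)
    finally:
        pynvml.nvmlShutdown()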
 
It should work in a similar way to any software program: request memory for assets, drop the allocation when done. However, you'd have to consider how those assets are used in the game world and whether it's faster to keep them in memory, because they'll be used again soon, or to drop them. This is also where developers will get lazy.

Maybe faster memory changes that but it’s not like developers will make that effort.
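Roughly this idea, as a toy Python sketch (the class, budget and asset names are all made up for illustration): keep recently used assets resident, evict the least recently used ones when the budget is hit.

    from collections import OrderedDict

    class AssetCache:
        """Toy LRU cache standing in for a game's VRAM asset budget."""

        def __init__(self, budget_bytes):
            self.budget = budget_bytes
            self.used = 0
            self.assets = OrderedDict()  # name -> size, least recently used first

        def request(self, name, size):
            if name in self.assets:
                self.assets.move_to_end(name)  # used again soon: keep it hot
                return
            # Evict least recently used assets until the new one fits.
            while self.used + size > self.budget and self.assets:
                _, evicted_size = self.assets.popitem(last=False)
                self.used -= evicted_size
            self.assets[name] = size
            self.used += size

    GIB = 2**30
    cache = AssetCache(budget_bytes=16 * GIB)
    cache.request("city_textures", 6 * GIB)
    cache.request("character_models", 4 * GIB)
    cache.request("desert_textures", 8 * GIB)  # forces city_textures out
    print(f"{cache.used / GIB:.0f} GiB resident")  # 12 GiB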

So in a way it does sound like what was debated above is correct, in that GDDR7 would let you get away with less VRAM overall because it's more proficient (faster) at pulling in and flushing out the assets it needs. Theoretically, anyway.
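The "faster" part is bandwidth, though, not capacity. A quick back-of-envelope in Python, using illustrative per-pin rates rather than confirmed 50-series specs:

    def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
        """Peak bandwidth in GB/s: per-pin rate x bus width / 8 bits per byte."""
        return data_rate_gbps * bus_width_bits / 8

    # Illustrative figures on a 256-bit bus:
    print(bandwidth_gb_s(18, 256))  # GDDR6 @ 18 Gbps -> 576.0 GB/s
    print(bandwidth_gb_s(28, 256))  # GDDR7 @ 28 Gbps -> 896.0 GB/s

So GDDR7 can refill the same 16GB noticeably faster, which helps with streaming and stutter, but a scene that needs 20GB resident still needs 20GB.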
 
Nope. RTX 3070 called and said: "Cyberpu....." damn it, ran out of VRAM again.

What resolution is that at? If 4K, I wouldn't necessarily say the 3070 is a modern 4K card, so in a way that's understandable.

That said, I'm currently playing FF7 Remake at 4K max settings on my temp 3050 just fine, so who knows.
 
No. A bunch of games now use 11-15GB of VRAM on their own, with the remainder used by the OS and background apps, which is the norm. When you enable other tech like frame gen you increase VRAM usage as well. 16GB is imo not enough for modern 4K gaming, even when upscaled.
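Rough budget maths (the per-item figures below are assumptions for illustration, not measurements):

    # All numbers in GiB; assumed figures, not measurements.
    game_usage      = 14.0   # heavy 2025 title at 4K max settings
    os_and_apps     = 1.5    # Windows plus background apps
    frame_gen_extra = 1.0    # frame generation buffers

    total = game_usage + os_and_apps + frame_gen_extra
    print(f"{total:.1f} GiB wanted vs 16 GiB on the card")  # 16.5 GiB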

If so, how could Nvidia get it so wrong? They work hand in hand with these studios on GeForce drivers etc. It can't purely be about driving people to the very top-end £2k SKU if they wish to game at 4K? You would think 24GB of GDDR6 would give them an easier time, both with fans and financially.
 
Nothing Nvidia have done in recent years points to them caring about the mid tier gaming scene really.

It wouldn't just be the mid tier though; it's almost the entire product stack, apart from one £2k SKU. I suppose in a way it does force devs to optimize for 16GB. :)
 
If you sell that 77" G4 and buy a cheaper 55" or 65" TV, then you'll have an extra grand in your pocket to upgrade any other part of your setup, for example paying the extra £1k for a 5090 over the 5080 ;):cry:

Once you go OLED you never go back, much bigger upgrade than any graphics card.

What I mean is: is getting 175fps over, say, 130fps worth over a thousand more? Both are very similar experiences from an enjoyment point of view.
 
I feel like 4K gaming has been very achievable for years, TBH. First with my 3080, when my screen maxed out at 60fps; then with my 4090, when my screen maxed out at 120fps; and soon again with my 5080/5090, with a screen max of 144fps. Everything maxed, played native, with the very odd exception (Alan Wake II and Cyberpunk, IIRC).
 