
Is 16GB of GDDR7 enough for 4K gaming from 2025 onwards?

This seems to be a key limitation of the 5080 and below. Only Nvidia's most premium graphics card (which will likely cost around £2,000) will have more this gen.

Is 16GB of GDDR7 enough, do you think? Does the fact that it's the latest type of VRAM give it any advantages over GDDR6?

I want to say it's not enough, having seen Indiana Jones go past that recently, but on the other hand it would be madness for Nvidia to gimp its entire product stack apart from its premium card, which they are saying is for "professionals" anyway, not gamers... if you believe the marketing hype.

Any early thoughts on this?
 
For 4K gaming in general, it'll probably be fine.

For 4K gaming in certain games with the texture quality maxed out, path tracing, DLSS and frame gen all on at the same time, probably not.

It's interesting because, like you say, unless they have some fancy new VRAM compression tech coming with the 50 series, the second-best 4K graphics card will end up being the 4090.
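
For rough context, the render targets themselves are only a small slice of the total; the twelve-target count below is purely an assumption for illustration, not any real game's renderer:

# Rough ballpark only: size of full-resolution 32-bit render targets at each resolution.
# The "twelve targets" figure is an assumption for illustration, not a real game's setup.
def target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / 1024**2

for label, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    one = target_mb(w, h)
    print(f"{label}: one target ~{one:.0f} MB, twelve targets ~{one * 12 / 1024:.2f} GB")

Even at 4K that's only a few hundred MB; it's maxed texture pools, geometry and path-tracing data sitting on top that push a card towards its VRAM limit.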
 
OP is now thinking "damn, I sold my 4090" :cry:

I think I'd rather a 5080 and £700 in my pocket if it comes to it lol.

Nvidia always seem to give people less than they want, but so far they have always been on the money in terms of VRAM allocation.

Isn't allocated-but-not-used VRAM a thing as well? Also, what kind of performance penalty is there when it's maxed out?

You'd think Nvidia would have done loads of tests on the amount, and I doubt they would be afraid to pass any extra cost on to the consumer if it was needed!
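
For what it's worth, most overlays only show what the driver has handed out at the device level, not what a game actually touches each frame. A rough sketch to watch that figure, assuming the NVIDIA Management Library Python bindings (the nvidia-ml-py package) are installed:

# Watches device-level VRAM "used", which really means allocated by the driver;
# it can't tell you how much of that a game is actively reading each frame.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
try:
    while True:
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM allocated: {info.used / 1024**3:.1f} / {info.total / 1024**3:.1f} GB")
        time.sleep(2)
finally:
    pynvml.nvmlShutdown()

As for the penalty when it genuinely runs out: once the working set spills past the card's VRAM, data gets streamed over PCIe and you typically see stutter and collapsing 1% lows rather than a neat drop in average FPS.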
 
 
Nvidia always seem to give people less than they want, but so far they have always been on the money in terms of VRAM allocation.

I had to upgrade from both of my SLI setups of GTX 580s and GTX 780 Tis due to not having enough VRAM.

Plus, wasn't it the GTX 970 that loads of people went mad about, with its slow 512MB mixed in with 3.5GB of fast VRAM?
 
I guess it depends on how long you intend to keep it. If it's for 2 years, it should be fine for the most part.

If not, then what choice do you really have anyway? A 5090? :cry:
 
I think I'd rather a 5080 and £700 in my pocket if it comes to it lol.

Nvidia always seem to give people less than they want, but so far they have always been on the money in terms of VRAM allocation.

Isn't allocated-but-not-used VRAM a thing as well? Also, what kind of performance penalty is there when it's maxed out?

You'd think Nvidia would have done loads of tests on the amount, and I doubt they would be afraid to pass any extra cost on to the consumer if it was needed!

They are passing the extra cost onto the consumer: they've been releasing cards with too little VRAM for the long term for two generations at this point, and probably three, judging by the rumours about the 5000-series specifications.

12GB is becoming borderline in some scenarios at 1080p-1440p; they know that, and they know they have the mindshare. They want people to buy a new GPU every generation, and then you combine that with the swift uptick in AAA requirements and the adoption of an often poorly implemented UE5.

Wind this back twelve or so years and I was arguing that 4GB on something like an Nvidia GTX 670 was pointless outside of niche cases (i.e. SLI and certain games), and telling people to save the cash and go for the 2GB variant instead.
 
I mean, they know we're between a rock and a hard place. Which is better?

Less performance, but lots of VRAM.

or

More performance, but little VRAM.

If more than 16GB of VRAM is only for the 1%, surely most devs won't plan on their games going past it?
 
Most major developers are dropping their own in-house engines in favour of UE5 for the sake of cutting costs, and while UE5 can do a grand job when worked on properly, it rarely is; more often than not we get the laziest implementation imaginable.

UE5 is only functional for most people due to the heavy adoption of upscaling tech such as FSR/DLSS/XeSS; without that, it's a generation or two too early for the hardware we have available, imo. Sure, we have powerhouse cards at the top end, and this thread is ultimately about 4K gaming, but using any of the upscaling options at lower resolutions often looks absolutely horrific. We actually had better visual fidelity in some cases in games dating back 10 years; the entire industry seems to be beelining towards shortcuts at the expense of the consumer.
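
To put rough numbers on it, using the commonly quoted per-axis scale factors (these vary a bit by upscaler and version, so treat it as a ballpark):

# Approximate internal render resolutions for common upscaler quality modes.
# Per-axis scale factors are the commonly quoted ones and vary by upscaler/version.
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 0.333}
outputs = {"4K": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}

for label, (w, h) in outputs.items():
    internals = ", ".join(f"{m}: {round(w * s)}x{round(h * s)}" for m, s in modes.items())
    print(f"{label} output -> {internals}")

Performance mode at a 1080p output is rendering at roughly 960x540 internally, which is why it looks so rough, while Quality mode at 4K still has a roughly 1440p-class image to work from.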
 
Swap to an ultrawide 1440p monitor; 4K is overrated. That way a 5080 will last you years.

My problem is that while I have a 1440P ultrawide monitor, I also run a 4K 65" TV for couch gaming, and I find I've been doing a lot more of that lately.

My next upgrade is going to have to be at least a 5080 or the AMD equivalent, and since I'd prefer it to be a long-term purchase, I need to decide between the longevity offered by DLSS (which frankly looks a hell of a lot better than FSR to me) and possibly running out of VRAM early.

A lot of people are between a rock and a hard place.
 
I actually think ultrawide screens are a bit of a scam: pay more for less!

Not necessarily, certainly not at current pricing; at times you can buy a 34" ultrawide for a similar price to a regular 27" 1440p screen.

I'm running a Philips Evnia 34" OLED; you can pick them up for under £500, which is a bloody bargain for an OLED and quite competitive even against 27" 1440p OLED monitors.
 
Not necessarily, certainly not at current pricing; at times you can buy a 34" ultrawide for a similar price to a regular 27" 1440p screen.

I'm running a Philips Evnia 34" OLED; you can pick them up for under £500, which is a bloody bargain for an OLED and quite competitive even against 27" 1440p OLED monitors.

Yeah, it also depends a lot on your setup and gaming position. For couch gamers like me it's probably better to always get the biggest screen you can and increase the FOV where necessary.
 
Yeah, it also depends a lot on your setup and gaming position. For couch gamers like me it's probably better to always get the biggest screen you can and increase the FOV where necessary.

I don't disagree; being a bit of a hybrid in that regard certainly complicates matters, but I do love having the two screens to suit my preferences for certain games.

The ultrawide is for anything I prefer a mouse and keyboard/HOTAS etc. for (I need a desk setup for work purposes anyway), and the TV is for games I'm happy to play on a pad, plus the comfy factor, so RPGs/adventure/platforming-type releases etc.
 
I actually think ultrawide screens are a bit of a scam: pay more for less!

I'm driving a 77" G4 OLED and love playing from the couch in a home cinema. I'm a lazy gamer lol.

You can get a 100Hz+ 34" ultrawide for as little as £200; I wouldn't call that a scam in any way, shape or form. You can pay ludicrous amounts for most types of screen.

Ultrawide adds a lot to a game's immersion.
 
It'll probably be fine. It's not like games are optimised at launch, and they often end up using less VRAM after a couple of patches; maybe the memory speed will help too.

We need to see what features, if any, they lock to the 5000 series. Probably another AI thing to compensate for rubbish game development.
 
I read somewhere that the bandwidth of GDDR7 is massively higher than that of the memory used in the 40 series, so you can't make a direct comparison GB for GB.

… not sure how true that is!
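
It's easy enough to put rough numbers on, assuming the rumoured ~30Gbps GDDR7 on a 256-bit bus for the 5080 against the 4080's 22.4Gbps GDDR6X on the same bus width:

# Peak memory bandwidth in GB/s = per-pin rate (Gbps) x bus width (bits) / 8.
# The 5080 numbers are rumoured/assumed here, not confirmed specs.
def bandwidth_gb_s(pin_rate_gbps: float, bus_width_bits: int) -> float:
    return pin_rate_gbps * bus_width_bits / 8

rtx_4080 = bandwidth_gb_s(22.4, 256)  # GDDR6X -> 716.8 GB/s
rtx_5080 = bandwidth_gb_s(30.0, 256)  # rumoured GDDR7 -> 960.0 GB/s
print(f"4080: {rtx_4080:.1f} GB/s, 5080 (rumoured): {rtx_5080:.1f} GB/s, "
      f"uplift ~{(rtx_5080 / rtx_4080 - 1) * 100:.0f}%")

So GB for GB the new memory can be fed a fair bit faster, but extra bandwidth doesn't help once the capacity itself runs out; it only changes how quickly the card can shuffle what does fit.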
 