Is 16GB of GDDR7 enough for 4K gaming in 2025 onwards?

Not necessarily, and certainly not at current pricing: you can buy a 34" ultrawide for a similar price to a regular 27" 1440p screen at times.

I'm running a Philips Evnia 34" OLED; you can pick them up for under £500, which is a bloody bargain for an OLED and quite competitive even against 27"/1440p OLED monitors.


Problem is it's less competitive, because the ultrawides on the market today have far lower refresh rates than 1440p screens.

Most ultrawide OLEDs, for example, are 160-175Hz and only one or two models are 240Hz, whereas for 1440p there are many 360Hz and 480Hz models, and in two weeks there will be 520Hz models too.
 
Problem is it's less competitive, because the ultrawides on the market today have far lower refresh rates than 1440p screens.

Most ultrawide OLEDs, for example, are 160-175Hz and only one or two models are 240Hz, whereas for 1440p there are many 360Hz and 480Hz models, and in two weeks there will be 520Hz models too.

Unless you are a legit pr0 g4m3r and still have the reaction time and eyesight sharpness of a spotty teenager, you are barely going to notice the difference going from 240 to 360 to 480 or 520 in actual game play.

Marginal gains at best.
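To put rough numbers on that: above 240Hz, each jump only shaves fractions of a millisecond off every frame. A quick sketch (my own arithmetic, just 1000ms divided by the refresh rate, and it assumes the GPU actually sustains those frame rates):

```python
# Frame time at each refresh rate, and how much each step actually saves.
# Illustrative only: assumes the GPU sustains these frame rates in the first place.
rates_hz = [60, 120, 175, 240, 360, 480, 520]

prev_ms = None
for hz in rates_hz:
    frame_ms = 1000 / hz
    if prev_ms is not None:
        print(f"{hz:4d} Hz -> {frame_ms:5.2f} ms/frame ({prev_ms - frame_ms:.2f} ms saved vs previous step)")
    else:
        print(f"{hz:4d} Hz -> {frame_ms:5.2f} ms/frame")
    prev_ms = frame_ms

# 60 -> 120 Hz saves ~8.3 ms per frame; 480 -> 520 Hz saves ~0.16 ms.
```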
 
Unless you are a legit pr0 g4m3r and still have the reaction time and eyesight sharpness of a spotty teenager, you are barely going to notice the difference going from 240 to 360 to 480 or 520 in actual game play.

Marginal gains at best.
I tend to agree with this, largely because there's little value in playing a AAA title at higher frame rates, maybe 120 max, whereas for competitive games you'll want higher frame rates but also wouldn't want anything bigger than 27", or an ultrawide.
 
It's not about marketing, it's just what 4K is at the moment. The 4090 barely handles games like Cyberpunk or decent UE5 games, with everything enabled, without upscaling or frame generation - UE5 already has new features in the pipeline that will likely push hardware even harder.

I think games like Alan Wake and Cyberpunk are outliers that have some high settings just for fun. My 4090 destroys nearly everything I've thrown at it in 4K apart from those two.

At some point it's on developers too to make sure their games scale to the best hardware accordingly.
 
I wouldn't buy any GPU with less than 20GB of VRAM today, especially at 5080 levels of pricing.

Nvidia have always been as tight as a fairy's chuff when it comes to VRAM.
 
I wouldn't buy any GPU with less than 20GB of VRAM today, especially at 5080 levels of pricing.

Nvidia have always been as tight as a fairy's chuff when it comes to VRAM.
It's a good stance in theory. The problem is that it puts you in a position of either accepting significantly less performance or spending thousands.
 
For 4k gaming in general, it'll probably be fine.

For 4K gaming in certain games with the texture quality maxed out, path tracing, DLSS and framegen on all at the same time, probably not.

It's interesting because, like you say, unless they have some fancy new VRAM compression tech coming with the 50 series, the second-best 4K graphics card will end up being the 4090.
People have odd ideas about RAM compression - the compression people like to talk about happens in transfer between RAM and VRAM. Ergo, forget about it, it's not relevant to VRAM capacity at all.

Data stored in VRAM is not compressed much anymore; it has to be ready for the GPU to read at any time, often dozens of times a second, and there's no time to decompress it live every single time, or your FPS would suffer horribly. Now, textures are usually already compressed, and sadly it's a lossy compression and has been for many years now (it hasn't really changed) - this is why NVIDIA and AMD are talking about fancy new AI texture compression, which would improve quality without increasing size much, but it does add latency (and so lowers FPS). Texture compression in general is already in effect, though, and has been for ages, so there's little point even mentioning it for either camp.
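For a sense of scale, here's a back-of-the-envelope sketch (my own numbers, not from the post above) of what a single 4K texture costs in VRAM uncompressed versus in the usual GPU block-compressed formats, which do stay compressed in VRAM and are decoded by the texture units when sampled:

```python
# Rough VRAM cost of one 4096x4096 texture in a few common formats.
# Assumes a full mip chain, which adds roughly a third on top of the base level.
WIDTH = HEIGHT = 4096
MIP_FACTOR = 4 / 3

formats_bytes_per_pixel = {
    "RGBA8 (uncompressed)": 4.0,
    "BC7 / BC3 (16 bytes per 4x4 block)": 1.0,
    "BC1 / DXT1 (8 bytes per 4x4 block)": 0.5,
}

for name, bpp in formats_bytes_per_pixel.items():
    size_mib = WIDTH * HEIGHT * bpp * MIP_FACTOR / (1024 ** 2)
    print(f"{name:36s} ~{size_mib:6.1f} MiB")

# Roughly 85 MiB uncompressed vs ~21 MiB (BC7) or ~11 MiB (BC1) per texture,
# which is why block compression has been standard for so long.
```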
 
I thought there were some near-realtime texture decompression methods like DXT1 (accessed in the same clock cycle), and that there was full flexibility in how vendors could implement these methods.
Source: https://en.wikipedia.org/wiki/S3_Texture_Compression

Edit: it's based on a method conceived in 1970, so surely there must have been some progress in the meantime.
 
Unless you are a legit pr0 g4m3r and still have the reaction time and eyesight sharpness of a spotty teenager, you are barely going to notice the difference going from 240 to 360 to 480 or 520 in actual game play.

Marginal gains at best.

I'm absolutely fine with 120Hz; the 175Hz of my current screen is more than ample, and I suspect that's the case for 95% of gamers.

I'll take the vastly superior IQ and HDR performance over a super high refresh rate.
 
I thought there were some near-realtime texture decompression methods like DXT1 (accessed in the same clock cycle), and that there was full flexibility in how vendors could implement these methods.
Source: https://en.wikipedia.org/wiki/S3_Texture_Compression

Edit: it's based on a method conceived in 1970, so surely there must have been some progress in the meantime.
That's why I wrote in my post about texture compression - it's a very old method, very quick to decompress, and it's lossy. There are hard information-theoretic limits to compression; sometimes you just can't do any better. However, as I also mentioned, AI compression is the next evolution of it: it will increase texture quality but have a noticeable performance hit, it seems.
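To make the "very quick to decompress" point concrete: DXT1/BC1 is a fixed-rate format - every 4x4 pixel block is exactly 8 bytes, so the hardware can jump straight to any block and decode it on the fly whenever a shader samples the texture. A minimal decode sketch in Python, just to show the structure (real decoding happens in the GPU's texture units, not in software):

```python
import struct

def rgb565_to_rgb888(c):
    """Expand a 16-bit RGB565 colour to 8-bit-per-channel RGB."""
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

def decode_bc1_block(block: bytes):
    """Decode one 8-byte BC1/DXT1 block into a 4x4 grid of RGB tuples."""
    c0, c1, indices = struct.unpack("<HHI", block)  # two endpoint colours + 16x 2-bit indices
    p0, p1 = rgb565_to_rgb888(c0), rgb565_to_rgb888(c1)
    if c0 > c1:  # 4-colour mode: two interpolated colours between the endpoints
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:        # 3-colour + 1-bit-alpha mode
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]
    rows = [[None] * 4 for _ in range(4)]
    for i in range(16):
        rows[i // 4][i % 4] = palette[(indices >> (2 * i)) & 0b11]
    return rows

# Example: a block whose endpoints are pure red and pure blue, all pixels using index 0.
print(decode_bc1_block(struct.pack("<HHI", 0xF800, 0x001F, 0x00000000)))
```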
 
I'm absolutely fine with 120Hz; the 175Hz of my current screen is more than ample, and I suspect that's the case for 95% of gamers.

I'll take the vastly superior IQ and HDR performance over a super high refresh rate.
A higher refresh rate (and with it, visible max FPS) isn't about latency or reaction times, aside from maybe the 1% of super-twitchy teenagers. However, it is definitely about image clarity in motion - you simply can't see a clear image during fast movement even on a 175Hz OLED (as I've tested myself); it will still be blurry. As per the research, we'd need about 1000Hz to completely eliminate motion blur and get an image as clear as our eyes see naturally. It's not really that necessary, though: most gamers seem to be OK with about 60FPS, as game designers tend to take that into account and design their games so that full clarity isn't needed to play just fine.
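The persistence maths behind that is easy to sketch: on a sample-and-hold display, an object your eye is tracking smears by roughly its on-screen speed times the frame time. A quick illustration (the 1920px/s scroll speed is just an example figure of mine, not a measurement):

```python
# Perceived smear while eye-tracking motion on a sample-and-hold display:
#   blur width (px) ~= on-screen speed (px/s) * frame time (s)
# The scroll speed below is an arbitrary example, not a measured figure.
speed_px_per_s = 1920  # e.g. content crossing a 1920px-wide screen in one second

for hz in (60, 120, 175, 240, 360, 480, 1000):
    print(f"{hz:4d} Hz -> ~{speed_px_per_s / hz:5.1f} px of smear")

# ~32 px at 60 Hz, ~11 px at 175 Hz, ~4 px at 480 Hz and ~2 px at 1000 Hz,
# which is roughly where motion clarity approaches a static image.
```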
 
Once you go OLED you never go back, much bigger upgrade than any graphics card.

What I mean is, is getting, say, 175fps over 130fps worth over a thousand more? Both are very similar experiences from an enjoyment point of view.

It's not just about current FPS differences though. There is a degree of future-proofing which may or may not have value to you.

I tend to skip a few generations, and having a card which will still pull its weight in 3 or 4 years' time is a good value proposition to me. This wouldn't be the case if the mid-range were priced more aggressively, as you could roll the savings into more incremental upgrades.
 
It's not just about current FPS differences though. There is a degree of future-proofing which may or may not have value to you.

I tend to skip a few generations, and having a card which will still pull its weight in 3 or 4 years' time is a good value proposition to me. This wouldn't be the case if the mid-range were priced more aggressively, as you could roll the savings into more incremental upgrades.

True. Nvidia are really good at adding features though. Frame gen, for example, is a great feature exclusive to newer cards. But on the whole that is a good strategy.

At least we know that the 6080 will be within 10% of a 5090, only with 16GB of VRAM, lol.
 
It seems apparent that the way they set it up is different, and it may be possible that on newer cards 16GB will be enough.

Not all 16GB of VRAM is the same!
 
I don't think it's enough. I think it's worth waiting for the Super range. A 5080 with 20GB would be a much better proposition.

They have knowingly held the RAM back, probably to force people into a 5090, where they make even more profit.
 
Yep, if I was going for a 5080 I'd be very happy. All games today work with 16GB (Indiana needs a small tweak), and you'd imagine demanding games will use the new texture compression if they'd otherwise go past 16GB with traditional textures. If it's like DLSS in terms of image accuracy, which we have little reason to doubt, no worries.
 
I don't think it's enough. I think it's worth waiting for the Super range. A 5080 with 20GB would be a much better proposition.

They have knowingly held the RAM back, probably to force people into a 5090, where they make even more profit.
I think they kept this in mind with the 'low' 5080 price point. They will likely refresh it with more VRAM but also charge 100-200 extra IMO, so as not to alienate anyone.
 