Is 16GB GDDR7 enough for 4K gaming from 2025 onwards?

I do feel that 4K DLSS looks good in a lot of scenarios; I was quite happy to play FF16 maxed out at 4K with DLSS balanced at a constant 60fps. But even being happy with DLSS "balanced", I'm still not convinced that any card I buy for £1000 with 16GB of VRAM will be adequate for more than two years.
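Just to put numbers on what "balanced" actually means at 4K, here's a minimal sketch using the commonly cited DLSS scale factors (approximate, and individual games can override them):

```python
# Minimal sketch: internal render resolution per DLSS preset at a 3840x2160 output.
# Scale factors are the commonly cited defaults (approximate; games can override them).
OUTPUT_W, OUTPUT_H = 3840, 2160

PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.580,
    "Performance": 0.500,
    "Ultra Performance": 0.333,
}

native_pixels = OUTPUT_W * OUTPUT_H

for name, scale in PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    share = (w * h) / native_pixels
    print(f"{name:<18} {w}x{h}  (~{share:.0%} of native 4K pixels)")

# Balanced works out to roughly 2227x1253 -- about a third of the pixel work of
# native 4K, which is why a locked 60fps is so much easier to hold with it on.
```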

That does assume that you’d keep the 5080 for more than two years.

Depending on costs, you could potentially buy a 5090, or ‘a 5080 and a 6080’.

Not a terrible option, although I think the latter would be a bit more expensive.
 
That does assume that you’d keep the 5080 for more than two years.

I'd sure as damnit prefer to do so; I'm tired of chasing the dragon with GPU tech at this point.

At the very best I'd prefer to keep such a card for a console generation, and at worst 3-4 years. That's not expecting constant max settings, just consistent performance with measured settings for newer games.
 
Unless you pay the premium for a top-tier card, and even then you're likely playing slightly older games, you're not going to have a legitimately good experience at native 4K.

It's bordering on scam level when we see consoles selling the idea that they're outputting 4K when it's poorly upscaled 1080-1440p at best, with awful framerates.

I do feel that 4K DLSS looks good in a lot of scenarios; I was quite happy to play FF16 maxed out at 4K with DLSS balanced at a constant 60fps. But even being happy with DLSS "balanced", I'm still not convinced that any card I buy for £1000 with 16GB of VRAM will be adequate for more than two years.
Yeah. It's expensive, and the line-up from Nvidia this gen looks about as bad as the 4xxx series so far, without seeing the numbers. Worse, even, as VRAM has been proven to be a huge issue and nothing much seems to be changing there.

The 7900 XTX is looking more attractive than ever, especially on Linux where Nvidia is lagging behind a fair bit.
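To put some very rough numbers on the VRAM point, here's an illustrative sketch of what just the per-frame render targets might cost at 4K, using assumed formats rather than measurements from any real engine:

```python
# Illustrative only: rough per-frame render-target memory at 4K with assumed formats.
# Real engines differ a lot, and the bulk of VRAM goes to textures and streamed assets.
W, H = 3840, 2160
PIXELS = W * H
MB = 1024 * 1024

# Hypothetical deferred-renderer targets: (name, bytes per pixel)
targets = [
    ("G-buffer albedo (RGBA8)", 4),
    ("G-buffer normals (RGBA16F)", 8),
    ("G-buffer material (RGBA8)", 4),
    ("HDR lighting (RGBA16F)", 8),
    ("Depth/stencil (D32S8)", 5),
    ("TAA/DLSS history (RGBA16F)", 8),
    ("Motion vectors (RG16F)", 4),
]

total = 0
for name, bpp in targets:
    size = PIXELS * bpp
    total += size
    print(f"{name:<28} {size / MB:6.1f} MB")

print(f"{'Total render targets':<28} {total / MB:6.1f} MB")
# Only a few hundred MB here, so when a game pushes past 12-16GB it is mostly
# texture and asset streaming -- which is exactly where the VRAM argument sits.
```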
 
I feel like 4K gaming has been very achievable for years, TBH. First with my 3080, when my screen max was 60fps. Then with my 4090, when my screen max was 120fps. And soon again with my 5080/5090, with my screen max of 144fps. Everything maxed, I played native, with the very odd exception (Alan Wake II and Cyberpunk, IIRC).
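For a sense of scale (simple arithmetic, nothing measured), here's the frame-time budget and raw pixel throughput each of those screen targets implies at 4K:

```python
# Simple arithmetic: frame-time budget and raw 4K pixel throughput at each target.
W, H = 3840, 2160

for fps in (60, 120, 144):
    frame_time_ms = 1000 / fps
    pixels_per_second = W * H * fps
    print(f"{fps:>3} fps: {frame_time_ms:5.2f} ms per frame, "
          f"~{pixels_per_second / 1e9:.2f} billion shaded pixels/s")

# A 144Hz 4K target needs 2.4x the throughput of a 60Hz one, which roughly tracks
# the jump in GPU tier described above (3080 -> 4090 -> 5080/5090).
```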
 
I actually think ultrawide screens are a bit of a scam: pay more for less!

I'm driving a 77" G4 OLED, love playing from a couch in a home cinema. I'm a lazy gamer lol.
Have you tried stretched 21:9 resolutions on it (e.g. 3840x1620), meaning stretched into 16:9 rather than with black bars? It's absolutely sick for first-person games (for example, I exclusively play Cyberpunk like that on a 65", 2m away from it; in fact 3840x1350 is even better, but with FOV 85, though you need a bit longer to get used to it first) and as close to VR as it gets on a flat screen.
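For anyone curious about the numbers behind that, here's a quick sketch of the pixel counts and aspect ratios of those stretched resolutions versus full 4K; the resolutions come from the post above, the rest is simple arithmetic:

```python
# Quick arithmetic on the stretched resolutions mentioned above versus full 4K.
# The image is stretched back over the 16:9 panel, but the GPU only renders these pixels.
resolutions = {
    "Native 4K (16:9)": (3840, 2160),
    "3840x1620 (~21:9)": (3840, 1620),
    "3840x1350 (~25.6:9)": (3840, 1350),
}

base = 3840 * 2160
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:<20} {pixels:>9,} px  aspect {w / h:.2f}:1  (~{pixels / base:.0%} of 4K)")

# 3840x1620 is 75% of the pixels of full 4K and 3840x1350 is 62.5%, so on top of the
# wider field-of-view feel there is a meaningful performance saving.
```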
 
Maybe Nvidia are about to introduce their DLSS4 AI texture upscaling, and using market capture they'll ensure a critical mass of game developers adopt it, eventually phasing out traditional full-resolution textures.

Perhaps then the 7900 XTX's 24GB memory buffer will sit massively underutilised, displaying horrible blurry textures as it can't upscale the new 500 kB files that would usually be 50 MB.

Nvidia can claim they're saving users disk space and reducing our out-of-control game installation sizes. Just upgrade your 7900 XTX 24 GB to a 5060 8 GB to get your image quality and full-resolution textures back ;)
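To put rough numbers behind that hypothetical (the 500 kB and 50 MB figures above are the post's illustration, not a shipping format), here's a sketch of where a 50 MB texture figure can come from and what a ~100:1 compression ratio would imply:

```python
# Rough sketch of where a "50 MB texture" figure can come from (illustrative assumptions;
# the 500 kB figure is the post's hypothetical ~100:1 AI-compressed version, not a real format).
MB = 1024 * 1024

def texture_size(width, height, bytes_per_pixel, with_mips=True):
    """Approximate texture size; a full mip chain adds roughly one third on top."""
    base = width * height * bytes_per_pixel
    return base * 4 / 3 if with_mips else base

uncompressed = texture_size(4096, 4096, 4)    # RGBA8, no block compression
bc_compressed = texture_size(4096, 4096, 1)   # BC7-class block compression (~1 byte/pixel)

print(f"4096x4096 uncompressed (RGBA8):  {uncompressed / MB:6.1f} MB")
print(f"4096x4096 block-compressed:      {bc_compressed / MB:6.1f} MB")
print(f"Two BC maps (albedo + normal):   {2 * bc_compressed / MB:6.1f} MB")
print(f"Hypothetical 100:1 AI version:   {2 * bc_compressed / 100 / MB:6.2f} MB")
# A couple of block-compressed 4K maps already sits around the 40-50 MB mark, and a
# ~100:1 neural-compressed equivalent of that really would be roughly half a megabyte.
```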
 
Call me crazy, but if I'm spending £1000+ on a GPU I expect to run AAA games maxed to a playable level. I definitely agree the low/mid range is where users should adjust settings as appropriate.

I disagree. Ultra should be far too much for every graphics card for a few generations. It would future-proof games.

Currently it's up to the modding community to create high resolution texture packs for old games. I've been playing Skyrim with dozens of texture mods etc.

The modding community does what game developers should have been doing from the start.

In 10 years' time, there'll be ultra-high-res texture mods on sites like Nexus Mods for today's AAA games, like Indiana Jones, to help them look relevant. All because developers only focused on today's hardware.

People love to replay old games.
 
I disagree. Ultra should be far too much for every graphics card for a few generations. It would future-proof games.

Currently it's up to the modding community to create high resolution texture packs for old games. I've been playing Skyrim with dozens of texture mods etc.

The modding community does what game developers should have been doing from the start.

In 10 years' time, there'll be ultra-high-res texture mods on sites like Nexus Mods for today's AAA games, like Indiana Jones, to help them look relevant. All because developers only focused on today's hardware.

People love to replay old games.

I don't think developers will put work into making their game look great in a way that no one can play today (and they won't care about the few who are playing their game 5-10 years from now). They would likely choose the path of least resistance, and we would all just lose any kind of optimisation we otherwise would have had.
 
I don't think developers will put work into making their game look great in a way that no one can play today (and they won't care about the few who are playing their game 5-10 years from now). They would likely choose the path of least resistance, and we would all just lose any kind of optimisation we otherwise would have had.

So you want developers to tone down the games and use lower quality textures etc just so you can feel good about being able to use ultra settings?
 
Call me crazy, but if I'm spending £1000+ on a GPU I expect to run AAA games maxed to a playable level. I definitely agree the low/mid range is where users should adjust settings as appropriate.
Price aside, the X080 is a budget 4K card. If it could run 4K at max settings then the X090 card is pointless except for the odd extreme enthusiast.

That being said, paying more than £1k for a good 4K experience is stupid.
 
Price aside, the X080 is a budget 4K card. If it could run 4K at max settings then the X090 card is pointless except for the odd extreme enthusiast.

That being said, paying more than £1k for a good 4K experience is stupid.

The second-best card in a product stack (or in the world) isn't budget 4K by any means. Don't believe the Nvidia marketing hype lol.
 
So you want developers to tone down the games and use lower quality textures etc just so you can feel good about being able to use ultra settings?
We could have more optimised textures without sacrificing quality, it's just that everything is rushed these days and developers lean on overpowered hardware to compensate for lack of optimisation.
 
The second-best card in a product stack (or in the world) isn't budget 4K by any means. Don't believe the Nvidia marketing hype lol.
It's not about marketing, it's just what 4K is at the moment. The 4090 barely handles games like Cyberpunk or decent UE5 games, with everything enabled, without upscaling or frame generation - UE5 already has new features in the pipeline that will likely push hardware even harder.
 
We could have more optimised textures without sacrificing quality, it's just that everything is rushed these days and developers lean on overpowered hardware to compensate for lack of optimisation.

I think they're rushed because of the sheer amount of work involved; such a long development time before they get a payday. The industry is seeing it as diminishing returns.

 
I think they're rushed because of the sheer amount of work involved; such a long development time before they get a payday. The industry is seeing it as diminishing returns.

It's definitely not just one thing. Companies want to see profits sooner, which is funny because more than one game has flopped hard and would have lost them millions.

Tech is also advancing, and trying to cram 30 different upscalers, RT, every gender pronoun, etc. into a game takes time. That being said, there are plenty of examples of studios producing good, profitable games - Sony and Nintendo get it right (most of the time). We don't need more "open world" gambling simulators.
 