RTX 4090 - actually future proof?

Especially for us 1440p / UWQHD 3440x1440 users. Are we talking about rock-solid max settings with ray tracing at these resolutions at 100-120-144 Hz? Even the power consumption scales down when the card is under-utilised.

Sure, the current line is that the 4090 is wasted on anything less than a high-refresh-rate 4K display. However, looking back at a previously mega-OP GPU, the 1080 Ti, I'm sure that was once considered wasted on 1080p displays, yet now 1080p is the only resolution at which it can still deliver a high-FPS experience.

Also, as far as I'm aware there is no new technology on the near horizon that might render the 4090 obsolete. Sounds to me like you could buy the 4090 and it might be a GPU that lasts you 5+ years at 1440p / UWQHD with DLSS 3.0, maybe even 6, 7 or 8 years.

I'm throwing this out there as Devil's Advocate. I don't want to pay £1600 for a GPU, but if I could actually get a futuristic triple-A experience that might last the better part of a decade from a GPU, then that sounds to me like a potential investment.
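For a rough sense of scale, here's a back-of-the-envelope pixel-throughput comparison - a minimal sketch only, treating raw pixels-per-second as a crude proxy for GPU load and ignoring ray tracing, DLSS and per-pixel shading cost entirely:

```python
# Back-of-the-envelope pixel throughput (width * height * refresh rate).
# Crude proxy only: ignores ray tracing, DLSS and per-pixel shading cost.
modes = {
    "2560x1440 @ 144 Hz": (2560, 1440, 144),
    "3440x1440 @ 144 Hz": (3440, 1440, 144),  # UWQHD
    "3840x2160 @ 120 Hz": (3840, 2160, 120),  # 4K high refresh
}

baseline = 3840 * 2160 * 120  # 4K @ 120 Hz as the reference point

for name, (w, h, hz) in modes.items():
    rate = w * h * hz
    print(f"{name}: {rate / 1e9:.2f} Gpx/s ({rate / baseline:.0%} of 4K @ 120 Hz)")
```

Even 3440x1440 at 144 Hz works out at roughly 70% of the raw pixel rate of 4K at 120 Hz, which is why the "wasted below 4K" argument only half applies to ultrawides.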
 
Someone hasn't future proofed themselves by buying a 4090 for 1440p or 21:9 ultrawide. They've just bought a card which honestly isn't needed or suitable for their use case and is overkill for now. It will be under-utilised, a 3090 or 3080 was probably plenty, and by the time they need to upgrade it'll be 6080 time and that will beat the 4090 for a much better price with newer technologies.

You'd be better off upgrading your display - whether that's OLED, a wider aspect ratio, a bigger screen, etc. - than sinking money into a 4090.

Look at the history of GPUs. There has NEVER truly been a future-proofed GPU. It's always been outdone and relegated to the bargain basement on eBay. That's not going to change now. NVIDIA and all the game designers and manufacturers will MAKE SURE that you need to upgrade every couple of years.

Future proofing is only really a concept in the tech world IMO re: platform support, e.g. being able to upgrade your processor. But GPU-wise... no, not really buying it.
 
Hate the term "Future Proof", no such thing in the world of IT, someone needs to ban it.
100% agreed.

It's essentially code for "I've bought something I shouldn't have which my current gear can't take advantage of; can you please help me justify it?".

I've been there and 'future proofed' plenty of tech purchases: my gaming PC, my MacBook, my iPad - and they're all outdated junk now LOL.
 
Don't think there's anything different with the 40** compared to previous gens - other than the cost of entry.

It may last a decade if you're prepared to increasingly compromise on some combination of quality, resolution or framerate.

Edit - 'last a decade' in terms of actually still running. I very much doubt it will meet the minimum specs of anything released towards 2032.
 
Don't think there's anything different with the 40** compared to previous gens - other than the cost of entry.

It may last a decade if you're prepared to increasingly compromise on some combination of quality, resolution or framerate.


Exactly. Is my RTX 2080 'future proofing me' because I'm decreasing settings across the board to hit a reasonable FPS? :D
 
Some say there is no such thing as future proofing, and yet haven't certain Intel processors enjoyed massive, unexpected long-term relevance until recently (especially compared with AMD, whose FX series and early Ryzen parts were massively outlived by Intel processors from an earlier era)?

Also, the 1080 Ti, which was once considered overkill, turned out to be the only GPU that matched the 2080 in rasterization - it took AMD's 5700 XT to overcome it - and it stayed relevant long after the rest of the 1000 series fell away into Vega 64 / GTX 1080 territory. Many reacted to the 2000 series launch by snapping up the 1080 Ti.

The 1080 Ti did not have the advantage of DLSS, and it was the last really mega card before ray tracing. The 4090 sounds like it occupies the massively OP territory the 1080 Ti once did, but it has ray tracing and DLSS 3.0 to support it for the rest of its lifetime.
 
Future as in 2 years, yes. After that it will still be a fantastic card, but you will likely lose interest in it once you see the shiny new cards.
 
Please describe what the new cards of 2024 are likely to bring to the table that will make the RTX 4090 obsolete in two years' time.
 
Probably depends on what the next generation of consoles brings, or if DX13 suddenly comes out with some amazing hardware tech features that are game changers.
 
Please describe what the new cards of 2024 are likely to bring to the table that will make the RTX 4090 obsolete in two years' time.

Who said it's going to be obsolete?

We probably need to define what future proofing is. If future proofing means 'will this GPU be able to play video games in 4 years?', then yes, it will.
Just like my RTX 2080 can today, or a 1080 Ti from 6 years ago can. I wouldn't necessarily call those two cards 'future proofed' lol.
 
And yet we're coming off the back of some tech that effectively became 'future proof': certain Intel CPUs, and graphics cards like the GTX 1080 Ti, which was only dethroned by its lack of DLSS and ray tracing and by the increasing resolutions and refresh rates of our displays.

The RTX 4090 looks set to breeze through the problem of max ray tracing, 4K displays, and 100 Hz+, but where it can't, it can summon DLSS 3.0.

Looks like the most plausible case of 'final graphics card' I've ever seen.
 
Especially for us 1440p / UWQHD 3440x1440 users. Are we talking about rock-solid max settings with ray tracing at these resolutions at 100-120-144 Hz? Even the power consumption scales down when the card is under-utilised.

Sure, the current line is that the 4090 is wasted on anything less than a high-refresh-rate 4K display. However, looking back at a previously mega-OP GPU, the 1080 Ti, I'm sure that was once considered wasted on 1080p displays, yet now 1080p is the only resolution at which it can still deliver a high-FPS experience.

Also, as far as I'm aware there is no new technology on the near horizon that might render the 4090 obsolete. Sounds to me like you could buy the 4090 and it might be a GPU that lasts you 5+ years at 1440p / UWQHD with DLSS 3.0, maybe even 6, 7 or 8 years.

I'm throwing this out there as Devil's Advocate. I don't want to pay £1600 for a GPU, but if I could actually get a futuristic triple-A experience that might last the better part of a decade from a GPU, then that sounds to me like a potential investment.

It won't last a decade, but in 5 years' time it will still be a decent card at that resolution; if you decide to go up a resolution then all bets are off. 3440x1440 is not really hard to run (I used to use a 980 Ti at that resolution and was very happy with it); 4K at high refresh rates is, and the 5120x1440 super ultrawide screens are basically 4K minus a million pixels.

Also, in 5 years' time people will be talking about 8K or the next super-wide resolution - the industry just likes pushing larger and larger numbers at us even when it doesn't really make much difference to the average user. Look at 4K and 8K now: have you seen an 8K TV next to a 4K TV at normal viewing distance? There is zero difference unless you sit so close you can't see the full screen. The only time 8K will be worth it is when we get wall-sized TVs; 8K on a normal-sized monitor is a joke, and even 4K monitors under 32 inches are a comedy act - you end up scaling up the Windows OS just to make it readable. You're basically buying more pixels just to blow the image up again to make it usable.
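The "minus a million pixels" bit checks out, for what it's worth - a quick sum (a sketch only; raw pixel counts say nothing about how visible the difference actually is at normal viewing distance):

```python
# Pixel counts for the resolutions mentioned above.
resolutions = {
    "3440x1440 (21:9 UWQHD)":      (3440, 1440),
    "5120x1440 (32:9 super-wide)": (5120, 1440),
    "3840x2160 (16:9 4K)":         (3840, 2160),
    "7680x4320 (16:9 8K)":         (7680, 4320),
}

four_k = 3840 * 2160  # 8,294,400 pixels

for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} px ({px / four_k:.0%} of 4K)")

# 8,294,400 - 7,372,800 = 921,600 pixels, so 5120x1440 really is roughly
# "4K minus a million pixels", while 8K is a full 4x the pixels of 4K.
print(f"4K minus 5120x1440: {four_k - 5120 * 1440:,} pixels")
```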
 