
RTX 4090 - actually future proof?

Been gaming for well over 15 years and the answer is no! Same as with everything in life, it gets replaced a year later by better tech. You could buy a 4090 now, but will it give you the performance for the newest titles in three years' time? Definitely not.
Yes it will, if you're gaming at lower than 4K res, plus the fact that games are console-first, PC later.

In the last two years, what demanding PC games have come out that actually need a 4090?
 
I think the DP 1.4 thing is overhyped. Yes, it's a limitation, but honestly, is the 4090 designed to go above 4K 120 with RT etc.? I'm not sure it is, tbh. The real argument, imho, is Nvidia cheaping out on such an expensive product, i.e. DP 1.4 shouldn't have been in there at that price level.

Welcome to Nvidia.
 
It's certainly more future-proof than either of the 4080s, and it will likely outperform a 5080 if this generation's 80-class gains are anything to go by. Buying a 4080 plus a 5080 will likely cost more than just sucking it up and getting the 4090.
 
Pretty much, that's me. HDMI 2.1 is all I need; DisplayPort doesn't compete.
Real monitors use DP as the best connection. HDMI is only the best connection on TVs, and if TVs had DP you wouldn't be saying that. There's no reason for TVs not to have DP either, apart from market segmentation.


HDMI 2.1: its bandwidth capacity is 48 Gbps

DP 2.0: its bandwidth capacity is 80 Gbps

DP 2.1: we don't know yet, but it will either be higher or add more features at the same capacity
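Worth noting those headline numbers are raw link rates; after line coding, the usable data rate is lower. A rough back-of-the-envelope in Python (my own simplification: it ignores blanking overhead, so real requirements run a bit higher) shows why DP 1.4 needs DSC for 4K/120 at 10-bit while HDMI 2.1 and DP 2.0 don't:

```python
# Rough sketch: raw pixel bandwidth vs effective link data rates.
# Blanking intervals are ignored, so real-world requirements are a bit higher.

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Raw pixel data rate in Gbit/s (default 30 bpp = 10-bit RGB)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Effective data rates after line coding, not the headline raw rates.
links = {
    "DP 1.4 (HBR3, 8b/10b)":      25.92,  # of 32.4 Gbps raw
    "HDMI 2.1 (FRL, 16b/18b)":    42.67,  # of 48 Gbps raw
    "DP 2.0 (UHBR20, 128b/132b)": 77.37,  # of 80 Gbps raw
}

needed = video_bandwidth_gbps(3840, 2160, 120)  # 4K, 120 Hz, 10-bit
for name, capacity in links.items():
    verdict = "fits uncompressed" if capacity >= needed else "needs DSC"
    print(f"{name}: {verdict} ({needed:.1f} of {capacity:.2f} Gbps)")
```

So 4K/120 10-bit wants roughly 30 Gbps of pixel data before overheads, which is over DP 1.4's effective rate but well within HDMI 2.1's.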
 
Umm, I'm no expert, but most people use HDMI for 4K 120 FPS and Dolby Atmos, and it all connects up with a cinema receiver...
For TV users, yes. Monitor users would use DP for video, and if they wanted to use a receiver they would send audio only via HDMI to get the best audio experience, but keep DP for the best video experience.


To clarify, this is purely academic/speculative on my part. I have no need to justify a 4090 purchase, as I'm certainly not getting one any time soon. Besides it costing way more than I'd personally spend on a GPU, I'd have to get a bigger case, probably replace my rather old 750 W PSU, and I suspect my motherboard only having PCIe 3.0 x16 would make it even sillier for me.

There are new features every generation, but some of them turn out not to catch on despite initially being hyped (ReBAR, maybe PhysX?). There are also new features that don't really become usable until a generation or two after their introduction (there was a good example from the old days; I can't remember if it was hardware T&L or something to do with shaders), and more recently ray tracing.

There's also surely widening inequality in PC gamers' hardware. I expect developers will want their games to be playable on the average PC, which currently has around a GTX 1060-level GPU, and the decent cheap upgrades of yesteryear seem to be gone forever. That might act as an anchor on how demanding games can get, allowing a card like the 4090 to dominate on high settings for a long time to come.

Personally, I believe 8K and greater-than-120-144 Hz displays suffer too much from diminishing returns to be a worthwhile upgrade in terms of image quality and smoothness. As it sounds like the 4090 is going to be hard-locked by its DP version to a max of around 4K/120 Hz, I suspect it could push that, with the help of DLSS, for a very long time.
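That "hard lock" can be roughly sanity-checked. A quick sketch (my own simplification: it ignores blanking overhead and DSC, which raises the ceiling considerably) of the highest uncompressed refresh rate DP 1.4's effective data rate allows at 4K:

```python
# DP 1.4 (HBR3) carries 32.4 Gbps raw; 8b/10b coding leaves ~25.92 Gbps
# for data. Blanking overhead is ignored and DSC is not modelled, so this
# is an uncompressed upper bound.
DP14_EFFECTIVE_GBPS = 25.92

def max_refresh_hz(width, height, bits_per_pixel, link_gbps=DP14_EFFECTIVE_GBPS):
    """Highest uncompressed refresh rate the link's data rate allows."""
    return link_gbps * 1e9 / (width * height * bits_per_pixel)

print(f"4K,  8-bit: {max_refresh_hz(3840, 2160, 24):.0f} Hz")  # ~130 Hz
print(f"4K, 10-bit: {max_refresh_hz(3840, 2160, 30):.0f} Hz")  # ~104 Hz
```

So uncompressed DP 1.4 really does top out in the 4K/100-130 Hz range, which matches the "4K/~120 Hz" ceiling people keep quoting.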
 
Nvidia are cost-cutting with the DisplayPort thing, but in reality it's not going to affect many (if any) people, because the truth is that the 4090, although powerful, is not going to be pushing enough frames to saturate 80 Gbps. It's just not that powerful.

I think it's nice of AMD to include it, and Nvidia are annoying as hell for cost-cutting, but at the same time I struggle to envisage a use case where it'll be a massive issue: surely you can just use an HDMI 2.1 to DisplayPort adapter for full 48 Gbps output?

I'm more annoyed Nvidia didn't pack in three HDMI ports and three DisplayPorts, or give AIBs the option to have three and three. I hate having to use adapters.
 
And yet we're coming off the back of some tech that became 'future-proof', like certain Intel CPUs, and graphics cards like the GTX 1080 Ti, which was only dethroned by its lack of DLSS and ray tracing, and by the increasing resolutions and refresh rates of our displays.

The RTX 4090 looks set to breeze through the combined problem of max ray tracing, 4K displays, and 100+ Hz, and where it can't, it can summon DLSS 3.0.

Looks like the most plausible case of 'final graphics card' I've ever seen.
Mid 20s?

Anyone with full prefrontal cortex development knows your 4090 will only be good enough for 60 FPS at 4K in two years.

1440p is silly. A better card will replace it for cheaper, and you'll never have used the full power of the 4090. Such a waste.
 
Same as always: two years of playing maxed out with the best of everything, two years of turning it down in a couple of spots to keep the frame rates up, then waiting for the new card and playing stuff with either DLSS maxed out or at a lower res.

Nothing is future-proof, but I can see the card having a good long life.
 
Honestly, it's not looking that OP for 3440x1440:


It's still just a 55-60 fps card in Cyberpunk 2077 without DLSS at max settings.
 
I think the forbidden words 'future proof' have triggered some here.

The only acceptable use of that term was describing AM4, and now AM5, given AMD's commitments to those platforms.

The RTX 4090 will put any owner of said card massively ahead of the curve, especially while the average PC user continues to use something in the 1060-2060 range.
 
Given it's already looking 40-50% faster than a $1,200 4080, it'll probably also be faster than a 5080 and anything below that, so possibly four years before it even gets overtaken by an 80 non-Ti class card, by which time you'd have spent getting on for $4,000 if you buy a 4080, a 5080, and a 6080.
 