
RTX 4090 - actually future proof?

Given it's already looking 40-50% faster than a $1200 4080, it'll probably also be faster than a 5080 and anything below that, so it could be four years before it's even overtaken by a non-Ti 80-class card, by which time you'd have spent getting on for £4k if you'd bought a 4080, a 5080 and a 6080.

I think what we are going to find out is that most people won't, in fact, pay £1200 for a cut-down xx80 card.
 
The 4060 will be far more future proof than the 4090.

High-end users are obsessed with the highest framerate possible. When the 5090 comes out they'll get an anxiety attack and will be unable to resist the urge. People at the lower end are happier to keep their GPU for several gens, so in that sense it's more future proof.
 
Last edited:
The 4060 will be far more future proof than the 4090.

High-end users are obsessed with the highest framerate possible. When the 5090 comes out they'll get an anxiety attack and will be unable to resist the urge. People at the lower end are happier to keep their GPU for several gens, so in that sense it's more future proof.

Yep, that's why nothing is future proof; owners of high-end hardware will just upgrade before the 4090 even gets a chance to show its future-proofness.
 
As people have said, there's no such thing as future proof GPU.

That said, looking at history there have always been cards with a long shelf life. The 8800GTX and 1080Ti are examples; to avoid accusations of bias towards one manufacturer, perhaps also the RX480 8GB ('fine wine') or, looking further back, the 9700 Pro.

Generally speaking, cards with a long shelf life will hold these traits:
  • High VRAM, typically 50-100% more than 'normal' cards. 8GB was a lot when the RX480 came out. 11GB on the 1080ti was more than most. etc.
  • No weaknesses in terms of memory bandwidth or raw 'grunt' (cores/core clock etc)
  • Has the latest feature support (traditionally DX version, shader support, more latterly DLSS etc)
  • Are not simply late models from an old architecture, but typically are new architecture and hence get driver support/optimisations for longer (and more chance of optimisation since they are new and start life not fully optimised)
  • Aren't on an interface that is being replaced (e.g. an AGP card when PCI-E was on the way, or PCI about to be superseded by AGP), meaning you can upgrade your motherboard without needing to replace the card
  • Can handle resolutions higher than typical displays of the time, and hence seem 'overpowered' for what most people are using.
  • Offer a really good performance increment on the prior generation, i.e. are "something special".
Looking at the 4090, I'd say it ticks most of the boxes, but then so does the 3090ti.


I would also echo the point made above about the fact that future proofing often isn't the best strategy compared to spending less, more often: you buy a fast card today and then another fast card in a couple of years, and end up paying the same or less, just with better performance in the future and slightly less performance in the short term (when you often don't need it). In other words, let's say a 4090 gives you 300fps today and 50fps in 2027, whereas a weaker card might give you 200fps today, but you upgrade it in 2025 and then have 120fps in 2027. Basically, you're trading near-term performance for long-term performance.
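To put rough numbers on that trade-off, here's a quick back-of-envelope sketch. Every price and frame rate is a made-up placeholder based on the example above, not a benchmark:

```python
# Toy comparison of "one flagship now" vs "a cheaper card now, another in 2025".
# All figures are illustrative placeholders, not real prices or benchmarks.
flagship = {"cost": 1700, "fps_today": 300, "fps_2027": 50}

midrange_now = {"cost": 800, "fps_today": 200}
midrange_2025 = {"cost": 800, "fps_2027": 120}

flagship_total = flagship["cost"]
staggered_total = midrange_now["cost"] + midrange_2025["cost"]

print(f"Flagship once:   £{flagship_total}, "
      f"~{flagship['fps_today']} fps today, ~{flagship['fps_2027']} fps in 2027")
print(f"Mid-range twice: £{staggered_total}, "
      f"~{midrange_now['fps_today']} fps today, ~{midrange_2025['fps_2027']} fps in 2027")
```

Same sort of total spend, just shifted: more headroom today with the flagship, more performance left in 2027 with the staggered upgrades.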
 
Last edited:
ReBAR seemed to be a real flop. It got a big marketing push and then fell away.

I wouldn't really call it a flop, more a silent background thing. Most newer machines and cards support it and have it enabled by default. It was a bigger deal at the start, as the machines and cards back then often had to have it switched on manually, and in some cases needed firmware and vBIOS updates to enable it.

It's frustrating in a way that it couldn't be forced, as it meant many people on 9/10 series machines who didn't get good manufacturer support were locked out of the feature because their motherboard didn't have the option exposed, so even with a compatible graphics card they still couldn't use it.

Intel recently reiterated (and it's in the Intel update utility) that for Arc you need ReBAR for best performance, and the benches showed this is really true.

Not really a flop, more of a now almost-given thing on newer machines!
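If you want to confirm ReBAR is actually active rather than just trusting the BIOS toggle, here's a rough Linux-only sketch that reads the PCI BAR sizes of display controllers out of sysfs. The paths are the standard sysfs layout; the "VRAM-sized BAR vs classic 256 MiB window" check is just the usual rule of thumb, not anything official:

```python
# Rough Resizable BAR check on Linux: list BAR sizes of display controllers.
# With ReBAR active you'd expect one BAR roughly the size of the card's VRAM
# rather than the traditional 256 MiB aperture. Assumes standard sysfs layout.
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    if not (dev / "class").read_text().startswith("0x03"):  # 0x03xxxx = display controller
        continue
    print(dev.name)
    for i, line in enumerate((dev / "resource").read_text().splitlines()):
        start, end, _flags = (int(x, 16) for x in line.split())
        if end > start:  # skip unused BARs
            print(f"  BAR{i}: {(end - start + 1) // (1024 * 1024)} MiB")
```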
 
Last edited:
I wouldn't really call it a flop, more a silent background thing. Most newer machines and cards support it and have it enabled by default. It was a bigger deal at the start, as the machines and cards back then often had to have it switched on manually, and in some cases needed firmware and vBIOS updates to enable it.

It's frustrating in a way that it couldn't be forced, as it meant many people on 9/10 series machines who didn't get good manufacturer support were locked out of the feature because their motherboard didn't have the option exposed, so even with a compatible graphics card they still couldn't use it.

Intel recently reiterated (and it's in the Intel update utility) that for Arc you need ReBAR for best performance, and the benches showed this is really true.

Not really a flop, more of a now almost-given thing on newer machines!

That's not the reason why I'm calling it a flop. I have ReBAR and actually disabled it, because in the games I tested it in (and the games still coming out) there wasn't much of a performance boost. Nor is the list of games with ReBAR support getting updated. It feels like a dead-end feature. There were only 15-18 games or so on the list. It's tiny.
 
Pretty sure Nvidia update that list silently in the background; pretty sure I read they added a few games even in the last few months (as they have to whitelist them).
AMD is pretty much "everything works"; some games gain, others don't.
Intel have basically said that with it off you will suffer, and the reviews actually back that up.
 
The minimums in many games absolutely tank with ReBAR off, even if averages seem fine.

TechPowerUp also did some on-vs-off tests, and the drop in average performance was about 25%!

Intel's engineers have very publicly said you want it on, and for once, I don't doubt them.
 
Last edited:
All I know is that people who were able to snag a 3080 or 3090 soon after release got the most out of their cards. A similar pattern has been observed in previous launches as well; for example, people who bought 1080 Tis at release have done quite well out of them.

I think buying early into new generations, especially where they offer a big uplift over the previous gen, makes sense when buying Nvidia. Their cards don't tend to drop in price much over the course of their shelf life, and the late-in-the-day releases (like the 3090 Ti) often offer really poor value given the time they're on the market before being superseded.


If you want good 4K performance and have the money, buy a 4090 now and get the most out of it. Nvidia will almost certainly release a Titan or 4090 Ti down the line, but it will cost more, have less time until it's superseded, will be more power hungry and will only offer a relatively small uplift in performance.
 
Last edited:
If you have the money then just go for it and enjoy.
I bought a 3070 Ti a few months ago and don't regret it; played Elden Ring maxed out on my ultrawide screen. I just don't want to wait forever for a 4070 or 3060.
 
The 8800GTX and 1080Ti are examples; to avoid accusations of bias towards one manufacturer, perhaps also the RX480 8GB ('fine wine') or, looking further back, the 9700 Pro.

I think the 8800GTX & the 9700Pro were probably the biggest leaps forward relative to their previous gen. Both incredible cards for their time.
 
Last edited:
No, no technology is.

Pay your £2000 now, but don't be sad when MS/Sony's upgraded Series X/PS5 plays games just as well as it does in a few years' time, for ~£600.
 
Last edited:
I think the 8800GTX & the 9700Pro were probably the biggest leaps forward relative to their previous gen. Both incredible cards for their time.
The HD5800 series was a huge jump over the GTX200 and HD4000 series as well. Even the subsequent gen GTX460 1GB and HD6850 were still around 10% slower than the HD5850.
 
The 4090 is only "future proof" for a split second, until the next card comes out; there are already rumours of a Ti not too far behind the 4090 launch. What gets me, though, is that some games (take Resident Evil as an example) use X amount of VRAM, and how far you can push the visuals depends on that. The 3090 with 24GB is the same as the 4090 with 24GB, and my personal opinion is that this is what makes a card, as you put it, "future proof": how much VRAM it has. DLSS will eventually become obsolete as the next trend comes in. Frame rate is not too great a difference IMO; I used a 3090 before the 4090 release and it makes little actual difference. TBH, for the first time since my AMD Athlon II days I am looking forward to the 7900 XT that may come out, the reason being that the 4090 still runs PCIe Gen 4, but apparently AMD are pushing the use of Gen 5, which will change the game. Let's face it, the Intel Arc cards are pathetic.
 
The 4090 in two years' time will still be good for games then, but it will likely be equivalent to a 5070, which, going by rising prices, will likely cost the same as a 4090.

Buy what you can now and enjoy it.
 
Last edited: