Graphics card longevity factors

Many of us want our GPUs to last as long as reasonably possible. However, I'm not sure I've ever seen a good discussion of which features extend a card's relevance, or of the relative importance of the different factors. For context, I have a 2060 and I'm debating my options, so how long things last is very much on my mind.

So, possible break points:
Nvidia generations: Nvidia have a (to me) bad habit of limiting the current DLSS version to only the most recent generation of cards. This forces faster card replacement, because if you want DLSS 3/4/5/whatever you're going to need a new card. But is this relevant in practice? Will newer games keep supporting the older DLSS versions that Nvidia lets us run?

AMD: AMD haven't locked FSR down in the same way, but at the same time it's not as good? And games can't retrospectively move to a newer FSR version? Or just haven't? What gives?

Memory. Obviously memory. Please don't flame me. But I'm sure we all agree that 2GB is laughable, 8GB is the bare minimum we'd probably want to consider, and 24GB (XTX) might be overkill right now. But how long do people reckon the various tiers will last?

Other possible factors: power use, software, drivers. But I can't see how those impact things much?

Those are the obvious ones that spring to mind. Note that I'm not asking for specific predictions ("the 4070 will be relevant until..."), but feel free to add them if you think it would be interesting. I'm just hoping to spur some discussion, to help me think through the options currently out there...
 
I'd say:
1. Performance. If the card is too slow, then it doesn't matter if it has DLSS6.5 and 128GB of VRAM.
2. VRAM. If you don't have enough, graphics quality will be terrible, or the game just won't load.
3. Features. In the past there's been a cut-off with DirectX versions, for example, and I think ray tracing will be one soon.
4. Driver support. For the most part it doesn't matter, but if a game doesn't load and only a newer driver fixes it, you're stuck with third-party hacks/mods.

Memory. Obviously memory. Please don't flame me. But I'm sure we all agree that 2GB is laughable, 8GB is the bare minimum we'd probably want to consider, and 24GB (XTX) might be overkill right now. But how long do people reckon the various tiers will last?
8GB will be enough for playability at 1080p for quite a while, maybe 5 years, but it's going to have increasing issues with graphical quality. Some games are better optimised than others, so I don't think it'll be an absolute cut-off for maybe 10 years.

12GB is the minimum I'd be looking for right now on a card I buy today, but preferably 16GB. 16GB shouldn't have any major issues at 1080p or 1440p for the next 5 years, outside of a few outliers that munch VRAM or are poorly optimised.

16GB is fine for 4K right now, but in the longer term that's hard to say. Realistically though, the performance requirements for 4K are pretty high, so I'm not sure it matters as much as you'd think (since you might have to drop e.g. a 4080 down to 1440p anyway to retain playability in 5 years' time).
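
Out of curiosity I did a quick back-of-envelope on the bit of VRAM that actually scales with resolution. The bytes-per-pixel figure is just my assumption for a typical deferred renderer, not a measured value:

```python
# Rough back-of-envelope: VRAM eaten just by the render targets at each
# resolution. The 48 bytes/pixel is my own assumption for a typical deferred
# pipeline (G-buffer + depth + HDR colour + a couple of post buffers), not a
# measured value, and textures/geometry come on top of this.

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
BYTES_PER_PIXEL = 48  # assumed, illustrative only

for name, (w, h) in RESOLUTIONS.items():
    mib = w * h * BYTES_PER_PIXEL / 1024**2
    print(f"{name}: ~{mib:.0f} MiB of render targets")
```

The resolution-dependent part only comes to a few hundred MB even at 4K; most of the budget goes on textures and other assets, which is why 4K doesn't simply need 4x the VRAM of 1080p.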

This forces faster card replacement, because if you want DLSS 3/4/5/whatever you're going to need a new card. But is this relevant in practice? Will newer games keep supporting the older DLSS versions that Nvidia lets us run?
It looks like developers are starting to treat DLSS/FSR as the baseline for hitting playable frame rates, so that's possible, but so far you just need the AI cores and driver support.

There might also be hardware-agnostic, less effective versions (like FSR 2.x) that keep working for a while.

From a few things Jensen has said in the past, I wonder if AI will be used more and more to add performance rather than brute force, especially if ray/path tracing takes off, because the compute requirement for that is enormous right now. But at least we can say that's some way off: current cards (even a 4090) are pretty rubbish at it, and AI/algorithms can only do so much to improve performance without adding excessive latency, visual glitches, or whatever.
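
To put some rough numbers on why upscaling is such a big lever, here's a quick sketch of how many pixels actually get shaded at 4K output. The scale factors are the commonly quoted per-axis ratios for FSR 2's quality modes; DLSS uses roughly the same ones, so treat the exact numbers as approximate:

```python
# How much shading work upscalers save at a 4K output resolution.
# Per-axis scale factors below are the commonly quoted FSR 2 ratios
# (DLSS is roughly the same) - treat them as approximate.

OUT_W, OUT_H = 3840, 2160
MODES = {
    "Quality": 1 / 1.5,
    "Balanced": 1 / 1.7,
    "Performance": 1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

for mode, scale in MODES.items():
    w, h = round(OUT_W * scale), round(OUT_H * scale)
    saving = 1 - (w * h) / (OUT_W * OUT_H)
    print(f"{mode:17s}: renders {w}x{h}  (~{saving:.0%} fewer pixels shaded)")
```

Even Quality mode shades roughly half the pixels of native 4K, which is exactly the kind of saving path tracing needs.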
 
I think you can change DLSS versions; people grab newer versions from one title and use them in other games.
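
The swap is literally just replacing one file. A minimal sketch of what people do by hand (the paths are hypothetical placeholders, and whether a given game accepts a newer nvngx_dlss.dll is hit and miss):

```python
# Sketch of the DLSS DLL swap: back up the game's copy of nvngx_dlss.dll and
# drop a newer one in its place. Both paths are hypothetical placeholders, and
# some games / anti-cheat setups will reject or overwrite a swapped DLL, so
# this is strictly at-your-own-risk.

import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\SomeGame")            # hypothetical install folder
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLL you sourced yourself

target = game_dir / "nvngx_dlss.dll"             # some games keep it in a subfolder
if target.exists():
    shutil.copy2(target, target.with_name(target.name + ".bak"))  # keep the original
shutil.copy2(new_dll, target)
print(f"Replaced {target} (original backed up alongside it)")
```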

I went from a 980 Ti all the way until the 4090 came out.


GPUs can physically last a long time. That 980 Ti still works 100%; it's sitting in its box as a backup for some reason.
My first run through Cyberpunk was on the 980 Ti and it was more than playable.


In the past, GPUs usually became obsolete because motherboards changed interfaces, from PCI to AGP and then to the PCIe we have now.
Also, when pixel shaders came along, if your card didn't support them the games wouldn't even load; you'd just get an error like "requires pixel shader version 1.4".

I'm sure something like a 1080 is still capable today at 1920x1080.

The 4080 and 4090 will probably still be good enough even in 4-5 years' time, unless there's some massive leap forwards.


Devs don't push hardware like they used to, because the people who can't run it will review-bomb the game.

These days there'd be no "can it run Crysis?" memes.
 
Raw performance is still far and away the most important thing.

I wonder if longevity will increase due to the slow rate of improvement in performance per £ in recent years? We're probably already at the point where most people are happily getting 5-10 years out of a card.
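
A quick bit of arithmetic shows why the slowdown stretches upgrade cycles (the improvement rates here are made up purely for illustration):

```python
# Illustrative only: how long until a new card offers double the
# performance-per-£ of your current one, at different (made-up) yearly
# improvement rates. The rates are assumptions to show the trend, not data.

import math

TARGET = 2.0  # upgrade when a new card is twice the value of the old one

for yearly_gain in (0.30, 0.15, 0.08):
    years = math.log(TARGET) / math.log(1 + yearly_gain)
    print(f"{yearly_gain:.0%}/year perf-per-£ gain -> ~{years:.1f} years to double")
```

At 30% a year you'd double your money's worth in under 3 years; at 8% a year it takes roughly 9, which lines up with people happily sitting on cards for most of a decade.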
 