Future-proofing your GPU.

I don't think it's ever been a thing in tech, tbh. I started with a 486 SX25 and I've never bought anything that was "future proof".

You just upgrade or you turn the "quality" sliders down (quotes because who knows what they actually do a lot of the time...)

I would say it was somewhat achievable a while ago, mostly before DX12 came along (we're only just now starting to see DX12 in better shape, but now we're kind of back to square one with UE5...). Alex and Richard summed it up well: you can have the best hardware there is, but it's going to do jack **** in games which aren't optimised properly.


Ultimately, the real future proofing nowadays is as shown: wait for games to get patched, or rely on tech like upscaling and frame gen to bypass the issues on day one.
 
I suppose if you bought the absolute best GPU one year, and would be happy with mid-range perf in 3 years' time, it's possible. But who does that?

People who are happy with mid-range perf (me) probably buy mid-range cards, and people who buy top-end this year, will they really wait until their card is effectively mid-range to upgrade? If they've got the money...

I don't know, man. I guess you wouldn't expect me as a mid-range bargain-seeker to ever be future proof (and never have been), but those at the absolute cutting edge, do they really wait until their card is mediocre to upgrade again? Answers on a postcard :)
 
However, I wanted to ask for your thoughts on what you look for when buying a new graphics card. Is it the hardware, brand recognition/trustworthiness, the build, or cooling and thermals?
I tend to go for the best-performing hardware first, then acoustics.

Nvidia has been my go-to brand for many years simply because they've produced the most powerful graphics cards. The reference cards (Founders Edition now) are also extremely good and a viable option, whereas before I'd never have looked twice at them.

I think that comes down to the improved build quality, and the next-to-no gains that AIB cards give for overclocking.

In terms of AIB cards, nowadays it's MSI or ASUS. Years ago it used to be EVGA, back when they made high-end cards.

I'm not loyal to a brand, though. If Intel released the most powerful graphics card next year, then that's where I'd go.

My biggest issue that I have to deal with is FOMO.

If Nvidia released a 4090 Ti tomorrow that was 5% faster, I'd be desperate to upgrade to it. Silly really, but I can't help it...
 
First and foremost, excuse any technical naivety on my part, and forgive me if this question seems a bit silly or if there's already a thread discussing this.

However, I wanted to ask for your thoughts on what you look for when buying a new graphics card. Is it the hardware, brand recognition/trustworthiness, the build, or cooling and thermals?

How much do innovative technologies like upscaling (FSR and DLSS) factor into your purchasing decision?

Regarding the software side of things and firmware updates, are you conscious of how each "team" performs, and how much of a factor is that in your decision?

These are some of the first things that I thought of; however, I'm sure there will be many who are able to come up with some very pertinent, nuanced, and insightful points to add to this discussion.

I invite you to post some of your thoughts, or alternatively to (:cry:) ridicule me for being silly for inquiring about this, as on the face of it the answer is self-evident: get the best GPU you can at the time.
I tend to expect a new GPU purchase to last me 3 years, or to skip the next generation (depending on when I buy it). If I buy close to launch then it should last me a minimum of 3 years, as that's when game developers start building their games around the later generations (although more recently they seem to think everyone is running 4090s). My older cards used to last me a bit longer, pre-GTX 10xx series.

It also depends on the games you play. If you like to play all the latest games at the highest settings, you'll probably get less life out of the GPU. The games I play don't seem to stress a 3080 Ti with everything turned on at 5120x1440, but other games would make me turn most of the eye candy off and use DLSS at a more aggressive scaling mode to run smoothly at that resolution.

I still intend to make the 3080 Ti last until the wife pesters me for it for her system, or until 3 years pass and I can buy myself a better one (should I need it, but then again, when has that stopped people buying a shiny new product?). I tend to stick to specific games, especially multiplayer ARPGs, for many years, or until my friends stop playing and I'm left alone on the server (hasn't happened yet, but it will one day), so my GPU stays relevant to me longer than it would if I jumped from new game to new game.
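As a rough illustration of what that DLSS scaling actually means at 5120x1440, here's a quick sketch. The per-axis scale factors are the commonly quoted ones for DLSS's quality modes, so treat them as assumptions rather than vendor-confirmed figures:

```python
# Quick sketch of what upscaling renders internally at 5120x1440.
# Scale factors below are the commonly quoted per-axis ratios for
# DLSS's modes (assumptions, not vendor-confirmed figures).
MODES = {
    "Quality":           0.667,  # roughly 2/3 of native per axis
    "Balanced":          0.580,
    "Performance":       0.500,  # half per axis = a quarter of the pixels
    "Ultra Performance": 0.333,
}

def internal_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Scale is applied per axis, so pixel count shrinks by scale ** 2."""
    return round(width * scale), round(height * scale)

for mode, scale in MODES.items():
    w, h = internal_resolution(5120, 1440, scale)
    print(f"{mode:>17}: renders {w}x{h} ({scale ** 2:.0%} of native pixels)")
```

The point being that "a higher scaling rate" quietly buys you headroom by rendering a fraction of the native pixels, which is why it stretches a card's useful life.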
 
The 3080 I bought over 3 years ago still can't be beaten by any of the new stuff at £649 or less, so I'd consider that a future-proofed purchase. I can't imagine we'll be saying the same about any of the current-gen stuff from AMD or Nvidia in 3 years' time.

 
I'm in exactly the same boat. Upgrading from a 2080, my net spend was 159 quid to get this card; it was during those crazy times when people paid top dollar for cards, and I sold my 2080 for 490 quid (490 + 159 covering the £649 the card cost). I've loved this card and still do.

Classic case of 'no value in the market' for either of us.
 
It's a struggle to even want to future-proof things now when you look at how little it takes to run all these great indie games compared to Cyberpunk, Battlefield 2042, Starfield etc.

There are some poor people out there who probably upgraded for Redfall thinking the worst was over. I don't normally give to people in need, but if someone told me they went all-in on a new card for Redfall, pain etched on their face, I don't know how I'd react. These titles come out and what can you say to them? "Don't worry mate, I'm sure it will be fixed in a patch, or ten. You did great future-proofing, you've got enough VRAM there to last you so many more DOA early-access titles", when you know, deep down, it's Redfall they wanted.
 
I find it both incredibly tragic and hilarious at the same time that a 4070 can't even keep up with a 3080.
 
I think it's the first time ever that a 70-class card hasn't been able to beat the previous gen's 80; traditionally it's the 60-class which matches or beats the previous 80.

It's also sad that the RX 6700 XT/RTX 3060 Ti are not much slower than the RTX 4060 Ti, which is probably going to be one of the most popular sub-£400 cards.
 
The nomenclature is all out of whack for reasons that are hopefully obvious to all...

...to con 'noobs' into buying 'new' stuff that's really last gen. It's wrong, but not illegal.

They are all at it in one way or another; that's why you have to look at (for example) gaming benchmarks in terms of FPS gain, or minimum-FPS lows, at your desired resolution.
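As a toy example of that kind of comparison (all figures here are invented placeholders, not real review data):

```python
# Toy comparison of two cards by average-FPS gain and 1% lows at a
# target resolution. The numbers are placeholders, not real review data.
def pct_change(new: float, old: float) -> float:
    return (new - old) / old * 100.0

# Hypothetical figures for a previous-gen 80-class vs a new 70-class card:
old_card = {"avg_fps": 72.0, "low_1pct": 55.0}
new_card = {"avg_fps": 78.0, "low_1pct": 54.0}

print(f"Average FPS gain: {pct_change(new_card['avg_fps'], old_card['avg_fps']):+.1f}%")
print(f"1% low change:    {pct_change(new_card['low_1pct'], old_card['low_1pct']):+.1f}%")
# A small average gain paired with worse lows is exactly the kind of
# 'new but really last gen' result the naming can hide.
```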
 
What games do you play?

I don't play any game with RT, for example, and won't for the next 10 years with my main game. CPUs hold up better, and AMD's X3D chips allow for a better gaming experience. As I play at 4K, I upgraded from a 6700 XT to a 6950 XT thanks to a bang-for-the-money Black Week deal. In essence, I get silent 4K performance when adjusted for my usage. I would usually upgrade the CPU more often than the GPU, but that was before 4K, which required a GPU upgrade. Using 32GB of RAM and a Samsung 990 Pro SSD also makes for nice Windows usage. Intel Optane would be faster, but it would also cost 1000 euros more for fewer GB (so not happening).

Looking forward, the games I play mean upgrades will slow down for the next 3+ years.
 
Best way to "future proof" is to buy midrange, then sell and buy a new midrange card whilst the old one is still worth something. I'd say every other generation. You're then buying the future whilst losing as little money as you can.
 