Future-proofing your GPU.

First and foremost, excuse any technical naivety on my part, and forgive me if this question seems a bit silly or if there's already a thread discussing this.

However, I wanted to ask for your thoughts on what you look for when buying a new graphics card. Is it the hardware, brand recognition/trustworthiness, the build, or cooling and thermals?

How much do innovative technologies, like upscaling techniques such as FSR and DLSS, factor into your purchasing decision?

Regarding the software side of things and firmware updates, are you conscious of how each "team" performs, and how much of a factor is that in your decision?

These are some of the first things that I thought of; however, I'm sure there will be many who are able to come up with some very pertinent, nuanced, and insightful points to add to this discussion.

I invite you to post some of your thoughts, or alternatively (:cry:) to ridicule me for asking something silly, as on the face of it the answer is self-evident: get the best GPU you can at the time.
 
Well, the first thing to know is that GPUs are not built with longevity in mind.

For example, you could buy a 4090 today with a hypothetical 100GB of VRAM and think you're sorted for VRAM for a good while. Right?

Well....

For example:

Next year, when Nvidia release the 5000 series, DLSS 4 could be locked to the 5000 series only.

On top of that, let's say DX13 launches and needs a certain something that only the Nvidia 5000 series and AMD 8000 series cards support due to hardware requirements.

Etc etc etc.

What you buy when you purchase a GPU is a certain performance envelope and a set of supported features.

This is for NOW, not for tomorrow's games or APIs, because all of that changes constantly.

So, in summary, don't buy a GPU based on whether it has longevity, because none of them do.

Buy what meets your needs as of now.

---

Just to clarify... of course it doesn't hurt to have one eye on the future; just realise that GPUs age quicker than any other PC component, especially in terms of feature set.
 
I tend to just get an upgrade. I'm way behind everyone else; I've not long had a GTX 1070, which replaced a GTX 970.
I've been buying second-hand the last few years. Maybe in a couple of years (if it lasts that long) I'll do some research into what's good then.
 
Well, the first thing to know is that GPUs are not built with longevity in mind... (snip)
I understand that completely, and it's certainly a fair way of explaining how technological advancement works. However, there are so many variables just in the VRAM discussion, apropos the adequate amount one should have in 2023.

I totally get your point, but a lot of people are putting a significant amount of money not just into now, or next month, but into 3-4 years down the line, and those people would surely be looking for things like firmware updates and new technologies; VRR, DLSS, FSR et al., no?

Thanks for your input, mind. Interesting thought.
 
I tend to just get an upgrade... (snip)

Wow. This is awesome. One of the reasons I started this thread is that I've seen numerous people who are, by today's standards, using very weak GPUs, and yet they've been able to make it work.

I'd love to know about your experiences.

What are the main choke points? I'm guessing it's VRAM?
 
However, I wanted to ask for your thoughts on what you look for when buying a new graphics card. Is it the hardware, brand recognition/trustworthiness, the build, or cooling and thermals?
1. What resolution is it marketed for?
2. Can it actually play games at that resolution?
3. Is the power consumption and noise within reason for the performance? (See the rough sketch below.)

Don't care about the other stuff, except for: is it a brand/model likely to break, or have a useless warranty?
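
A rough way to sanity-check point 3 is frames per second per watt. Here's a minimal sketch; the numbers are made-up placeholders, not real benchmark figures:

```python
# Rough fps-per-watt comparison between two hypothetical cards.
# All numbers are placeholders, not real benchmark results.

def fps_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Average frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

card_a = fps_per_watt(avg_fps=90, board_power_w=250)   # hypothetical card A
card_b = fps_per_watt(avg_fps=100, board_power_w=320)  # hypothetical card B
print(f"A: {card_a:.2f} fps/W, B: {card_b:.2f} fps/W")
# A: 0.36 fps/W, B: 0.31 fps/W -> the "faster" card costs you more
# heat and noise for an ~11% fps gain.
```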

How much do innovative technologies, like upscaling techniques such as FSR and DLSS, factor into your purchasing decision?

Regarding the software side of things and firmware updates, are you conscious of how each "team" performs, and how much of a factor is that in your decision?
Depends on the intended usage and the price.

- If I wanted to buy Intel, their lack of consistency and poor performance in older (e.g. DX9/DX10) games might be a concern for a retro-oriented PC.
- If I wanted to use my PC for workstation stuff, I'd prefer Nvidia over AMD for CUDA, performance, and driver support.

If I'm buying something high-end to play with all the toys, then I'd care about ray tracing, but if the card is £200... who cares.

FSR and DLSS: not bothered.
 
Wow. This is awesome... (snip) What are the main choke points? I'm guessing it's VRAM?
I don't currently play anything too modern, so the 1070 seems to have plenty of go.
 
1. What resolution is it marketed for?... (snip)
I actually started caring about power consumption and noise a lot more than I used to. So much so that I went with the Hellhound 7800 XT over the Nitro.

Regarding the warranty, well, there is a thread on this forum about whom to trust and whom to be more wary of, so on that there's a little pushback from me.

I personally don't pay much attention to rendering tech myself. I mean, I do, but I just bought an AMD card, which objectively has the less mature upscaling technology.
 
It really depends on what you want to play, at what resolution, and what budget you're on. But for sure there's no such thing as future-proof; current-gen consoles can be a somewhat useful guide to component selection, but my honest suggestion is to spec your PC for the games you want to play now.

Rule 1 is always: don't spend more than you can afford.
Rule 2 is that a game is more than just its max graphics settings. A friend of mine is still using a GTX 970 and we got through the whole of BG3 just fine; he didn't have any significant FPS issues, and honestly it looked fine on his machine.
But if you want to play Alan Wake 2, no amount of tweaking will get it running playably on a GTX 970. A more modern GPU is mandatory due to architecture features and performance minimums, so that should be a consideration.
Rule 3 is: do you really need to max your FPS? If you're playing an eSports title, probably yes, but those are typically very lightweight and easy to run. Most other games are actually fine at 30fps.

If you're in the position that PC gaming is your main hobby and you're fortunate enough to afford more or less anything within reason, then there's still no real point in upgrading incrementally, IMO, so I look for a 100% uplift on the GPU and 50% on the CPU.
A 30% uplift is only going to take you from 60fps to 78fps, which is basically equivalent to changing one or two graphics settings, e.g. going from DLSS Quality to DLSS Balanced.
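
To make that arithmetic concrete, here's a quick back-of-the-envelope sketch (the fps figures are just the ones from the example above):

```python
# A 30% generational uplift only moves 60fps to 78fps, roughly what
# dropping one upscaler preset buys you anyway; a 100% uplift doubles it.

def fps_after_uplift(base_fps: float, uplift_pct: float) -> float:
    return base_fps * (1 + uplift_pct / 100)

print(fps_after_uplift(60, 30))   # 78.0  -> modest 30% bump
print(fps_after_uplift(60, 100))  # 120.0 -> the ~100% uplift worth buying
```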
 
No-one knows the future, and we all live in the now.

There's no way to know what, say, GTA6 will require, and whether the advertised min specs will be accurate.

If you can get decent performance in the games you play now at your monitor's native resolution, that's enough.
 
@Apex

It's not the first time VRAM's been a volatile subject; I can remember the '1GB is plenty' days, when SLI/CrossFire was affordable. Well, CrossFire was, anyway.

Planned obsolescence has been, and always will be, a vendor option to encourage you to upgrade; the only difference now is that it's deployed through new features as well, backed up by reasons.
 
I understand that completely... (snip)
Of course there are loads of variables, as you said. The people piling money into cards for 3-4 years down the line will have to come to terms with making compromises at some point, and no amount of VRAM is going to change that, as even the 4090 is struggling in some games. It sounds great in theory to keep a card for 4 years, but going by past data, I don't think there's one card that could still maintain its release-day settings in the latest games 4 years later.

Also, people are not stuck with the card they have; they're free to change their mind at any point and sell it for another if it's not meeting their expectations, or in some cases because they just want something new to play around with.
 
Future-proofing? No such thing these days, as shown when even a 4090 ***** the bed without upscaling and frame gen.

Things you "can" do though:

Buy a GPU that can provide a "consistently" good experience with the things that offer actual worthwhile benefit in broken and/or demanding games, i.e. a good upscaler (one that's usable at lower presets and lower resolutions and still looks as good as native, or, as evidenced many times by several sources, better than native in most cases, provided you don't sabotage yourself by using an old version of said upscaling tech) and frame gen (the kind that doesn't get you banned and is in a large selection of games). Those who bang on about wanting native with no upscaling aren't running at high refresh rate and high res without making substantial sacrifices to graphics.
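
For a rough idea of what those presets mean in practice, here's a small sketch using the per-axis scale factors commonly documented for DLSS/FSR 2 presets (treat the exact factors as assumptions):

```python
# Internal render resolution for common upscaler presets at 4K output.
# Per-axis scale factors as commonly documented for DLSS/FSR 2:
# Quality ~0.667, Balanced ~0.58, Performance ~0.5 (assumed here).
PRESETS = {"quality": 0.667, "balanced": 0.58, "performance": 0.5}

def internal_res(width: int, height: int, preset: str) -> tuple[int, int]:
    scale = PRESETS[preset]
    return round(width * scale), round(height * scale)

for preset in PRESETS:
    print(preset, internal_res(3840, 2160, preset))
# quality (2561, 1441)     -> roughly 1440p rendered internally
# balanced (2227, 1253)
# performance (1920, 1080) -> 1080p rendered internally
```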

If RT/PT is important, ensure you buy a GPU with capable hardware support, given all the games with RT these days. As evidenced, RDNA 3, due to its lesser RT capabilities, performs like a top-end GPU from 3 years ago with it turned on. So if future-proofing is important (and RT/PT is the future), it makes most sense to get a GPU that doesn't have to sacrifice here.

Don't buy into VRAM being the be-all, like some have fallen for, as shown by the 3090: all the VRAM in the world, but not enough grunt to brute-force broken and/or demanding games. A general rule for 2023 onwards, though, is don't spend >£400 on an 8GB GPU for 1440p/4K gaming, given you could get an 8GB £450 GPU back in 2020. I imagine if you're buying such a GPU nowadays, you'll have to accept that settings will need to be reduced anyway due to lack of grunt.
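
As a rough illustration of why resolution alone doesn't exhaust a VRAM budget (it's texture pools and streaming that do), here's a simplified sketch of render-target memory at common resolutions; the target count and bytes-per-pixel are assumptions, not engine measurements:

```python
# Very rough estimate of per-frame render-target memory at a given
# resolution, assuming ~5 G-buffer targets at 8 bytes per pixel.
# Simplified assumptions, not engine measurements.

def render_target_mb(width: int, height: int,
                     targets: int = 5, bytes_per_px: int = 8) -> float:
    return width * height * targets * bytes_per_px / 1024**2

for w, h, label in [(1920, 1080, "1080p"), (2560, 1440, "1440p"),
                    (3840, 2160, "4K")]:
    print(f"{label}: ~{render_target_mb(w, h):.0f} MB")
# 1080p: ~79 MB, 1440p: ~141 MB, 4K: ~316 MB -- resolution alone
# doesn't fill 8GB; texture pools and streaming are what do.
```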

And most importantly, avoid games that are obviously broken at launch and only fixed/improved several months later, as shown by patch notes and videos comparing the patches, e.g. Starfield, TLOU, and Hogwarts, to name just 3.
 
Future proofing? No such thing... (snip)
I don't think it's ever been a thing in tech, tbh. I started with a 486 SX25 and I don't think I've ever bought anything that was "future proof".

You just upgrade or you turn the "quality" sliders down (quotes because who knows what they actually do a lot of the time...)
 
Of course there are loads of variables... (snip)

Absolutely no one thinks you don't have to turn settings down 4 years later.

Case in point: an 8GB 3070 Ti is now relegated to low/mid textures on some AAA titles going forward, even at 1080p, and turning the texture slider down from high to medium/low arguably has a higher visual impact than most other settings.
 