
Poll: What will be your 3 main factors for deciding on your next GPU?

What will be your 3 main deciding factors for your next GPU (ignoring external factors)?


Total voters: 310
Rasterization performance
Ray tracing performance
VRAM amount
Feature set on offer, i.e. ShadowPlay, DLSS, FSR
Power consumption & efficiency

Have left out price per performance as that is obviously the main thing for everyone and goes without saying. It would be nice to have all of them, but as we all know, every GPU has some form of compromise, so this thread/poll is purely about what is most important to you, as of now and for the future, when it comes to your next purchase.

Any others or better poll choice suggestions?
 
So it seems the OP options are good then?

Obviously you can list your order of preference, but the poll will just be the 1 vote, to see what the main thing people are wanting next time round is.

Was thinking of adding availability but wanted to keep it purely down to the gpu and not external factors.
 
Ok, it seems you can give a certain number of votes as opposed to just the 1.

So maybe top 2 or 3 deciding factors, i.e. you get 2 or 3 votes rather than only being able to vote for 1 option?
 
Yup, must admit it's surprising to see VRAM so high. So far it seems raster, VRAM and power efficiency are the most important. If that's the case, it kind of makes you wonder what hope AMD have got, given RDNA 2 was better than Ampere on those 3 fronts, yet....

[Poll results screenshot]

Must be that "mindshare"

:cry:
 
That'll have something to do with it, no doubt. Other issues include availability, and also release timing - if the 6800 XT had been out first, I'd probably have got that. In my case, I was building my new PC about 3 weeks prior to the GPU launches - Nvidia released first, so it's the one I plumped for.

Yes, it took months to arrive, but I wasn't aware of what the supply issues were going to be.

Same here, had there been a chance of getting the 6800 XT for MSRP, I would have got it. Glad I didn't, looking back, as the 3080 and its strengths for my gaming needs (DLSS and ray tracing) ended up being invaluable over the past 1-2 years. It's also holding its value better, as is the norm when it comes to second-hand AMD vs Nvidia.
 
Bumping this since we now have the new GPUs from both AMD and Nvidia.

I suspect people's wants will still be the same, so in theory we should see RDNA 3 selling far more than Ada, as AMD have the better option for the top 3 criteria:

- raster
- vram
- power efficiency (at least for raster)

And obviously cheaper too, which is the main factor outside of the poll options.
 
I think price is important, but not when talking about the high end where GPUs cost £1k+. If people are willing to spend that, I'm sure they're willing to spend another couple/few hundred, especially if it means getting a better/more complete package. It's the enthusiast end of the market and people will be willing to get whatever is best no matter the cost.

I think AMD have the top 3 results nicely covered with RDNA3.

They had these advantages with RDNA 2 over Ampere too, though, but external factors such as no UK store (the only choice was to pay crazy scalping prices) and very poor stock didn't help.
 
Having no store is less important this time around though, as Nvidia's cheapest next-gen card will be £1267. Last time the £649 3080 was the card of choice.

True, however, the 7900 XTX is going to cost in the same band according to Gibbo, and that's only if the pound doesn't drop, and for Black Friday deals - so prices outside of Black Friday will likely put it much closer to 4080 pricing.

7900 XTX reference at $999, with AIB reference cards around £1049-£1099 and custom cards around £1199-£1299

Of course the 4080 could also come in more expensive too.....

It will probably come down to the same situation as RDNA 2 vs Ampere, i.e. how much you value RT vs raster, or things like the feature set on offer. This time AMD have FSR 2, so I suspect RT won't hold as much weight as it did at the RDNA 2/Ampere launch.

This is complete rubbish - value for money still matters even if someone is an enthusiast. Also, the difference between a 7900 XTX and a 4090 isn't "another couple/few hundred", it's £600+.

Read some of the threads on here - some want the best no matter what, and people upgrade from top-end hardware even when there is no need for an upgrade. PC gaming as a hobby is actually pretty cheap when you compare it to many other hobbies.....

I consider myself an enthusiast and care about bang per buck too, hence why I am not considering either AMD's or Nvidia's new GPUs.

PS. I am also referring to the RDNA 3/7900 XTX competitor, which is the 4080, as per AMD's own words.....
 
So now that we have RDNA 3 and Ada out, for those who voted for power efficiency and are going to spend £1k+, which GPU will you be going for, and why?
 
Could we have a follow-on poll about the final purchase decision of those who voted for power consumption & efficiency? I would hypothesise that it was just a hygiene requirement that people might summarily ignore in favour of performance and price.

Would be interesting that tbh but someone else can create the thread :p

Power efficiency is the #1 - having seen what many 40-series people have been doing by power limiting their cards to between 60-80%, the cards are even more power efficient and run quieter/cooler. Guess this is one of the benefits of having a card that's so fast you can power limit it and still be on top of last gen's top-end cards.

I was playing in MSI Afterburner with my 3080 Ti FE recently and ran some benches in a modern game that uses both RT and upscaling to good effect, Dying Light 2, and found that I lose less than 10 fps by going from 100% power limit to 80%.

On a VRR monitor the framerate difference is a non-issue, so I think it's fair to say that shaving off a few degrees in heat, lowering fan speed and lowering wattage draw for the sake of "losing" a few fps but not actually noticing it thanks to VRR is a worthwhile exercise.
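To put rough numbers on the trade-off described above - this is a sketch, not a measurement: the 350 W stock limit matches the 3080 Ti FE spec, but the 110 fps baseline is an assumed figure purely for illustration:

```python
# Rough perf-per-watt gain from power limiting, using the figures quoted
# above (100% -> 80% power limit, <10 fps lost in Dying Light 2).
# ASSUMPTIONS: 110 fps baseline is illustrative, not a measured value.

def fps_per_watt(fps: float, watts: float) -> float:
    """Frames delivered per watt of board power."""
    return fps / watts

BOARD_POWER = 350.0                       # 3080 Ti FE stock limit, watts
stock = fps_per_watt(110.0, BOARD_POWER)
capped = fps_per_watt(110.0 - 10.0, BOARD_POWER * 0.80)  # 80% power limit

gain = capped / stock - 1
print(f"stock: {stock:.3f} fps/W, capped: {capped:.3f} fps/W, gain: {gain:.1%}")
# -> stock: 0.314 fps/W, capped: 0.357 fps/W, gain: 13.6%
```

So under those assumed numbers, an 80% cap buys roughly a 14% efficiency gain for an fps loss that VRR hides anyway.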

[Screenshot: RTX 3080 Ti MSI Afterburner power-limit benchmark]


Now if only the prices could drop....

That's why I am surprised to see so many going for the 7900 XTX over the 4080, tbh. Of course it is cheaper and offers similar raster perf., but long term with the current energy costs, that difference would be mitigated further unless you plan on only keeping it for a year or 2:

[TPU power-draw comparison charts]
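For a back-of-envelope sense of how much a power-draw gap actually costs, here's a sketch. All three inputs are assumptions: a ~50 W gaming-load gap (the rough XTX vs 4080 figure discussed later in the thread), a UK electricity price of 34p/kWh, and 20 hours of gaming a week:

```python
# Back-of-envelope yearly running-cost difference between two cards.
# ASSUMPTIONS: 50 W draw gap, 34p/kWh (rough 2022/23 UK price cap),
# 20 hours of gaming per week. Change the constants to suit.

WATT_GAP = 50           # extra draw of the less efficient card, watts
PRICE_PER_KWH = 0.34    # GBP per kWh
HOURS_PER_WEEK = 20     # assumed gaming time

kwh_per_year = WATT_GAP / 1000 * HOURS_PER_WEEK * 52
cost_per_year = kwh_per_year * PRICE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year, about £{cost_per_year:.2f}/year")
# -> 52 kWh/year, about £17.68/year
```

Worth noting the result is modest per year, so how much the efficiency gap matters really does depend on how long you keep the card and how many hours it runs.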


Currently have my 3080 undervolted, so at max usage it doesn't go above 298 W; usually in most games it is around 280 W. Ampere really is an undervolting champ, and it looks like Ada is pretty damn good too.
 
Power efficiency/consumption isn't just about cost - it also means the card runs cooler, thus quieter (obviously better coolers help here, but if the card is efficient, the cooler doesn't need to be as good nor work as hard)
 
Something tells me all those who preached about power efficiency/consumption and bought a 7900 XTX won't be posting here now - that, or they'll change their votes.... :p ;) :D :cry:
 
Yeah, the XT seems to be the one with equivalent power draw to the 4080. The XTX seems on average to be using ~50 W more under load. However, if we compare the 7900 XTX to its previous-gen sibling, depending on the game it can show a drop - I think Eurogamer showed, for example, a 27% reduction vs the 6900 XT in Control, and 27% in Dying Light 2. So it depends on the angle people are coming from. Nvidia fans will say it's inefficient and uses ~50 W more power; AMD fans will say it's more efficient compared to the 6900 XT. And the multi-monitor and video power draw? That has been identified as a driver bug.
Not wrong on a few of those points, but we had some people shouting from the rooftops about RDNA 2 power efficiency compared to Ampere in raster even though, as you can see from the TPU charts above, it wasn't that much different, especially when Ampere undervolted so well. RDNA 2 could be tweaked very well too, but didn't undervolt as well as Ampere, iirc.

It is just the usual spiel we see when it comes to Nvidia vs AMD: when one side does better, it's the best thing ever and the most important metric; then when the other side takes the lead in said area, suddenly it doesn't matter and the other side swears by it :D Has happened many times with tessellation, VRAM, DX11 and DX12 perf., power efficiency. I can't wait for when/if AMD smash Nvidia on RT :D
 
If they ever match or beat Nvidia on RT, it will be at a time when RT is as trivial a tech as we consider HBAO etc. - remember when those effects were framerate hogs back in the early days?

Also Intel will be a genuine high end GPU competitor then too so the landscape will be considerably different!

Intel are definitely one to look out for! They're getting some big improvements already, and their RT plus XeSS is pretty good - not bad considering it's their very first dGPU attempt :cool:

That isn't true though. Loads of AMD users, including myself, have said they failed, as per my bit above. Although I also don't see the power efficiency of the 3000 series. My ex-housemate had a 3060 Ti that, stock to stock, pulled 75 W more for 35% less performance. Those graphs make zero sense to me. Gaming, my 6900 XT in quiet mode pulls 225 W, which is 80 W less than what they claim in those tables. I don't really get that. How much performance do you lose in gaming to get a 3080 down to that wattage?

Obviously I'm not referring to "every" AMD fan/user. As said in another thread, even on AMD's Reddit there's a good number of AMD owners/fans calling AMD out on their claims and what is a poor product in the grand scheme of things. OCUK forum members are heavily biased towards AMD.

Not very much:


Also, make sure you are testing demanding games with RT maxed, e.g. CP 2077, Metro 2 and DL 2 make my GPU work far harder than running something like Valheim, even if that game is running at 140+ fps.
 
Ah, but once again, Doom Eternal's RT upgrade didn't cut any corners :p



Even though I didn't like the gameplay, I marvelled at the way the engine ran with everything maxed.

It is pretty limited RT, at least when compared to the likes of CP 2077, Metro EE, Chernobylite, DL 2, etc.

Although it is a very good showcase of RT reflections, at least once you get to the metal/glass building/areas.


Speaking of RT reflections, just completed Miles Morales and wow, a lot of that game was made with RT reflections in mind, the last few missions especially. Combined with HDR and QD-OLED, a sight to behold :cool: Wish there was a way to capture HDR screenshots without them being greyed out when uploading (I think there's a way to convert them before uploading, but too much faff!)
 
It shows how a good efficient implementation of RT can work really well even for RDNA 2 hardware that the consoles have. I feel that as consoles adopt more of these technologies it’s only going to benefit RDNA more and really close that gap. Once RDNA 3 consoles appear then things will get very interesting.

Been hearing that for ages now, ever since AMD first got into consoles :p If anything, Nvidia perform better in the lazy console ports, and even the good ones like all of Sony's.

Consoles are badly due an upgrade/refresh now though.
 
Bump time.

A lot has changed since 2022:

- RT adoption rate exploding and exceeding expectations, especially with more RT-only games appearing and nearly every dev jumping to UE 5
- RDNA 3 power efficiency claims ending up being a wet fart
- AMD coming out with FG, which is arguably up there with DLSS FG when implemented well, as well as having a driver injection method and of course mods and other 3rd-party options
- upscaling improvements for both camps
- titles launching in a much better state over the last year, especially on the VRAM optimisation front
- Nvidia launching new features such as Ray Reconstruction and RTX HDR, and AMD with FMF and Anti-Lag

So I'm wondering if people have changed their minds now (you can change your vote in the poll). For reference, this is how the poll looks as of Aug 2024:

[Poll results screenshot, Aug 2024]
 
As stated in the poll title, "no external factors", because as we all know, pricing is so hit and miss depending on AIB, etailer, MSRP and whether you can actually buy at MSRP, and it kind of goes without saying that price is the most important factor for the vast majority of people.
 