Poll: What will be your 3 main factors for deciding on your next GPU?

What will be your 3 main deciding factors for your next GPU (ignoring external factors)?

Total voters: 310
So now we have RDNA 3 and Ada out; for those who voted for power efficiency and are going to spend £1k+, which GPU will you be going for and why?
 
None, because they aren't power efficient. For the performance increase, neither the 7900 XTX nor the 4080 is actually any better than my 6900 XT in that regard, since that sits nicely at 220 W compared to 400-500 W otherwise, while appearing to be only about 30% slower in raster.

So add the silly prices and yeah, I shall be waiting for the next generation.
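
As a rough back-of-the-envelope check on that efficiency claim, here's a minimal Python sketch using the figures quoted above (220 W and ~30% slower for the 6900 XT, an assumed ~450 W midpoint for the new cards); all of the numbers are the post's estimates rather than measurements:

```python
# Rough perf-per-watt comparison using the figures quoted in the post above.
# All numbers are the poster's estimates, not benchmark data.

cards = {
    "6900 XT (tuned)": {"relative_perf": 0.70, "watts": 220},              # "about 30% slower"
    "7900 XTX / 4080 (as claimed)": {"relative_perf": 1.00, "watts": 450}, # "400-500 W"
}

for name, card in cards.items():
    per_100w = card["relative_perf"] / card["watts"] * 100
    print(f"{name}: {per_100w:.2f} relative performance per 100 W")
```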
 
Price and games are two big factors for me that aren't on the list. I don't foresee any games coming out before at least the next console generation that my current 6800 is going to struggle with, and I'd quite like to not spend more money on my next card than on my last one. So nothing this generation is even worth considering. Maybe next gen, but my guess is I won't be back in the market until at least the gen after next, maybe 2025/2026.
 
Could we have a follow-on poll about the final purchase decision of those who voted for power consumption & efficiency? I would hypothesize that it was just a hygiene requirement that people might summarily ignore in favor of perf and price.
 
Power efficiency is #1 for me. Having seen what many 40 series people have been doing by power limiting their cards to between 60-80%, the cards become even more power efficient and run quieter/cooler. Guess this is one of the benefits of having a card that's so fast that you can power limit it and still be on top of last gen's top end cards.

I was playing in MSI Afterburner with my 3080 Ti FE recently and did some benches in a modern game that uses both RT and upscaling to good effect, Dying Light 2, and found that I lose less than 10 fps by going from a 100% power limit to 80%.

On a VRR monitor the framerate difference is a non-issue, so I think it's fair to say that shaving off a few degrees of heat, lowering fan speed and lowering wattage draw for the sake of "losing" a few fps, but not actually noticing it thanks to VRR, is a worthwhile exercise.

[Image: RTX-3080-Ti_MSIAB_Power-Limit-Bench.jpg]
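
To put rough numbers on that trade-off, here's a minimal sketch assuming the 3080 Ti FE's 350 W board power and purely illustrative fps values (the measured ones are in the screenshot above):

```python
# Illustrative power-limit trade-off for a 3080 Ti FE (350 W board power).
# The fps figures are placeholders; the measured ones are in the screenshot.

board_power_w = 350

scenarios = {
    "100% power limit": {"watts": board_power_w * 1.00, "fps": 95},  # assumed baseline
    "80% power limit":  {"watts": board_power_w * 0.80, "fps": 87},  # "<10 fps" lower
}

for name, s in scenarios.items():
    print(f"{name}: {s['fps']} fps at {s['watts']:.0f} W "
          f"({s['fps'] / s['watts']:.3f} fps per watt)")

fps_drop = 1 - scenarios["80% power limit"]["fps"] / scenarios["100% power limit"]["fps"]
power_drop = 1 - scenarios["80% power limit"]["watts"] / scenarios["100% power limit"]["watts"]
print(f"~{fps_drop:.0%} fewer fps for {power_drop:.0%} less power")
```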


Now if only the prices could drop....
 
A follow-on poll like that would be interesting tbh, but someone else can create the thread :p

On the power efficiency point, I'm surprised to see so many going for the 7900 XTX over the 4080 tbh. Of course it is cheaper and offers similar raster performance, but long term, with current energy costs, that price difference would be mitigated further unless you plan on only keeping it for a year or two:

[Images: TPU power consumption and efficiency charts]

Currently I have my 3080 undervolted, so at max usage it doesn't go above 298 W; in most games it's around 280 W. Ampere really is an undervolting champ, and it looks like Ada is pretty damn good too.
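
To put a rough figure on how much of that price difference the running-cost gap would actually claw back, here's a minimal sketch, assuming a ~50 W draw gap under load (the figure quoted later in the thread), a ~34p/kWh tariff and 20 hours of gaming a week; all three inputs are assumptions to adjust to your own usage:

```python
# Rough running-cost gap between two cards over the ownership period.
# All inputs are assumptions: adjust the wattage gap, tariff and hours to taste.

watt_gap = 50          # extra draw under load, W (~XTX vs 4080 figure quoted later)
price_per_kwh = 0.34   # GBP per kWh, assumed UK tariff
hours_per_week = 20    # assumed gaming time
years_owned = 3

extra_kwh = watt_gap / 1000 * hours_per_week * 52 * years_owned
extra_cost = extra_kwh * price_per_kwh
print(f"Extra electricity over {years_owned} years: {extra_kwh:.0f} kWh = £{extra_cost:.0f}")
```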
 
Yup. I can see myself getting a 4080 if the price drops to the ~£800 mark and simply power limiting it to 80%, then sitting back all smug because it's quiet and cool and draws less power than my 3080 Ti FE while still being faster at the same things, and, where applicable, has fake frames to double the framerate further :p
 
You're absolutely right about VRR. Without an FPS counter up, I barely notice whether I'm getting 50-60 FPS in CP2077 or 144+ FPS in lighter games.

Funny you end your post saying "if only prices drop"... you clearly don't need an upgrade :p

I'd love to see 4080s drop to ~£800... but I just can't see that happening, not anytime soon. We didn't see that kind of discount even on the terrible-value and now redundant 3090 Ti.

With the performance of the 7900 XTX as it is... I doubt Nvidia will lower the price much at all. If anything, this is pushing folks right up to the 4090, as it's the "best value" despite being £2k+ for some.


Going back to the original question... I just want a card with absolutely zero coil whine across the board. I'm sick of this lottery. I love all these chunky coolers we get now, with even FE/reference cards having nice quiet coolers... but it's all for nothing with these whiny cards... I'd rather have fan noise than whine.
 
You misunderstood my context! I would get a 4080 at around £800 for the power efficiency gains (it would still be faster than my current card even when manually power limited to around 80%, say) - I'm all about that quiet, low power draw yet still powerful gaming experience :D

And after selling the 3080 Ti, it would only be a small amount on top to get the 4080, assuming it ever drops to the £800 mark lol.
 
But what is power efficiency really? Money saved on power. How long is it going to take to pay off even the best-case price difference between the two cards in power savings?

If the 4080 dropped to £800... your 3080 Ti is going to be worth less than half that used. Even if you managed to get £500 for it at that point, that's £300 in "savings" needed just to break even versus keeping the 3080 Ti longer.

My point being... when spending £800+, £1,000+ or a nuts £2,000+, who really cares about power consumption?
At 400 W, a GPU costs about 14p an hour to run (at roughly 34p/kWh)...
If you gamed for 40 hours a week, 50 weeks of the year, you're still "only" at about £272... compared to the purchase cost?

Let's say you saved 100 W by upgrading to a more efficient card and power limiting it... that saves you about £68 in the above example. If you spent £300 for that saving, you'd need well over four years to "save" anything...
And that's gaming a frankly insane 40 hours a week almost all year round. Game less and it takes far longer.

While it's nice to have a more efficient card, in terms of what really matters to most folks - £ in your pocket - it's a pretty negligible saving compared to the purchase cost.

So... the only real reason to ever upgrade is for the extra performance.
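
Spelled out as a quick calculation with the same inputs as above (400 W card, roughly 34p/kWh, 40 hours a week over 50 weeks, 100 W saved, £300 net upgrade cost):

```python
# The payback-period arithmetic from the post above, using its own inputs.

price_per_kwh = 0.34      # GBP per kWh (gives ~14p/hour at 400 W)
hours_per_year = 40 * 50  # 40 hours a week, 50 weeks a year
watts_saved = 100         # from a more efficient, power-limited card
upgrade_cost = 300        # net cost after selling the old card

hourly_cost_400w = 0.400 * price_per_kwh
annual_cost_400w = hourly_cost_400w * hours_per_year
annual_saving = watts_saved / 1000 * price_per_kwh * hours_per_year
payback_years = upgrade_cost / annual_saving

print(f"400 W card: £{hourly_cost_400w:.3f}/hour, £{annual_cost_400w:.0f}/year")
print(f"100 W saved: £{annual_saving:.0f}/year -> payback in {payback_years:.1f} years")
```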
 
Power efficiency/consumption isn't just about cost; it also means the card runs cooler and thus quieter (obviously better coolers help here, but if the card is efficient the cooler doesn't need to be as good nor work as hard).
 
The money side doesn't really matter; power efficiency means less heat generated, which means the GPU fans spin slower, which means a quieter top tier gaming experience.

My 3080 Ti FE at the moment, even with an undervolt or power limit set, still spins the fans up to 2,100 rpm in some games purely because they ramp up the core temp. I could set a custom fan curve, yes, but that would just lead to the core being throttled due to heat, which means a larger framerate drop from the bigger throttle. DLSS reduces the core stress, which reduces the fan speed, which is great, but not all games have DLSS, so having more efficient and more powerful rasterisation performance is also important.

The 40 series seems to be so relaxed on power that there's ample slack when lowering the power limit, and the cards still remain faster than last gen.

For me a modern-day gaming PC = powerful but quiet at the same time when gaming. The only audible thing in my PC when gaming is the GPU; my CPU AIO fans remain at low RPM, as do the case fans, and the CPU temp in gaming stays in the 40s.

With the 3080 Ti I have caught a whiff of that efficiency after power limits/UV are set in some games; I want that in all games now, which is where a 4080 (or 5080) would come into the picture. Whatever I get next I will not be running at stock, because I now know what is possible by simply power limiting the card while still gaining performance vs what I have now. It's a 100% win/win really.
 
Something tells me all those who preached about power efficiency/consumption and bought a 7900 XTX won't be posting here now, that or they'll change their votes... :p ;) :D :cry:
 
Yeah, the XT seems to be the one equivalent in power draw to the 4080; the XTX seems to use ~50 W more on average under load. However, if we compare the 7900 XTX to its previous-gen sibling, depending on the game it can show a drop - I think Eurogamer showed a 27% reduction vs the 6900 XT in Control, for example, and 27% in Dying Light 2. So it depends on the angle people are coming from: Nvidia fans will say it's inefficient and uses ~50 W more power; AMD fans will say that, compared to the 6900 XT, it is more efficient. And the multi-monitor and video playback power draw? That has been identified as a driver bug.
 
Not wrong on a few of those points, but we had some people shouting from the rooftops about RDNA 2 power efficiency compared to Ampere in raster, even though, as you can see from the above TPU charts, it wasn't that much different, especially when Ampere undervolted so well. RDNA 2 could be tweaked very well too, but didn't undervolt quite as well as Ampere iirc.

It is just the usual spiel we see when it comes to Nvidia vs AMD: when one side does better, it's the best thing ever and the most important metric; then when the other side takes the lead in said area, suddenly it doesn't matter and the other side swears by it :D Has happened many times with tessellation, VRAM, DX11 and DX12 perf., power efficiency. I can't wait for when/if AMD smash Nvidia on RT :D
 
If they ever match or beat Nvidia on RT, it will be at a point where it's as trivial a tech as we consider HBAO etc. these days - remember when those effects were framerate hogs back in the early days?

Also Intel will be a genuine high-end GPU competitor by then too, so the landscape will be considerably different!
 