
Poll: What will be your 3 main deciding factors for your next GPU (ignoring external factors)? (310 voters)
Yeah the XT seems to be the one equivalent to the 4080 power-draw-wise. The XTX seems to be using ~50W more on average under load. However, if we compare the 7900 XTX to its previous-gen sibling, depending on the game it can show a drop. I think Eurogamer showed, for example, a 27% reduction vs the 6900 XT in Control, and 27% in Dying Light 2. So it depends on the angle people are coming from. Nvidia fans will say it's inefficient and uses ~50W more power. AMD fans will say that compared to the 6900 XT it is more efficient. And the multi-monitor and video power draw? That has been identified as a driver bug.
I have a 6900 XT and I haven't seen anything close. The graphs up top don't even show the right power usage tbh. The 6900 XT stock uses 255W; put it in quiet mode and it sits at approx 225W and loses about 1%, so within margin of error of stock. But the FPS are similar to what they get at the 225W power setting. So the XTX at approx 393W (if we assume that's correct, or anything close to it) means ~75% more power for an average of 30% more performance over the complete gaming set. That to me doesn't seem great.

So no the new AMD stuff certainly doesn't seem efficient at all.
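Putting rough numbers on that comparison (treating the figures quoted above, the ~225W quiet-mode draw, the ~393W XTX draw and the ~30% average uplift, as given rather than measured):

```python
# Rough perf-per-watt comparison using the figures quoted in the post above.
# These are the poster's numbers, not benchmark data.

xt6900_power_quiet = 225.0   # W, 6900 XT in quiet mode
xtx_power = 393.0            # W, claimed 7900 XTX average draw
xtx_uplift = 1.30            # ~30% more performance on average

power_increase = xtx_power / xt6900_power_quiet - 1          # ~0.75 -> ~75% more power
perf_per_watt_ratio = xtx_uplift / (xtx_power / xt6900_power_quiet)

print(f"Power increase: {power_increase:.0%}")
print(f"XTX perf/W relative to 6900 XT (quiet mode): {perf_per_watt_ratio:.2f}x")
# -> roughly 0.74x, i.e. worse perf/W *if* those specific numbers hold
```

Of course that pits a tuned, quiet-mode 6900 XT against a stock XTX, which is part of why the two camps read the efficiency story so differently.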
 
Not wrong on a few of those points, but we had some people shouting from the rooftops about RDNA 2 power efficiency compared to Ampere in raster, even though, as you can see from the above TPU charts, it wasn't that much different, especially when Ampere undervolted so well. RDNA 2 could be tweaked very well too, but didn't undervolt as well as Ampere iirc.

It is just the usual spiel we see when it comes to Nvidia vs AMD: when one side does better, it's the best thing ever and the most important metric, then when the other side takes the lead in said area, suddenly it doesn't matter to them and the other side swear by it instead :D Has happened many times with tessellation, vram, dx11 and dx12 perf., power efficiency. I can't wait for when/if AMD smash Nvidia on RT :D
That isn't true though. Loads of AMD users, including myself, have said they failed, as per my bit above. Although I also don't see the power efficiency of the 3000 series. My ex-housemate had a 3060 Ti that, stock to stock, pulled 75W more for 35% less performance. Those graphs make zero sense to me. Gaming, my 6900 XT in quiet mode pulls 225W, which is 80W less than what they claim in those tables. I don't really get that. How much performance do you lose getting a 3080 down to that wattage in gaming?
 
If they ever match or beat Nvidia on RT, it will be at a time when it's as trivial a tech as we now consider HBAO etc. Remember when those effects were framerate hogs back in the early days?

Also Intel will be a genuine high end GPU competitor then too so the landscape will be considerably different!

Intel are definitely one to look out for! They're getting some big improvements already and their RT plus XeSS is pretty good, not bad considering it's their very first dGPU attempt :cool:


Obviously I'm not referring to "every" AMD fan/user; as said in another thread, even on AMD's reddit there are a good number of AMD owners/fans calling AMD out for their claims and what is a poor product in the grand scheme of things. OCUK forum members are heavily biased towards AMD.

Not very much:


Also, make sure you are testing demanding games with RT maxed; e.g. CP 2077, Metro 2 and DL 2 make my GPU work far harder than running something like Valheim, even if that game is running at 140+ fps.
 
The money side doesn't really matter; power efficiency means less heat generated, which means the GPU fans spin slower, which means a more silent top-tier gaming experience.

My 3080 Ti FE at the moment, even with an undervolt or power limit set, still spins the fans up to 2100rpm in some games purely because they ramp up the core temp. I could set a custom fan curve, yes, but that would just lead to the core being throttled due to heat, which means a larger framerate drop from the bigger throttle. DLSS reduces the core stress, which reduces the fan speed, which is great, but not all games have DLSS, so having more efficient and powerful rasterisation performance is also important.

The 40 series seem to be so relaxed with power that there's ample slack when lowering the power limit, and they still remain faster than last gen.

For me a modern-day gaming PC = powerful but quiet at the same time when gaming. The only audible thing in my PC when gaming is the GPU; my CPU AIO fans remain at low RPM, as do the case fans, and the CPU temp in gaming is in the 40s.

With the 3080 Ti I have caught a whiff of that efficiency after power limits/UV are set in some games; I want that in all games now, which is where a 4080 (or 5080) would come into the picture. Whatever I get next I will not be running at stock, because I now know what is possible by simply power limiting the card and still gaining performance vs what I have now. It's a 100% win/win really.
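For anyone wanting to see what a power limit or undervolt is actually doing in a given game, a minimal logging sketch along these lines works (this assumes an NVIDIA card with nvidia-smi on the PATH; it only reads values, it doesn't change any limits):

```python
# Minimal sketch: log GPU power draw, temperature and fan speed while gaming,
# to see what a power limit / undervolt is actually doing in practice.
# Assumes an NVIDIA card with nvidia-smi available on the PATH.
import subprocess
import time

QUERY = "power.draw,temperature.gpu,fan.speed"

def read_gpu_stats():
    out = subprocess.check_output(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        text=True,
    )
    power, temp, fan = [field.strip() for field in out.strip().split(",")]
    return power, temp, fan

if __name__ == "__main__":
    # Ctrl+C to stop; run it alongside the game and watch the numbers.
    while True:
        power, temp, fan = read_gpu_stats()
        print(f"power={power}  temp={temp}  fan={fan}")
        time.sleep(2)
```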
So... new GPU is for quieter operation? With how much all these cards seem to coil-whine... seems almost irrelevant at this point.

I'm there with you on silent PC gaming. My issue is that the loudest part, by a country mile, is the coil whine coming off my 3090 FE...
Fans barely ever break 1100rpm, so it's very quiet. This is with a slight UV but the stock fan curve.
 
My 6900 XT coil whines quite a lot at high FPS; my 3070 before had none of that, but then the 6900 XT is a massively faster GPU.
Can't say it bothers me, I ramp the fans up on my 6900 XT above stock anyway.
 
Every mid-to-high-end GPU I have had to date has had some form of coil whine, depending on the resolution/screen refresh rate passing through etc. Even my 3080 Ti FE has it, but only at certain VRR ranges. This is something that cannot be avoided in the grand scheme of things; it is what it is. But the sort of coil whine I have had has not been annoying or overly distracting over anything else, and I think that's the main thing.

But yes, for me a new GPU would be for quieter operation with less heat. At the moment I can feel 60-80 degrees of source heat coming out of the top and back of my case below my desk. Whilst this might be perfect in winter, it is not a nice experience outside of December to March, since the rest of the year is full of mild to melting weather thanks to climate change.

I suppose what I could do, however, is replace the thermal pads and paste on the FE cooler with stuff from Thermal Grizzly. The stock TIM on Nvidia cards has never been good, and I always read about the massively reduced temps people see just from replacing it, so I will likely explore this option if nothing changes on 4080 pricing by Feb time. Actually, it's maybe worth exploring now. A cooler card at stock means the standard power limits can stay put, as it won't be throttling at 80, since it should not hit 80 with aftermarket TIM.

I'd be fine with that I think.
 
You still seem to be doing odd mental gymnastics to make getting a 4080 over a 3080 Ti anything but an upgrade. The only real reasons to upgrade a GPU are newer features (DLSS 3) and more performance. Any argument about lower heat or saving money through higher efficiency is just a sideshow.

Remember, replacing the TIM or pads might make your card run cooler, but at the expense of MORE heat coming out of your PC. The more efficient your cooling solution is at removing heat from the GPU, the more heat will make it into your living space.

New coolers are not magic in terms of thermals. If you get a GPU which uses 300+W of power but runs at a cool 55, that just means more of that 300W is being extracted away from the chip and out into your room. With the same card running at 80, less of the heat is being extracted away from the chip, either because the TIM/cooler on the card isn't up to the task or because the case isn't very good at venting the heat out.

The only way to really lower the room temp is to have a lower-powered computer in the first place. Or... move the computer out of the space you sit in.
If I owned our house instead of renting, I would drill a hole into the under-stairs cupboard and run my PC in there, with some fans in the door on a thermal switch of some kind so the heat gets dumped out into the kitchen. Then I'd have total silence in the room where the PC sits, and none of the heat.
 
I think you've misunderstood my context here. I already have good performance and am happy with it; any extra from a new card is, as far as I'm concerned, a bonus. Whatever card I get next is going to be mostly for the quieter operation. I won't be gaming in 4K, as there's no QD-OLED on the horizon that's ultrawide and larger than 34" @ 3440x1440, so for this res I'm sorted for a long while yet.

The 40 series does run cooler and quieter at these resolutions than the 30 series because they're so slack below 4K.

And like I said, I'd only consider the switch if one drops to £800, so I just have to add a couple hundred more after selling the 3080 Ti. Otherwise it doesn't make sense.
 

I don't think the 40 series will be running that slack even at 1440p UW. It will probably still be kicking out 270-300W, a marginal decrease over the 3080 Ti.

Unless you sell your 3080 Ti now, the gap will be more than a couple hundred.

Why would anybody buy your used 3080 Ti for, say, £600 at a time when a 4080 can be bought for £800?
30 series cards are only worth as much as they are currently BECAUSE the next-gen cards are so much MORE expensive.

By the time 4080s hit £800... 30 series cards will probably be worth £400 or less.
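Putting that as simple numbers (the £800 4080 price and the resale values are just the hypothetical figures floated in this exchange, not quotes):

```python
# Net cost of stepping from a 3080 Ti to a 4080 under the resale values
# floated above (all figures are the hypothetical numbers from this thread).

new_card_price = 800.0  # assumed 4080 price if/when it drops

for resale in (600.0, 450.0, 400.0):
    net_cost = new_card_price - resale
    print(f"Sell the 3080 Ti for £{resale:.0f} -> upgrade costs £{net_cost:.0f} net")
# "A couple hundred more" only holds at the optimistic end of that range.
```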
 
If two GPUs are running at a fixed 300W of power but one is running significantly hotter than the other, they will both still put out exactly the same amount of heat into the room. The amount of energy stored in a hot GPU heatsink is absolutely tiny, and it will eventually be released into the room when the GPU is turned off anyway.
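A back-of-the-envelope illustration of that point (the 300W draw and the two-hour session are arbitrary example figures, not measurements):

```python
# Back-of-the-envelope: a GPU drawing a fixed 300 W dumps the same total
# energy into the room regardless of how hot the die or heatsink runs.
# The 300 W and 2-hour session here are arbitrary example figures.

power_w = 300.0          # steady GPU board power
session_hours = 2.0

energy_kwh = power_w * session_hours / 1000.0        # 0.6 kWh
energy_mj = power_w * session_hours * 3600.0 / 1e6   # 2.16 MJ

print(f"Heat into the room: {energy_kwh:.2f} kWh ({energy_mj:.2f} MJ)")
# A better cooler changes the die temperature and the timing of the
# transfer, not this total.
```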
 

The 4080 is below 250 watts when the fps is locked, which is something I will be doing, since my refresh rate is 144Hz and I have the 3080 Ti locked at 120fps in all but the most unoptimised of games, such as Witcher 3 next-gen, which is locked to 60fps.



For all intents and purposes, if/when I do get a 4080 (or 4080 Ti, assuming the prices are all decent by then), it will be running faster than the 3080 Ti whilst drawing much less power, generating much less heat and producing much less noise. For reference, my 3080 Ti draws 315 watts locked at 60fps in Witcher, and that's with a 90% power limit set in Afterburner as well lol.

What is clear is that the 40 series is hugely power efficient vs any 30 series card worth talking about, by a massive margin. With my power limit still set at 90% and frame caps unchanged, a 4080 would be drawing well under 200 watts for my uses.

Also the 3080 Ti currently sells used for as low as £450. Lots of examples on all the marketplaces.
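As a quick sanity check on those numbers (315W is the 3080 Ti figure quoted above at a 60fps lock; the 250W and sub-200W 4080 figures are the claims in this thread, not measurements):

```python
# Comparing the claimed power draw at the same frame cap.
# 315 W is the 3080 Ti figure quoted above; the 4080 figures are the
# thread's claims, not measurements.

power_3080ti = 315.0
for power_4080 in (250.0, 200.0):
    saving = 1 - power_4080 / power_3080ti
    print(f"4080 at {power_4080:.0f} W -> {saving:.0%} less power/heat at the same cap")
```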
 
Yes of course; just the rate at which that transfer happens will be slower.
More heat will stay inside the case for longer, so the room might stay slightly cooler for longer. Ultimately, yes, all the GPU heat will eventually make it into the room.
Sure, a 3080 Ti still sells for ~£450 now, but once a 4080 costs £800... no chance. Especially as by then we will probably have the 4070 Ti and other lower-tier cards offering 3080 Ti performance for less money.

Upgrading GPU every generation is rarely "worth it" in terms of £ per FPS; however, if you have the money and want to do so, there's no need to justify it to anybody but yourself. Especially not with saving power or lowering the temp in your room; both of those can be achieved in better ways with less money spent.
 
It might be delayed for a few seconds, which won't be noticeable. You made it sound like a better cooler would significantly affect the amount of heat released into a room.

Now that I think about it, a more efficient cooling solution on a GPU is likely to have a much larger heatsink than a bad one. It would be possible for a larger, cooler-running heatsink to store more heat than a smaller, hotter-running one.
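Rough numbers on how much heat a heatsink can actually hold (the mass, material and temperature rise here are illustrative assumptions, not specs of any particular cooler):

```python
# How much heat a big GPU heatsink actually "stores" vs what the GPU emits.
# Mass, material and temperature rise are illustrative assumptions,
# not measurements of any specific cooler.

mass_kg = 1.5          # assumed heatsink mass
c_aluminium = 900.0    # J/(kg*K), specific heat of aluminium
delta_t = 40.0         # K above ambient, assumed

stored_j = mass_kg * c_aluminium * delta_t   # ~54 kJ
gpu_power_w = 300.0
seconds_equivalent = stored_j / gpu_power_w  # time for the GPU to emit that much heat

print(f"Stored in the heatsink: {stored_j / 1000:.0f} kJ")
print(f"Equivalent to ~{seconds_equivalent:.0f} s of a {gpu_power_w:.0f} W load")
```

Even a generously sized cooler only buffers a few minutes' worth of a 300W load, so the total heat reaching the room over a gaming session is essentially unchanged.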
 
3D / Rasterizing performance all the way.

The ray tracing hardware needs to be scaled up a lot, as the cost to performance is currently too high. It's not something that game developers can simply fix with just better optimization.
 
But it is: Doom Eternal is proof that when a developer is completely anal about optimisation, 150+ fps is possible at max settings with max ray tracing, with no upscaling needed.
 
Optimisation in practice often just means reducing the quality or amount of something though. Realistic / detailed ray tracing is computationally expensive.

Most of the games on consoles are pretty well optimised for their hardware, but games with detailed ray tracing tend to be limited to 30 FPS.
 
Ah but once again, Doom Eternal's RT upgrade didn't cut any corners :p


Even though I didn't like the gameplay, I marvelled at the way the engine ran with everything maxed.
 