RX 6700XT and 6800XT owners, a favour please.

Ran the Cyberpunk benchmark, 1080p ultra preset capped at 60fps, and the wattage peaked at 106W on a 6700XT, both at stock and undervolted to 1.1V.
Amazing. Could you by any chance run or drive through the city and see if it craps itself at ultra settings, 1080p, 60fps? Whenever I see a benchmark running at say 90-100fps, 1080p ultra, it's pulling 176W?

Majorly putting me off buying the card, as I want to keep the total draw at the wall to 250W with my monitor, amp, speakers and the rig I currently have. We pay ridiculous prices for electricity per kWh and I don't fancy spending £35-60 a month for the rest of the PC's ownership just to power it, when in a year the rig will be worth half anyway!
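For anyone wanting to check my sums, this is roughly how I'm estimating the running cost (the tariff and hours in the example are just placeholders, not my actual figures):

```python
# Rough monthly running-cost estimate from total draw at the wall.
# Tariff and hours below are placeholders - swap in your own numbers.

def monthly_cost_gbp(draw_watts: float, hours_per_day: float,
                     pence_per_kwh: float, days: int = 30) -> float:
    """Electricity cost in GBP for a given wall draw and daily usage."""
    kwh_per_day = (draw_watts / 1000) * hours_per_day
    return kwh_per_day * days * pence_per_kwh / 100

# e.g. a 250W rig, 5 hours a night, at 40p/kWh
print(f"~£{monthly_cost_gbp(250, 5, 40):.2f} per month")
```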
 
This review says 130W with V-Sync.

"V-Sync: If you don't need the highest framerate and want to conserve power, running at 60 FPS is a good option. In this test, we run Cyberpunk 2077 at 1920x1080, capped to 60 FPS. This test is also useful in testing a graphic card's ability to react to situations with only low power requirements. For graphics card that can't reach 60 FPS at 1080p, we report the power draw at the highest frame rate they can achieve."

But this video says 160-165W (no V-Sync).
 
This is the problem, and it doesn't disclose whether that's ultra settings, high or medium.

Then I see people running it at around 90fps using 193W at 1080p ultra, so I don't want to stitch myself up and buy something that eats electricity.

Does this matter long term though? Eventually you'll try to play something on it more demanding than Cyberpunk that'll pull maximum power draw even at 60fps.

Maybe best to adjust the power limit in the drivers to where it's acceptable for you (I don't know about newer cards, but Adrenalin will let me set a -50% power target on an RX 580), then tune the game settings up or down to target 60fps. Whether that turns out to be medium @ 60fps @ 130W or ultra @ 60fps @ 130W will be on a per-game basis.

If anything over 130W is unacceptable, I'd hazard a guess that most reasonably capable GPUs will consume that, so you need to be looking at power efficiency graphs (i.e. FPS per Watt) rather than raw power consumption graphs to know which card will do best within your 130W power budget.
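As a rough illustration of what I mean by fps per Watt (the figures below are made-up placeholders, read the real ones off a proper review graph):

```python
# Compare cards by efficiency (fps per Watt) rather than raw draw.
# The fps and Watt figures here are illustrative placeholders only.
cards = {
    "Card A": {"fps": 95, "watts": 176},
    "Card B": {"fps": 60, "watts": 59},
}

for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['watts']:.2f} fps per Watt")
```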

A quick look at the graph here suggests the 6700XT is reasonably efficient, but off the top of my head the 6700 non-XT should be more power efficient, since I think it uses a mobile GPU core. Also, looking at that graph, a 3060Ti would be about 10% better fps/Watt, and the 4060Ti destroys the 6700XT in terms of efficiency.

Obviously you don't want to go too far up the GPU stack though, because by setting a power limit you'll end up with a lot of excess performance sitting in reserve doing nothing.
 
Yeah, this is the worry/pending future issue. I only considered the 6700XT for the extra VRAM future-proofing.
I did consider the 4060Ti, as I could still use DLSS/frame gen in the future if it runs out of VRAM.
Would you go for the 4060Ti in this case then?
Although I've seen some games push that to 160W?
I did wonder if I could get away with either card by just undervolting it?
 
Well, obviously 8GB of VRAM isn't ideal at a higher price point. You may get away with 8GB at 1080p to a larger extent than you would at 1440p. Developers will likely be forced to cater to 8GB cards since they're a large part of the market, but it won't last forever. I think the 4060Ti 16GB is rumoured to be an idiotic price as well.

You need to factor in the price of the GPU and whether it's worth going for something more power efficient or something with a lower initial outlay.

The cheapest 4060Ti on OcUK is £389; the cheapest 6700XT is £329, unless you want to fart about with rebates, in which case it's £295, but that's another story.

At minimum it's costing £60 more to get a 4060Ti. The TPU power consumption page says the 4060Ti takes 59W to do the Cyberpunk 60Hz test, and the 6700XT needs 136W to hit the exact same 60fps, so that's a 77W difference.

Power in the UK is about to be around 29.7p/kWh. So for every hour you game at the same 60fps on the 6700XT, it's costing you 2.29p more than it would on the 4060Ti. So you need about 2,620 hours of gaming before the 4060Ti becomes the more cost-effective buy from a power efficiency / purchase price standpoint.

Personally I might get 10 hours a week of gaming time, if that, so it'd take about 5 years to recover the extra £60 outlay. If power were dearer, or I gamed more, it'd pay off sooner.
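Here's the back-of-the-envelope version of that sum, in case anyone wants to plug in their own tariff and prices (it assumes TPU's 60fps power figures and the OcUK prices above are right):

```python
# Payback calculation: 4060Ti vs 6700XT at a capped 60fps in Cyberpunk.
price_gap_gbp = 389 - 329    # cheapest 4060Ti minus cheapest 6700XT
watt_gap      = 136 - 59     # extra draw of the 6700XT at the same 60fps
pence_per_kwh = 29.7         # UK tariff used above

extra_pence_per_hour = watt_gap / 1000 * pence_per_kwh        # ~2.29p
breakeven_hours = price_gap_gbp * 100 / extra_pence_per_hour  # ~2,620 hours

print(f"{extra_pence_per_hour:.2f}p extra per hour; "
      f"break-even after {breakeven_hours:,.0f} hours "
      f"({breakeven_hours / (10 * 52):.1f} years at 10 hours a week)")
```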

Anyway, the point is you also need to factor purchase price into your equation. I'm not sure where Brownsville is... Texas? If so, Google says you're paying 15c/kWh? What did you base your £35-60 a month estimate on for running your PC? Remember also that if it pulls 250W under load, it's much, much less at idle or in normal workloads, so you should base your GPU power consumption costs on your average gaming hours.
 

Thanks for the reply mate, haha. Brownsville is where M.O.P. are from; it's an inside joke amongst old mates! We pay something disgusting like 46p per kWh, and currently that's about 9-11p an hour under load for my rig with a temporary 140W TDP card plus the rest of the PC/monitor... I game around 4-6 hours every evening... so yeah, it quickly adds up every year.
 

In which case the extra £60 spent on a 4060Ti would pay itself off in about a year for you, with that cost of electricity and 6 hours a day of gaming. Every year you own the 6700XT would cost you another £58, unless the electricity price became more sane.
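(Rough working: the 77W gap at 46p/kWh is about 3.5p per extra hour, so the £60 premium comes back after roughly 1,700 hours, which at 4-6 hours a night is around a year; a year's gaming at that rate on the 6700XT costs somewhere in the £50-80 region extra.)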

Hmm, that's a much more difficult decision because of the RAM. Not really sure what to suggest. I know for sure I wouldn't pay £500 for a 16GB 4060Ti; it's an idiotic price.

I owned a GTX 460 768MB instead of the 1GB version of that card and constantly regretted it, because high textures were always bumping into the limit and causing stuttering; developers were targeting 1GB of VRAM.

On the flip side, I had a 290X 4GB until a year and a half ago, and for some reason it wasn't nearly as much of an issue.

Edit: sorry a bit of faulty math.
 

I think if you spend 42 hours a week gaming (a quarter of the week) for 52 weeks straight, the best thing is to stop gaming so much if you care about a few extra quid in electricity costs a month! :p

Yet when Nvidia cards were consuming more power last generation, nobody cared, despite electricity prices starting to go up in late 2021, i.e. 18 months ago. The RX6600 and RX6800 were the most efficient cards of the last generation, and AMD cards tend to undervolt well. Plus things such as Radeon Chill have existed since 2016, IIRC.

The whole spiel about the RTX4060TI's "efficiency" is because Nvidia is trying to upsell the RTX4050 as the RTX4060TI. It's barely quicker than an RTX3060TI in many instances, and can't even match an RTX3070 a lot of the time. The RTX3060TI was matching 80-series dGPUs from the previous generation.

The RX6700XT has been as low as £285 after cashback, and the RX6800 16GB has been down to almost £400. Then you have the RTX3060 12GB for £250 and the RTX3060TI for £300.

I wouldn't touch the RTX4060TI with a bargepole at nearly £400. I am sure it will consume less power, as the 8GB of VRAM will mean you have to reduce settings. Anybody who wants a mainstream Nvidia dGPU should get an RTX3060 12GB, or even an RTX3060TI 8GB for under £300, and learn to set custom curves, etc.


That is how poor the RTX4060TI is - the RTX3060 is much slower overall, yet there are instances where the RTX3060 is faster than the RTX4060TI, or has better 1% lows! The 8GB of VRAM and the 128-bit memory bus don't help the core at all.
 
How are you guys reading max power draw?

In Cyberpunk my 6800XT is pulling about 266W max, according to HWMonitor, in an indoor gunfight.

That's 1080p
HDR on
165Hz

Mixture of high and medium settings
 
This thread's been quite an education for me... almost double the GPU power requirement between 1080p at 60fps/60Hz and uncapped fps at 165Hz at 1440p.

But to put it in perspective, 100W light bulbs used to be common! So that's three of them, and that's just the GPU; for the CPU and the rest of the system you can probably add another 40% on top...
 
10 years back, 1kWh of electricity here was 6.5p; now it's 38p, so there was a reason to cut those damn bulbs out :D
 