
RDNA 3 rumours Q3/4 2022

Yeah, Lisa Su said they were on target to achieve their 50% improvement in performance per watt.

IIRC the stated improvement was "up to" over 50% in performance per watt. If so, then all that would be required is one example. A single 7xxx card at stock giving 50%+ better performance per watt than a single heavily overclocked and overvolted 6xxx card would do. It wouldn't even need to be a case of the same level of performance at a much lower power draw: 40 fps at 120 W is 50% higher performance per watt than 80 fps at 360 W.

I think there will be an overall improvement in performance per watt, including for comparable performance, but I'd be surprised if it was >50% improvement across the board.
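
To spell that example out (a quick Python sketch; the fps and wattage figures are just the hypothetical ones above):

    # Performance per watt for the hypothetical figures above.
    def perf_per_watt(fps, watts):
        return fps / watts

    old = perf_per_watt(80, 360)   # heavily OC'd/overvolted 6xxx: ~0.22 fps/W
    new = perf_per_watt(40, 120)   # stock 7xxx: ~0.33 fps/W

    print(f"improvement: {new / old - 1:.0%}")   # -> 50%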
 
Well said. All these companies use carefully chosen conditions to come up with these figures.
 
Just went back and checked: she said over 50%.

Regarding your example, it doesn't work, because voltage/frequency curves are not linear: the efficiency of the 360 W card would increase as its voltage is reduced. It is widely accepted that you either test at the same wattage or you show the card achieving the same result with half the power.

Yes, AMD could pull some shenanigans, but they would be slated by everyone if they tried that.
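
A toy model of why the curve matters, if anyone wants to play with it. Big assumption baked in: dynamic power goes roughly as f × V², with voltage rising roughly linearly with clock, so power grows about cubically with clock. Real cards aren't that clean, but the shape is right:

    # Toy model: performance scales with clock, but power scales roughly
    # cubically (P ~ f * V^2, with V rising roughly linearly with f).
    def rel_power(clock):          # clock relative to stock = 1.0
        return clock ** 3

    def rel_perf(clock):
        return clock

    for clock in (1.2, 1.0, 0.8):
        eff = rel_perf(clock) / rel_power(clock)   # relative perf per watt
        print(f"clock {clock:.1f}x -> power {rel_power(clock):.2f}x, "
              f"perf/W {eff:.2f}x")
    # clock 1.2x -> power 1.73x, perf/W 0.69x
    # clock 1.0x -> power 1.00x, perf/W 1.00x
    # clock 0.8x -> power 0.51x, perf/W 1.56x

Downclocking the hot card raises its efficiency for free, which is exactly why comparing a stock card against an overvolted one flatters the new generation.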
 
I wouldn't get too excited just yet. Don't forget that RDNA2 was a 65% performance-per-watt improvement over RDNA1 according to AMD (https://www.amd.com/en/technologies/rdna-2), yet comparing the 150 W 5600 XT to the 160 W 6600 XT the performance gain was only around 20%, and let's not forget the price increased by 36%.
 
It also doesn't work if you cut the number of CUs and up the clock speed to compensate.

The 6800 XT and 6900 XT have the same power consumption at stock.

The Ryzen 5900X and 5950X also have the same power consumption at stock.
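
Rough numbers for that CU trade-off, using the same toy model as above (perf ∝ CUs × clock, power ∝ CUs × clock³; crude, but it shows why the cut-down config loses perf/W at the same wattage):

    # Two configs at the same board power: lots of CUs at modest clocks
    # vs fewer CUs clocked up to compensate (same toy model as above).
    def perf(cus, clock):
        return cus * clock

    def power(cus, clock):
        return cus * clock ** 3

    budget = power(80, 1.0)                # power budget of the big config
    clock_b = (budget / 60) ** (1 / 3)     # clock needed by 60 CUs at that power
    print(f"80 CUs @ 1.00x clock: perf {perf(80, 1.0):.0f}")
    print(f"60 CUs @ {clock_b:.2f}x clock: perf {perf(60, clock_b):.0f}")
    # same watts, but ~17% less performance from the cut-down config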
 
Cutting the CUs and upping the clock speed is a great way to increase profits though, especially coupled with a price increase ;)
 
It is a fair thing to do. People don't actually care about power consumption (as much as they make out) as long as they can cool it.

It's a waste of silicon not to push it into the less efficient range.
 
Uhm, are you well off?
I think most of the UK is certainly worried, my dude, especially those who game for long hours. I don't, but some do.
 
50-100 W extra for a few hours a day is still nothing when you are dropping £500 on a graphics card.

Even if you somehow gamed 6 hours a day for 200 days a year, at +100 W that is only +120 units of electricity.
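
The sums, for anyone checking (1 unit = 1 kWh; the 34p/unit price is just my assumption around the 2022 UK cap, adjust to taste):

    # Extra electricity from +100 W of GPU draw (1 unit = 1 kWh).
    extra_watts = 100
    hours_per_day = 6
    days_per_year = 200

    kwh = extra_watts * hours_per_day * days_per_year / 1000   # 120 units
    price_per_kwh = 0.34   # assumed UK price in GBP circa 2022, adjust to taste
    print(f"{kwh:.0f} units/year, ~£{kwh * price_per_kwh:.2f}/year")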
 
I wouldn't get too excited just yet. Don't forget that RDNA2 was a 65% performance-per-watt improvement over RDNA1 according to AMD (https://www.amd.com/en/technologies/rdna-2), yet comparing the 150 W 5600 XT to the 160 W 6600 XT the performance gain was only around 20%, and let's not forget the price increased by 36%.

5700 XT power consumption 219 W: performance 100%
6900 XT power consumption 299 W: performance 201%

Source: TPU.

299 / 219 = 1.37, and the performance ratio is 2.01, which works out to roughly 47% better performance per watt (2.01 / 1.37; simply subtracting the two percentages to get 65 overstates it). Still:

101% more performance for 36% more power.
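
Worked through in two lines, with the TPU figures quoted above:

    # Perf-per-watt change between the two cards quoted above (TPU figures).
    perf_ratio = 2.01            # 6900 XT relative performance vs 5700 XT
    power_ratio = 299 / 219      # ~1.37x the power draw

    ppw_gain = perf_ratio / power_ratio - 1
    print(f"perf/W improvement: {ppw_gain:.0%}")   # -> 47%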
 
250 W max with a max OC on a 3070 vs 450+ W on a 4070 is massive though.

That isn't really performance-equivalent, is it, if this discussion is about performance per watt.

A max OC on anything isn't what AMD (or Nvidia) are selling or advertising.

The RTX 4070 is also rumoured to be 285 W. Not that outlandish a number.

 
It is a fair thing to do. People don't actually care about power consumption (as much as they make out) as long as they can cool it.

It's a waste of silicon not to push it into the less efficient range.

It's the damn HEAT. Watts = heat. My sim rig with a 400+ W 3080 Ti requires a window-mount AC unit just for that room. Sure, I could turn down my central AC to compensate for all the heat dumped into that one room, but then my wife would get frozen out of every other room in the house.

Watts = heat.
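
To put a number on the heat (standard conversion, 1 W ≈ 3.412 BTU/h; the 5000 BTU/h comparison is just a typical small window unit):

    # Every watt a GPU draws ends up as heat in the room.
    gpu_watts = 400
    btu_per_hour = gpu_watts * 3.412   # 1 W = 3.412 BTU/h
    print(f"{gpu_watts} W = ~{btu_per_hour:.0f} BTU/h")   # ~1365 BTU/h
    # The smallest window AC units are ~5000 BTU/h, so a 400 W card
    # alone eats a decent chunk of one's capacity.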
 
Yet you bought a 3080 Ti at 350 W stock. So did you care?

You could limit its power if you wanted (it's what miners do to get optimal performance per watt, as do people with SFF systems), but I suspect you won't do that either.
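
For anyone who does want to try it, something like this is the usual route: a sketch calling nvidia-smi's power-limit switch from Python. The 250 W target is an arbitrary example and the command needs admin rights:

    # Cap the GPU board power via nvidia-smi (requires admin/root; the
    # 250 W target below is an arbitrary example for a 3080 Ti).
    import subprocess

    target_watts = 250
    subprocess.run(
        ["nvidia-smi", "-i", "0", "-pl", str(target_watts)],
        check=True,
    )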
 
I recall the 480 with an OC pulling 350 watts... FORGET PLAYING IN SUMMER.
 