"Yeah, Lisa said they were on target to achieve their 50% improvement in performance per Watt"

At what power draw is the key point though. A better process should help here, but was there any info on that?
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
IIRC the stated improvement was "up to" over 50% in performance per watt. If so, then all that would be required would be one example: a single 7xxx card at stock giving 50%+ better performance per watt than a single heavily overclocked and overvolted 6xxx card would do. It wouldn't even need to be the same level of performance at a much lower power draw: 40 fps at 120 W is 50% higher performance per watt than 80 fps at 360 W.
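The arithmetic in that last example checks out, and is easy to sketch (a minimal sketch; the fps and wattage figures are just the hypothetical ones from the post above):

```python
def perf_per_watt(fps: float, watts: float) -> float:
    """Performance per watt: frames per second divided by power draw."""
    return fps / watts

# Hypothetical numbers from the example above.
slow_card = perf_per_watt(40, 120)   # 0.333... fps/W
fast_card = perf_per_watt(80, 360)   # 0.222... fps/W

# Relative improvement of the 40 fps / 120 W card over the 80 fps / 360 W card.
gain = slow_card / fast_card - 1
print(f"{gain:.0%}")  # → 50%
```

So the lower-performing card really does come out 50% ahead on efficiency despite delivering half the frame rate.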
I think there will be an overall improvement in performance per watt, including for comparable performance, but I'd be surprised if it was >50% improvement across the board.
Just went back and checked, she said over 50%.
Er, OK then. To be honest I wouldn't give a damn whether it was 4 MHz or 4 THz. If it don't perform, it don't mean squat!
I wouldn't get too excited just yet. Don't forget AMD claimed a 65% performance-per-watt improvement for RDNA2 over RDNA1 (https://www.amd.com/en/technologies/rdna-2), yet comparing the 150 W 5600 XT to the 160 W 6600 XT the performance gain was only around 20%, and let's not forget the price increased by 36%.
Regarding your example, it doesn't work because voltage/frequency curves are not linear, so the efficiency of the 360 W card would improve as the voltage is reduced. It is widely accepted that you either test at the same wattage, or you show the card achieving the same result at half the power.
Yes AMD could pull some shenanigans but they would be slated by everyone if they tried that.
It also doesn't work if you cut the number of CUs and up the clock speed to compensate.
The 6800 XT and 6900 XT have the same power consumption at stock.
The Ryzen 5900X and 5950X also have the same power consumption at stock.
Cutting the CUs and upping clock speed is a great way to increase profits though, especially coupled with a price increase.
It is a fair thing to do. People don't actually care about power consumption (as much as they make out) as long as they can cool it.
Waste of silicon not to be pushing it into the less efficient range.
Uhm, are you well off?
I think most of the UK are certainly worried, my dude, especially those who game for long hours. I don't, but some do.
50-100W extra for a few hours a day is still nothing when you are dropping £500 on a graphics card.

250 W max with a max OC on a 3070 vs 450 W+ on a 4070 is massive though.
It's the damn HEAT. Watts = heat. My sim rig with a 400+ W 3080 Ti requires a window-mount AC unit just for that room. Sure, I could turn down my central AC to compensate for all the heat dumped into that one room, but then my wife would get frozen out of every other room in the house.
Watts = heat.
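The heat-load point is easy to quantify: essentially every watt a GPU draws ends up as heat in the room, and 1 W of continuous dissipation is about 3.412 BTU/h (a quick sketch; the 400 W figure is just the 3080 Ti draw mentioned above):

```python
BTU_PER_HOUR_PER_WATT = 3.412  # 1 watt of continuous dissipation

def heat_load_btu_h(watts: float) -> float:
    """Heat a component dumps into the room, in BTU per hour."""
    return watts * BTU_PER_HOUR_PER_WATT

# A ~400 W GPU is effectively a small space heater:
print(round(heat_load_btu_h(400)))  # → 1365
```

For context, small window AC units are typically rated around 5,000 BTU/h, so a single high-end GPU under sustained load can eat a meaningful fraction of one unit's capacity.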
I recall the 480 with an OC, 350 watts... FORGET PLAYING IN SUMMER.