
Battle of the GPUs: Is power efficiency the new must-have?

Even just a year ago, having a hot-running graphics card such as AMD's R9 290X was par for the course. Admittedly, there have been hotter and cooler examples of 'the must-have' GPU over the years, but in general, if it's good value and performs well, I'm usually sold.

This is especially true for me, as I usually rip the stock cooler off a new graphics card straight away and fit a waterblock, so heat has never really bothered me. The exceptions were excessively inefficient models such as Nvidia's GTX 480, which weren't that fast and could heat your average Olympic swimming pool. Equally, AMD's dual-GPU offerings have often generated too much heat and been overkill for my needs.

However, something changed with Nvidia's GTX 750 Ti, the first of its Maxwell GPUs for the desktop. Here was a graphics card so power efficient that it actually became popular with Bitcoin miners, even though until that point Nvidia had been lagging behind AMD, which had sold shedloads of cards to digital currency miners.

Likewise, for gamers and anyone who appreciates an efficient bit of hardware, the GTX 980 and GTX 970 proved to be equally good when it comes to your electricity bill. The GTX 980, for example, drew 299W under load (that's the combined system load), while the GTX 780 came in at 373W and AMD's R9 290X at 409W. I could hardly believe my eyes when I saw Matt's review.

Nvidia's GTX 980 drew over 100W less than the R9 290X in our tests - AMD has to deal with this deficit with its new GPUs.

That's a huge improvement, and with my little office getting unbearably hot in the summer if I even think about gaming, the ability to consume 100W less and have a more powerful graphics card at the same time is a godsend. In fact, part of my reason for not owning a 4K screen at the moment, as I discussed in my recent blog 'AMD and Nvidia need to step up to the 4K challenge', is that even if I could get my hands on the GPU horsepower to deal with all those pixels, the heat generated wouldn't be tolerable in the summer.
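As a rough sense-check on what that gap means for room heat, here's a back-of-envelope sketch in Python - the wattage gap comes from the system-load figures above, while the three-hour session length is purely an assumption for illustration:

```python
# Rough sketch: extra heat dumped into the room by the higher-draw system.
# The wattage gap is taken from the system-load figures quoted above
# (409W for the R9 290X system vs 299W for the GTX 980 system); the
# session length is an assumption, not anything measured.
delta_watts = 409 - 299        # W
session_hours = 3              # assumed evening gaming session
extra_kwh = delta_watts * session_hours / 1000
print(f"Extra heat into the room: {extra_kwh:.2f} kWh per session")
# ~0.33 kWh per session - roughly a 1kW fan heater running for 20 minutes.
```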

So, things are actually looking good in terms of GPU power efficiency, but I inevitably began to compare the GPU market with the CPU one, specifically the fact that AMD's CPUs are so much hotter-running and less efficient than Intel's. Once you overclock them the difference is catastrophically huge, with AMD's FX CPUs drawing an enormous amount of power. The gap there is far larger than anything in the battle between Nvidia and AMD, though. Here, the two are out of sync in their GPU launches, so while Nvidia already has its high-end next-gen graphics cards out in the wild, all eyes are on AMD to come up with a competitive product, which we expect to land relatively soon.

The R9 290X is a toasty customer, so much so that water-cooling it actually eliminates thermal throttling and boosts performance over reference cooler-equipped models, even at default frequencies.

In fact, you only have to look in our forum to see hardware-spec-filled signatures sporting, at a glance, equal numbers of Radeon and GeForce cards - both have offered up excellent products in the last 24 months, and it's only Nvidia's recent 900-series launch that has kicked off a new cycle of GPU launches, one that began with the GTX 750 Ti and will likely end with AMD's mid-range offerings next year, or possibly with the eagerly awaited GTX 960.

While AMD might seem to be on the back foot at the moment, a lot of that can be put down to the fact it's out of sync and Nvidia was first to market with a new product. I'm genuinely excited to see what it comes up with, as my aging GTX 660 Ti has seen better days. However, I do think that to really win the best-GPU crown, AMD has to rein in its power consumption, even if the supposed R9 390X is a lot faster than the GTX 980. Absolute performance is all very well, but bucking the trend of better power efficiency would be unwise. Hopefully we won't have to wait too long to find out if AMD has managed to do it.
http://www.bit-tech.net/blog/2014/11/17/why-i-love-nvidia-s-9-series-should-amd-be/

An interesting read there :)
 
The problem is I see the GTX960 barely being better than a GTX770 at 1920x1080, but with better power consumption:

http://tpucdn.com/reviews/ASUS/GTX_970_STRIX_OC/images/perfrel_1920.gif

That makes it about 50% better than my GTX660, two and a half years after it launched. At this rate it will take 4 to 5 years to double its performance.
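As a rough sanity check on that, here's a quick Python sketch - it assumes the improvement compounds at a steady rate, which is an assumption rather than anything measured:

```python
# Quick sketch: if a card tier gets ~50% faster every 2.5 years and that
# rate of improvement holds steady, how long until performance doubles?
import math

gain = 1.5           # ~50% faster than the GTX660, per the chart linked above
period_years = 2.5   # time taken for that gain
annual_rate = gain ** (1 / period_years)
years_to_double = math.log(2) / math.log(annual_rate)
print(f"~{years_to_double:.1f} years to double")   # ~4.3 years
```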

This is where the performance stagnation is worst - the under-£200 market, especially when products are gimped to make sure they cannot really compete with higher-end ones. My GTX660 has gimped power limits, meaning only a third-party BIOS will show any gains from overclocking! :(
 
You don't spend £300+ on a GPU just to save pennies a day on the electricity bill, the whole efficiency deal is hype. Enthusiasts don't really care about it, it's just an aspect of the cards you can brag about, 'why my cards are faster than others and they save on my electricity bill too, how amazing is that?'
 
Power draw not so much an issue for me but if that power draw corresponds to alarmingly high levels of heat being expelled into the room then it can become an issue.

From a cost POV though: no.
 
You don't spend £300+ on a GPU just to save pennies a day on the electricity bill, the whole efficiency deal is hype. Enthusiasts don't really care about it, it's just an aspect of the cards you can brag about, 'why my cards are faster than others and they save on my electricity bill too, how amazing is that?'

Pretty much.

Before the 980/970 release, I had never seen anyone on this forum take power consumption and efficiency into consideration (certainly not as a main factor, anyway) unless they had really bad PSUs and were on a strict budget...

Someone on here worked out the difference between the 970 and 290 to be £50 spread across 2 years with 4 hours of gaming every day.
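For what it's worth, here's a rough back-of-envelope version of that sum in Python - the ~110W gap is taken from the system-load figures earlier in the thread, and the 14p/kWh tariff is an assumed typical UK rate at the time, not a figure anyone quoted:

```python
# Back-of-envelope check on the "£50 over two years" figure mentioned above.
delta_watts = 110        # rough 970 vs 290 system-load gap (assumed from earlier figures)
hours_per_day = 4        # gaming hours per day, as in the quoted estimate
days = 2 * 365           # two years
price_per_kwh = 0.14     # GBP per kWh - assumed typical UK tariff around 2014
extra_kwh = delta_watts * hours_per_day * days / 1000
extra_cost = extra_kwh * price_per_kwh
print(f"{extra_kwh:.0f} kWh extra, about £{extra_cost:.0f} over two years")
# ~321 kWh, ~£45 - in the same ballpark as the £50 quoted.
```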

Of course, the lower heat and noise that come with better power "efficiency" are nice bonuses but, again, not as big a factor as bang per buck imo.

Now, when there are "worthwhile" differences, i.e. at least a 25+ degree, 30+ decibel and 150W difference between very similarly performing cards, then it will be a big deal :p
 
Change all the lights in your house to LEDs and you'll notice a considerable drop in your electricity bill - it would probably pay for a WC loop within a year, and the WC loop's power draw negates the difference.

Switching to Nvidia won't really show up on the bill for the few hours a week you game compared to an AMD setup. And for heat, a decent full-size case gives you the cooling you need, but if you insist on mATX then Nvidia is a sensible choice, and there power consumption is important, along with noise if you don't use headphones.

Everyone wants to push high resolutions and leading-edge graphics but expects no increase in the power needed to run them?

My Mrs will still leave the lights on, she'll still 'drive' downhill in her 3.2-litre wheels, and she'll still leave the heating on then let all the heat out by leaving the door open when drying the dog.

Time spent gaming at full GPU power draw × the difference in watts between AMD and Nvidia = pence per month.
 
You don't spend £300+ on a GPU just to save pennies a day on the electricity bill, the whole efficiency deal is hype. Enthusiasts don't really care about it, it's just an aspect of the cards you can brag about, 'why my cards are faster than others and they save on my electricity bill too, how amazing is that?'

But it does relate to heat. I care about heat/noise a lot, and it's the cool-running nature of the 980s that finally made me take the SLI plunge. So it mattered, albeit indirectly, to me.
 
This man gets it :cool:

I'm onboard with the efficiency lark. It's not about saving the electric bill, it's about reducing heat and noise too.

Having an efficient GPU means there's more headroom for overclocking and for full-bore cards like the Titans.

Efficiency doesn't mean sacrificing overall power for low TDP - it's about a balance of both.
 
It's OTOH CAT... on the other hand. :)

I seem to recall when ATI were efficiency kings it didn't matter... which is it? :?

Where it really matters is mobile and Nvidia fans have been rubbishing the idea in the A8X vs K1 threads.
 
Depends... I love having silly overclocked 970s and barely breaking 550W from the wall in the worst case.

I couldn't run two similarly silly-clocked 290/290Xs, which would offer the same performance, on my PSU - I'd have to buy a new power supply.

There's a fine line to which power efficiency is acceptable and when it's not.
 
I had put two 290s and a 1,000W power supply in my basket on OcUK and was waiting until my mate gave me the money he owed me to take the plunge and press buy.

Then the 970s launched, and after reading reviews about their power consumption and, more importantly for me, the fact that they had DSR, I changed my basket and bought two 970s instead, saving around £150 because I didn't have to buy a new PSU.

In the long term it's going to be power efficiency that wins the battles. Nvidia have made a crap-load of money with the 970/980 and it's still on 28nm. Just imagine how powerful they could make their cards if they pushed the 250W-per-card envelope again.
 
Personally it's the performance that counts, or I'd just play Angry Birds on my phone. However, I appreciate Nvidia compared to AMD, as you get good performance without the card consuming as much power as an electric oven.
 
Something else to consider is that some of the benchies humbug and others have put up show the 980 isn't really a great deal more efficient than past GPUs. In certain tests/benchies, power spikes go as high as what the Titans use. So the low TDP might be a tad exaggerated.

A die shrink (really overdue) will bring a tangible power reduction, along with things like HBM and Nvidia's equivalent, to lower overall TDP. Although for the top cards I would rather that headroom was used for more shaders to give 'more performance' at the same power use, rather than 'same performance' at lower power use.
 