Battle of the GPUs: Is power efficiency the new must-have?

It's OTOH CAT... on the other hand. :)

I seem to recall when ATI were efficiency kings it didn't matter... which is it? :?

Where it really matters is mobile and Nvidia fans have been rubbishing the idea in the A8X vs K1 threads.

You know what it's like, each side trying to make the most of what they have over the other.

Like when Nvidia had better frametimes than AMD and AMD owners would tell you that they couldn't notice the difference in smoothness (even before the framepacing patch). Now AMD have better frametimes it's suddenly super important and a game changer.
 
Something else to consider is that some of the benchies humbug and others have put up show that the 980 isn't really a great deal more efficient than past GPUs. In certain tests / benchies, power spikes go as high as a Titan's draw. So the low TDP might be a tad exaggerated.

Average power consumption through any benchmark or game scene is lower on the 970/980, regardless of these spikes.
 
Power consumption is important even if running costs absolutely are not. Less power = less heat. Less heat = cooler/simpler/cheaper/quieter cooling solutions.

Nexus18 said:
Pretty much.

Before the 980/970 release, I had never seen anyone on this forum take power consumption & efficiency into consideration (certainly not as a main factor anyway) unless they had really bad PSUs and were on a strict budget....

Someone on here worked out the difference between the 970 and 290 to be £50 spread across 2 years with 4 hours of gaming every day.
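That £50-over-two-years figure is easy to sanity-check. A rough sketch in Python; note the ~100W draw difference and the 17p/kWh tariff are my assumptions, not numbers from the thread:

```python
# Rough electricity-cost estimate for the 970 vs 290 power-draw gap.
# Assumed figures: ~100 W difference while gaming and a UK tariff of
# ~17p/kWh; neither number comes from the original post.
watt_difference = 100          # extra watts drawn by the hungrier card
hours_per_day = 4              # gaming hours per day
days = 2 * 365                 # two years

price_per_kwh = 0.17           # GBP per kilowatt-hour (assumed tariff)

kwh = watt_difference * hours_per_day * days / 1000   # energy in kWh
cost = kwh * price_per_kwh                            # cost in GBP

print(f"{kwh:.0f} kWh -> GBP {cost:.2f} over two years")
```

With those assumptions it comes out at 292 kWh, roughly £50, so the figure quoted above is plausible.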

Of course, the less heat and less noise due to better power "efficiency" are nice bonuses but again, not as big a factor as bang per buck imo.

Now, when there are "worthwhile" differences, i.e. at least a 25+ degree, 30+ decibel and 150W difference between very similarly performing cards, then it will be a big deal.

Look at the people complaining about 290s running hot and throttling. Imagine what dropping their power consumption by 100 watts per card would do....

People are still affected by consumption even if they aren't mentioning it. Rather than saying 'well, it's never been a problem before', perhaps the answer is that people should be more aware of what their cards are doing and why. They probably would mention it a lot more then :)

Also, of course, lower consumption means more scope for bigger, faster gpus. That's something you lot should all be wanting!
 
In terms of running costs then no, you'll save a few quid over a year or two but nothing you'll notice.

However it is nice having cards that can achieve similar performance to the "high end" cards whilst using less power. Simply chucking more power at cards to gain more performance isn't the future.
 
I like my 980 SLI much more than my 7970/7990 trifire partly because it's many times quieter (I used to cap FPS on the trifire setup to keep the GPU load and therefore noise down) and it used to heat the room up. My gf would complain about the noise if she was watching TV.

At the same time, it is only a bonus to me. If there were faster and cheaper GPUs that were hotter, I'd buy them, unless the difference was very small.
 
Simply chucking more power at cards to gain more performance isn't the future.

Yes it is.

 
Power efficiency is important, but you don't want to upgrade to a slightly better card just because it uses less power. The performance jump needs to be there along with the lower power consumption. The 900 cards are a step in the right direction, but now they need the performance increase.
 
You know what it's like, each side trying to make the most of what they have over the other.

Like when Nvidia had better frametimes than AMD and AMD owners would tell you that they couldn't notice the difference in smoothness (even before the framepacing patch). Now AMD have better frametimes it's suddenly super important and a game changer.

Exactly this! Guaranteed, if AMD's next GPUs are better for power efficiency/consumption then the vocal AMD fanboys will be shouting from the rooftops about it and the vocal nvidia fanboys will be downplaying it. In fact.... I am going to save this post for when/if that does happen :p

Look at the people complaining about 290s running hot and throttling. imagine what dropping their power consumption by 100 watts per card would do....

People are still affected by consumption even if they aren't mentioning it. Rather than saying 'well, it's never been a problem before', perhaps the answer is that people should be more aware of what their cards are doing and why. They probably would mention it a lot more then :)

Also, of course, lower consumption means more scope for bigger, faster gpus. That's something you lot should all be wanting!

Agree with that!

My point was mainly referring to the competing GPUs, i.e. a single 970 vs a single 290; there really isn't "enough" of a difference to be worth worrying about imo (at least I don't think it is worth the extra £80+ [purely for power efficiency]). The only time power efficiency will "really" come into play is when getting 2+ GPUs, and this is where 2x 970s are FAR better than 2x 290s.

Until AMD release a card which is power efficient you will only have one side agreeing to this.

IIRC, hasn't power efficiency & consumption been very similar between the two GPU brands, but slightly in favour of AMD, ever since the 4870??

Also, remember the nvidia 470/480 compared to AMD's GPUs at the time? IIRC there was much more of a difference than there is between the 290 and 970, yet power efficiency + consumption didn't seem to matter back then at all.....
 
All about the performance per watt for me, not overly bothered about saving on leccy, but when you get 780/290 matching performance from a 150w card, it makes me want to see what kind of performance you'd smash from a 250w+ card of the same architecture.
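That "what could a 250w+ card do" thought is just a scaling calculation. A minimal sketch, assuming performance scales linearly with board power within one architecture (a big if; real GPU scaling is sub-linear):

```python
# Naive linear extrapolation of performance from power budget.
# Assumes perf scales 1:1 with board power within one architecture,
# which real silicon doesn't quite achieve (clocks/voltage aren't free).
baseline_watts = 150     # the 150w Maxwell card mentioned above
baseline_perf = 1.0      # normalised to 780/290-level performance
target_watts = 250       # a hypothetical big-Maxwell power budget

scaled_perf = baseline_perf * target_watts / baseline_watts
print(f"Idealised scaling: {scaled_perf:.2f}x the 780/290 level")
```

Even allowing for sub-linear scaling, that back-of-envelope ~1.67x is why people are excited to see a full-fat Maxwell.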
 
I'm not too bothered about power consumption; the trouble is that with increased power consumption has come more heat and noise, which I hate. Graphics cards have already gone way too far in terms of TDP, which is why AMD/NVidia long since started to implement artificial load caps to stop them going nuclear or melting their inadequate VRMs under full load (i.e. Furmark etc). It's good to see at least one vendor looking at getting back to a sensible TDP as opposed to fitting increasingly exotic cooling measures.
 
All about the performance per watt for me, not overly bothered about saving on leccy, but when you get 780/290 matching performance from a 150w card, it makes me want to see what kind of performance you'd smash from a 250w+ card of the same architecture.

Depends on the situation. Maxwell has the ability to adjust how much power it pulls depending on how much it needs, resulting in lower averages. But regardless of what TDP Nvidia have given it, it will pull the same amount of power as Kepler when the situation calls for it: 300 watts, not 150.
 
It's all about heat for me, lower power consumption = lower heat output. I don't care about how much you can save at the wall, means nothing. Probably only a few quid.
What's impressive with Maxwell is they've managed to increase the efficiency by so much on the same process :)
 
All about the performance per watt for me, not overly bothered about saving on leccy, but when you get 780/290 matching performance from a 150w card, it makes me want to see what kind of performance you'd smash from a 250w+ card of the same architecture.

^ That as well.
Maxwell scaled makes me go all wiggly
 
Depends on the situation. Maxwell has the ability to adjust how much power it pulls depending on how much it needs, resulting in lower averages. But regardless of what TDP Nvidia have given it, it will pull the same amount of power as Kepler when the situation calls for it: 300 watts, not 150.

You do know what TDP is? :rolleyes:
 
It's all about heat for me, lower power consumption = lower heat output. I don't care about how much you can save at the wall, means nothing. Probably only a few quid.
What's impressive with Maxwell is they've managed to increase the efficiency by so much on the same process :)

+1

That is what makes Maxwell such a good architecture: excellent efficiency and a balance of performance / power use / heat output, and that's still on 28nm with a conservative TDP (we still haven't seen GM200 yet).

What Maxwell can achieve on a die shrink will likely make us giddy :p
 