
Why do people care about power consumption when it comes to AMD but not Nvidia/Intel?

Yet Vega comes out on top in all recent games so far lol. The 1080 was faster on release; not any more.

There is an interesting take there: if you focus just on the latest AAA releases (which might be all someone plays), the Vega 64 is often faster now, in some cases with big gains over the 1080, like 20-30%. But if you look at a wider range of games, say 50+ titles, the 1080 comes out around 5% on top, likewise with some bigger gains over the Vega 64 in some titles.

EDIT: Or, like the video above, if you look at a few popular new titles, not necessarily AAA games, then there is mostly nothing much in it between the two.
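To make the subset point concrete, here is a minimal sketch (in Python, with made-up fps figures, not real benchmark data) of how a card can lead clearly in a handful of new releases yet land at rough parity on the geometric mean across a wider suite:

```python
from math import prod

# Hypothetical (Vega 64 fps, GTX 1080 fps) pairs - illustrative only,
# not real benchmark results.
new_aaa = [(92, 74), (81, 68), (60, 55)]             # Vega-friendly new releases
wider_suite = new_aaa + [(110, 125), (70, 82),
                         (95, 99), (58, 66),
                         (144, 151)]                 # plus older/other titles

def mean_ratio(results):
    """Geometric mean of the Vega-to-1080 fps ratios across a set of titles."""
    ratios = [vega / gtx for vega, gtx in results]
    return prod(ratios) ** (1 / len(ratios))

print(f"New AAA subset only: Vega 64 at {mean_ratio(new_aaa):.0%} of the 1080")
print(f"Wider suite:         Vega 64 at {mean_ratio(wider_suite):.0%} of the 1080")
```

With these invented numbers the subset shows a healthy Vega lead while the wider suite comes out dead level, which is exactly why the headline conclusion depends on which games you sample.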

Lower power consumption means less heat to deal with, less beefy PSUs required, etc. It's a bloody no-brainer that lower consumption should be preferable, but people need to stop banging on about the cost of raw power consumption, because it's a poor argument and one that thankfully few people try to make. Stop assuming that people who mention consumption are concerned about the electricity costs; that is rarely ever the case.
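For what it's worth, here's a back-of-envelope sketch of why the raw electricity-cost argument is so weak. Every input is an assumption for illustration (a ~100w draw difference, a UK-ish unit price, a fairly heavy gaming schedule), not a measured figure:

```python
# Back-of-envelope extra electricity cost of a hungrier card.
# All inputs are assumptions for illustration, not measurements.
extra_watts = 100        # assumed extra draw under gaming load
hours_per_week = 20      # assumed weekly gaming hours
price_per_kwh = 0.15     # assumed unit price in GBP; varies by tariff

extra_kwh = extra_watts / 1000 * hours_per_week
weekly_cost = extra_kwh * price_per_kwh
print(f"~{extra_kwh:.1f} kWh/week extra, roughly £{weekly_cost:.2f}/week "
      f"(£{weekly_cost * 52:.0f}/year)")
```

Even doubling those assumptions leaves the running-cost difference small next to the price gap between the cards themselves, which is why heat and PSU requirements are the more sensible complaint.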

Amen to that. Especially if you are considering a multi-GPU setup at the top end for 4K or whatever, the less heat and the lower the power supply requirement the better, particularly once you move into the territory where a 1kW+ unit is ideal, where the good power supplies are often a big jump in price and/or poorly available.
 
What?

A 1080 is a 180w card which trades blows with Vega 64.
A 1080 Ti is a 250w card and beats Vega 64 cold.

My undervolted 1080 Ti hits 1850MHz (vs a max OC of 2050MHz) and only sips 220w or less.

Where you are getting 300w for a Pascal card from, I have no idea...

I came here to say this as well. When people have to actually make up statistics, it says a lot. A full GTX 1080 system with an 8600K or whatever would draw around 300w under load.
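As a rough sanity check on that figure, here's a component-sum sketch (nominal, assumed power numbers per part; real draw varies with load, board and settings):

```python
# Rough full-system load estimate from nominal per-component power figures.
# The numbers are ballpark assumptions, not measurements.
components = {
    "GTX 1080 (board power)": 180,
    "i5-8600K under gaming load": 75,
    "Motherboard / RAM / storage": 35,
    "Fans and peripherals": 15,
}
total_watts = sum(components.values())
print(f"Estimated system draw under gaming load: ~{total_watts}w")  # ~305w
```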

Also, all the talk of undervolting and whatnot is largely irrelevant. 90% of people, if not more, will never do this, nor want to have to do it; out of the box, Vega is a massive power guzzler, that's all there is to it, and it's just not great value for money for the average consumer. (I bought a Vega 64 myself.)

People care less about Intel power draw because it's generally fairly insignificant in the wider picture. If an 8600K needed to guzzle 100w of extra power just to match or slightly beat a 2600X or whatever, then yeah, people would have a point... but it doesn't, so they don't.

Reminds me of the AMD FX 9-series monsters... needing to gulp down about 250w of power, under water, just to match a 60w i3 in some games, while being slower than an 80w i5 in pretty much everything.
 
What?

A 1080 is a 180w card which trades blows with Vega 64.
A 1080 Ti is a 250w card and beats Vega 64 cold.

My undervolted 1080 Ti hits 1850MHz (vs a max OC of 2050MHz) and only sips 220w or less.

Where you are getting 300w for a Pascal card from, I have no idea...

I could be wrong here, but I *think* the confusion is due to him quoting the TDP his Vega was set to, while quoting what the 1080 was actually drawing. Though that still sounds off, as an AIB 1080 will only peak around the 250w mark, not 300 lol.
 
What?

A 1080 is a 180w card which trades blows with Vega 64.
A 1080 Ti is a 250w card and beats Vega 64 cold.

My undervolted 1080 Ti hits 1850MHz (vs a max OC of 2050MHz) and only sips 220w or less.

Where you are getting 300w for a Pascal card from, I have no idea...

But then look at the price of a 1080/1080 Ti compared to Vega. Once you factor that in, any tiny savings made on power are wiped out and it looks poor value.

Then don't forget the Nvidia tax for G-Sync, because they refuse to adopt Adaptive-Sync. So that's another £150-200 on top of it.
 
But then look at the price of a 1080/1080 Ti compared to Vega. Once you factor that in, any tiny savings made on power are wiped out and it looks poor value.

Then don't forget the Nvidia tax for G-Sync, because they refuse to adopt Adaptive-Sync. So that's another £150-200 on top of it.

No one is talking about the "cost saving" of power draw (or if they are, they are massively missing the point). The fact is that manufacturers don't want cards that exceed 300w: heat, power and noise all factor into that. People want faster cards, and it's really difficult to deliver a faster card when the underlying architecture is significantly less power-efficient without hitting heat/power/noise limits. Just look at how AMD have been forced to include watercoolers on their reference cards to see how far off the mark they currently are. AMD have skipped 12nm and put all their effort into 7nm, so hopefully it will be good... they can't get away with just die-shrinking GCN yet again.
 
You're just talking about gaming performance, which relies on the game actually being optimised properly. AMD cards have a lot of raw power and huge memory bandwidth, which is why they are preferred for mining and non-gaming tasks.
 
People rationalise after the fact. The marketing teams do all the work to get the individual to want the product and brand, and then the individual grasps onto whatever excuse they can to justify an already (sub-consciously) made decision. Back in the day it was "but muh AMD drivers"; now it's "oh no, my energy bill will go up by £1 a week!". If things equalise, it will be some sort of subjective factor: "the colours don't match my build", "the card is too heavy", etc.

Understand that people almost never make rational decisions and this will bother you a lot less. We are just animals after all.
 
If you've got a load of cards mining, power consumption is an issue. For most gamers, I bet they don't even read the 'power consumption' section of reviews. It is so far down the list of priorities, after price, general performance, specific performance in the games you play, overclockability, etc., that it's the last section in reviews and gets the least hits; most people don't even read it.
 
I don't care either way. I buy either the best value or whatever meets my needs. Lower power consumption is preferable, of course, but it isn't the priority.

Funnily enough, the same sort of people who moan about AMD power consumption didn't care about power consumption when the GTX 480 came out. It's funny how that works, isn't it?
 
Nvidia got a lot of flak for power consumption with Fermi, and the FX fiasco before that.

I don't see any double standards. Vega uses a lot of power but doesn't offer the performance to justify it. There is no double standard there. At least with Fermi the performance was there, just about.

Nvidia got flak because Fermi was half the card they promised. Performance was terrible.
 
Funnily enough, the same sort of people who moan about AMD power consumption didn't care about power consumption when the GTX 480 came out. It's funny how that works, isn't it?

Except this just isn't true! There were loads of memes about the Fermi heat and power use.

EDIT:
I dug this out for you. All things considered, the card has actually aged pretty well:

https://www.youtube.com/watch?v=AMbX7E6ozag
 
Except this just isn't true! There were loads of memes about the Fermi heat and power use.

Indeed, but also at the time it didn't lag significantly behind the competition on performance, a point that isn't particularly convenient for some posters to include in the equation, I see.
 
Except this just isn't true! There were loads of memes about the Fermi heat and power use.

Because its performance was terrible. If the card had hit its performance target, nobody would have cared about the power use or temps. Its heat/power use and poor HSF just added to the failure.
 