How far could and should power consumption go?

With the 3090 Ti releasing at a 450W-and-greater TDP, and new power standards coming forward, I've been wondering whether a) we could and b) we should try to decouple increased power consumption from increased performance in our GPUs.

I thought that ever-decreasing process nodes meant graphics cards were supposed to get more performance out of the same power, but we seem to be going in the other direction. High end used to need 75 watts (just the PCI-E slot), then 150 watts (1 6-pin), 225 watts (2 6-pin or 1 8-pin), 300 watts (1 6-pin + 1 8-pin) and now 375 watts for 2x 8-pin.
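For reference, those tiers fall straight out of the PCI-E power limits (the spec allows 75W from the slot, 75W per 6-pin connector and 150W per 8-pin connector); a quick sketch:

```python
# PCI-E power limits that produce those tiers (per the spec:
# slot = 75W, 6-pin = 75W, 8-pin = 150W)
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

tiers = {
    "slot only": SLOT,                                   # 75W
    "slot + 1x 6-pin": SLOT + SIX_PIN,                   # 150W
    "slot + 2x 6-pin / 1x 8-pin": SLOT + 2 * SIX_PIN,    # 225W
    "slot + 6-pin + 8-pin": SLOT + SIX_PIN + EIGHT_PIN,  # 300W
    "slot + 2x 8-pin": SLOT + 2 * EIGHT_PIN,             # 375W
}

for config, watts in tiers.items():
    print(f"{config}: {watts}W")
```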

Cards have gone from a bare chip, to heatsinks, to heatsinks and fans, to double slot, to triple slot, to just about 'needing' water cooling out of the box. How far can we really take this?

TDP will affect my purchasing decisions from (in this order): a noise perspective, size of card/heatsink, PSU requirements, and energy costs. Is it just me who will care about this?
 
I think it's primarily the fact that it is getting increasingly difficult to simply churn out better performance per watt than in yesteryear. People expect newer generations of graphics cards to outperform the last, and one way of helping that along is by consuming more power.

As for the question, rightly or wrongly it's not really a consideration I pay much attention to. I probably ought to, because the total cost of ownership of a power-hungry card might be higher than that of one with a higher price tag.
E.g. a card costing £500 that performs 10% worse than a £700 one might look like good value. But if the £500 card costs £200 extra in electricity over the period I use it, then it's not good value in comparison.
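A rough sketch of that comparison (the tariff, hours and wattages here are all just illustrative assumptions):

```python
# Rough total cost of ownership: purchase price plus electricity over the
# ownership period. All figures below are illustrative assumptions.
PRICE_PER_KWH = 0.28    # assumed tariff, GBP per kWh
HOURS_PER_DAY = 3       # assumed gaming hours per day
YEARS = 4               # assumed ownership period

def electricity_cost(watts):
    """Electricity cost in GBP over the whole ownership period."""
    kwh = watts / 1000 * HOURS_PER_DAY * 365 * YEARS
    return kwh * PRICE_PER_KWH

# hypothetical cards from the example: a power-hungry £500 card vs a
# more efficient £700 one (the wattages are made up for illustration)
tco_cheap = 500 + electricity_cost(450)  # ~£1052
tco_dear = 700 + electricity_cost(300)   # ~£1068
print(f"£500 card: £{tco_cheap:.0f}, £700 card: £{tco_dear:.0f}")
```

Under those assumptions the 150W gap costs about £184 in electricity, in the same ballpark as the £200 figure, and nearly cancels out the £200 price difference.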
 
AMD has got a lot better in recent years; there isn't any need to draw maximum power in the majority of usage. It's a bit like a sports car: you don't run it at 9,000 rpm sitting in traffic :p Usability and cards overall have improved a lot over the last twenty years. If I'm remembering right, my old card just drew maximum wattage all the damn time; the latest cards are amazing compared to that.

The fastest way to cut your argument to the bone is that you can't rely on cutting-edge flagship cards to assess trends too much. The mainstream is definitely improving, and most people won't spend more than £100 on a card; OcUK is not normal :D Plus, software has become more bloated; that needs refining and improving too.
 
How far can we really take this? Good question, but 450W+ is already too much for me; it's getting crazy, and if some of the leaked pictures are accurate, the next gen will literally be the size of house bricks as well!

I thought the days of the Prescott nuclear-furnace CPUs had hit the limits of lunacy, but GPUs are about to smash that particular barrier.
 
I think it should be illegal in this day and age to sell cards with a TDP of more than 350W due to the carbon footprint.

The difference in performance between a 350W card and one with twice that TDP is tiny.

Other industries like car manufacturing are having to make huge changes to be more eco friendly.
 
Remember when high-power, inefficient vacuum cleaners were banned? I can see something similar being implemented in the future. I believe one of the American states has already brought something in.

For me it's just the amount of heat that it dumps into the room. Some of those small oil-filled radiators are 600-800W, so with the next-gen cards it will literally be like playing next to a radiator.
 
For me, the 3090 Ti is already a good step beyond what I think is reasonable. I guess the different factors include:

Carbon emissions / energy efficiency (although this may only be an issue temporarily, for say a decade, until the whole grid is nuclear + renewable (a man can dream...)): anything over a few hundred watts seems a bit wasteful to me considering the good experience you can have with a ~300-350W RTX 3080 or 6900 XT. It's worth noting that GPU power usage is still very, very small change compared to other things like heating a house, driving a car, etc.

Cost of electricity. Possibly a conservative estimate, but let's say you game an hour a day: 365 hours per year. Let's set prices at 28p/kWh. Every 100W of extra power consumption would then cost you about £10/year, so a 3090 Ti would cost roughly £20/year more to run than a 3080 (see the sketch at the end of this post). You can scale that up or down depending on whether you game more or less. In some ways that's still not very much, but it does add up over time. It's probably not a high enough cost to actually change most people's buying decisions, but it might subtly affect how you view high-power cards.

Keeping your PC cool. 500W can really significantly affect the air temperature inside even a well-ventilated case. Keeping it cool will mean noisier fans, and all your RAM etc. will still be getting roasted unless you get a water-cooled model. Again, 300W is high but seems like a much more comfortable level to me.

Heating the room up. In the warmer months of the year, even a 300W GPU plus say another 100W from the CPU feels like a lot; that's already enough to get a room noticeably warmer. Basically, the lower the better for comfort purposes. The only ways to combat that would be to keep the PC in another room or install air con, both of which are a hassle and not possible for everyone, and installing and running air con would add massively to the cost.

PSU capacity: a relatively small cost compared to the GPU itself, but having to get a 1000W PSU rather than, say, a 750W one is still an extra cost to factor in.

Basically, overall I feel like 300W is a high but still reasonably acceptable level; going higher than that, you start to get a lot of trade-offs for relatively little gain.
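To sanity-check the cost-of-electricity maths above, a tiny sketch (same assumptions: one hour of gaming a day, 28p/kWh):

```python
# Sanity check of the figures above: one hour of gaming a day at 28p/kWh
def annual_cost(extra_watts, hours_per_year=365, price_per_kwh=0.28):
    """Yearly electricity cost in GBP for some extra power draw."""
    return extra_watts / 1000 * hours_per_year * price_per_kwh

print(f"£{annual_cost(100):.2f}")                          # £10.22 per 100W/year
print(f"£{annual_cost(100, hours_per_year=3 * 365):.2f}")  # £30.66 at 3h/day
```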
 
For high-power cards (and CPUs combined), you'll need AC to cool the room, so for what it's worth the 'cost of ownership' would be higher in the summer and probably late spring/early autumn. If it's a cold year, it could compensate during winter! :D
Enthusiasts probably won't care much (performance is king), but the rest should still get decent cards in the 200-250W range.
 
Power usage has always been something I've cared about, and I've been watching the trend towards throwing it out the window over the last decade. We seemed to be going in the right direction, and then a few years ago there was a complete U-turn back the other way. I'd personally be happy with each generation of CPUs and graphics cards being only marginally higher performance if power usage started to be addressed and was more of a focus.
 
For me, the 3090 Ti is already a good step beyond what I think is reasonable. [snip...]

This.

But as long as people keep buying (and are willing to pay two grand), I doubt Nvidia/AMD will care. I find it hard to see much higher wattages being viable, though; they'll make the requirements for the rest of the system pretty heavy.
 
With the 6900 XT being a 300-watt card with 90% of the performance of the 3090 Ti, I don't think 500-watt GPUs are inevitable.

AMD have managed to get performance up to this level with reasonable power consumption; Nvidia haven't, and Jensen, being who he is, will not concede the performance crown at any cost, even if that cost is insane power levels.

If RDNA3 is another big step up in efficiency and performance, Jensen will push his cards to 750 watts or 1,000 watts; it doesn't matter.

With that in mind, now ask people if they care about power consumption. They don't.
 
My view on battery backups could be skewed because I live in Florida and have a lot of power flickering during the summer, but a good 900-1000W sine-wave UPS is well over $200 nowadays.

My 3080 Ti caused my racing sim to throw overload alarms on my 900W CyberPower unit.

The 1500W unit I upgraded to cost over $500! That's just for a battery backup.

The cost of a more expensive PSU is one issue, but the direction Nvidia is taking us will also add significant cost for those of us who want to protect our equipment with battery backup systems.

The alarm going off on my 900W UPS was my wake-up call that things are getting out of hand.
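As a rough illustration of why high-draw systems push you into bigger UPS units, a sketch using the common rule of thumb of keeping continuous load under ~80% of the unit's watt rating (the component wattages below are assumptions, not measurements of this rig):

```python
# Rough UPS sizing using the common rule of thumb of keeping continuous
# load under ~80% of the unit's watt rating. All wattages are assumptions.
HEADROOM = 0.8

def min_ups_watts(*component_watts):
    """Smallest UPS watt rating that keeps the load within headroom."""
    return sum(component_watts) / HEADROOM

# hypothetical sim rig: 400W GPU peak, 200W CPU, 100W monitors/peripherals
print(min_ups_watts(400, 200, 100))  # 875.0 -> a 900W unit is already marginal
```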
 