
How far could and should power consumption go?

Soldato
Joined
26 May 2014
Posts
2,958
The 3090 Ti is already way past what I'd ever consider running. 500W+ for a GPU is insanity. I've owned many 250-300W cards in the past and even those made my room unreasonably warm, before you get to the cost aspect. I wouldn't even want to go back to that sort of level really.
 
Associate
Joined
28 Mar 2017
Posts
334
Location
Lincoln
I think it should be illegal in this day and age to sell cards with a TDP of more than 350W due to the carbon footprint.

I share the same thought, at least for the domestic market.


Modern day puritanism.

If it bothers you so much you should stop playing games on your PC completely; it's not like you need to do it, humanity survived millions of years without computer games.


On topic: yes, it's getting out of hand. I've always been a fan of the quiet, efficient PC and I've tried to squeeze the best performance per watt out of a particular power envelope. Having said that, my current PC is a beast (5950X/3090), but it is at least watercooled and undervolted so it's nice and quiet. Without the undervolt it shuts down randomly, however, because I think it goes over the UPS limit...
 
Permabanned
Joined
31 Aug 2013
Posts
3,364
Location
Scotland
Modern day puritanism.

Or perhaps just good old fashioned common sense.

If it bothers you so much you should stop playing games on your PC completely; it's not like you need to do it, humanity survived millions of years without computer games.

While my gaming is ~2 hours per day, the rest of my PC usage is mainly work-related software development. I'm certainly not one of these self-entitled arses that mines.

On topic: yes, it's getting out of hand. I've always been a fan of the quiet, efficient PC and I've tried to squeeze the best performance per watt out of a particular power envelope. Having said that, my current PC is a beast (5950X/3090), but it is at least watercooled and undervolted so it's nice and quiet. Without the undervolt it shuts down randomly, however, because I think it goes over the UPS limit...

I do like to keep electronics running below default, hence my undervolted 3080 draws up to 300W and my 12900K is also PL2-limited as I don't require the full all-core performance. In a Fractal Torrent under the desk, my PC is silent with default fan profiles without having to resort to water. I'd recommend Alder Lake as it's surprisingly efficient - https://www.techpowerup.com/review/intel-core-i9-12900k-alder-lake-tested-at-various-power-limits/
 
Soldato
Joined
6 Feb 2019
Posts
17,617
The 3090 Ti released with a TDP of 450W and greater, and with new power standards coming forward I've been wondering whether a) we could and b) we should try to decouple increased power consumption from increased performance in our GPUs.

I thought that ever-decreasing nodes meant graphics cards were supposed to deliver more performance for the same power, but we seem to be going in the other direction. High end needed 75 watts (just the PCI-E bus), then 150 watts (one 6-pin), 225 watts (two 6-pin or one 8-pin), 300 watts (one 6-pin + one 8-pin) and now 375 watts for two 8-pins.
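Those connector figures follow a simple additive pattern: 75W from the slot plus a budget per auxiliary connector. A quick sketch of that arithmetic (`board_power_limit` is a hypothetical helper, using the commonly quoted 75W slot / 75W 6-pin / 150W 8-pin figures, not counting the 12VHPWR connector):

```python
# Nominal power budget per the commonly quoted PCI-E limits:
# slot = 75 W, 6-pin connector = 75 W, 8-pin connector = 150 W.
SLOT_WATTS = 75
CONNECTOR_WATTS = {"6-pin": 75, "8-pin": 150}

def board_power_limit(connectors):
    """Total nominal budget: slot power plus each auxiliary connector."""
    return SLOT_WATTS + sum(CONNECTOR_WATTS[c] for c in connectors)

print(board_power_limit([]))                  # bus only -> 75
print(board_power_limit(["6-pin"]))           # -> 150
print(board_power_limit(["8-pin"]))           # -> 225
print(board_power_limit(["6-pin", "8-pin"]))  # -> 300
print(board_power_limit(["8-pin", "8-pin"]))  # -> 375
```

This reproduces every step in the list above; note a 500W+ card like the 3090 Ti is already past what two 8-pins nominally provide.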

Cards have gone from a bare chip to heatsinks to heatsinks and fans to double slot to triple slot to just about 'needing' water cooling out the box. How far can we really take this?

TDP will affect my purchasing decisions from (in this order) a noise perspective, size of card/heatsink, PSU requirements, energy costs. Is it just me who will care about this?


Power consumption per frame has been going down each generation, but the improvement is getting smaller and smaller because the node gains are so minor. For example, the last GPU generation with a great efficiency jump was when we went from 28nm to 12nm, a massive shrink. Now we are going from 7nm to 5nm and 5nm to 4nm; these are very minor steps, so the only way to deliver the extra performance gamers expect is to push up the power limit.
 
Man of Honour
Joined
21 May 2012
Posts
31,940
Location
Dalek flagship
Performance could be almost double for multi GPU solutions, so if they make it work for MCM designs the performance is there for those willing to buy into that.

Even for a multi GPU solution my point is still valid.

The performance difference between using 350W and 700W on a MCM card would not be huge.

You only have to look at all the people who undervolt their cards now to see that; what works for a single GPU will also work for an MCM solution.
 
Soldato
Joined
14 Aug 2009
Posts
2,820
Even for a multi GPU solution my point is still valid.

The performance difference between using 350W and 700W on a MCM card would not be huge.

You only have to look at all the people who undervolt their cards now to see that; what works for a single GPU will also work for an MCM solution.

In an ideal situation SLI would scale at around 90%, give or take. So if a "chip" at 350W gave you 60fps, two of them would give ~108fps at 90% scaling.

Double the power, but also almost double the performance. People still bought two cards, or even three or four in a select few cases, when SLI was a thing. Ergo, a 500-600W MCM card would be about the same as two cards in SLI.
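The arithmetic above can be sanity-checked in a few lines (`scaled_fps` is my own hypothetical helper, not anything from the thread; it treats 90% as efficiency relative to perfect doubling, which is how the 60fps → ~108fps figure works out):

```python
def scaled_fps(single_fps, n_gpus, efficiency=0.90):
    """fps for n GPUs at a given scaling efficiency (1.0 = perfect scaling)."""
    if n_gpus == 1:
        return float(single_fps)
    return single_fps * n_gpus * efficiency

fps_two = scaled_fps(60, 2)                 # two 350 W chips
watts_per_fps_one = 350 / scaled_fps(60, 1)  # single-chip efficiency
watts_per_fps_two = (2 * 350) / fps_two      # dual-chip efficiency

print(fps_two)                          # -> 108.0
print(round(watts_per_fps_one, 2))      # -> 5.83 W per fps
print(round(watts_per_fps_two, 2))      # -> 6.48 W per fps
```

The last two lines show the trade-off in the post: performance nearly doubles, but watts per frame still creep up slightly at the imperfect ~90% scaling.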
 
Associate
Joined
3 May 2021
Posts
1,228
Location
Italy
If efficiency cannot be increased (diminishing returns from smaller nodes), the only solution is to split the problem:

- either you go towards multiple cards for different rendering aspects (one for raster, one for RT etc) which will not necessarily decrease the power usage but will make cooling the components easier
- or you go towards a self contained external GPU module with its own separate PSU and (likely liquid) cooling
- or you externalize the problem to the cloud (which is what big tech wants you to do)

The ideal answer would instead be to optimize the software to require fewer resources, but that means either increased game cost or reduced quality at the same price, so instead we got "cheating" via upscalers.
 
Caporegime
Joined
12 Jul 2007
Posts
40,632
Location
United Kingdom
I think we're now approaching a time where some people will seriously start to take notice of high total/average power draw for the top performing graphics cards. Especially given the soaring energy prices now in effect from today.

I've already started using ECO mode on my 5950X (along with curve optimiser), dialled my memory back to 3600MHz CL16 and 1.35V, and reduced my 6900 XT clocks back to 2500MHz so it runs at 250W or thereabouts under 100% load to try and save some energy and reduce costs a little.
 
Associate
Joined
4 Oct 2017
Posts
1,221
I think we're now approaching a time where some people will seriously start to take notice of high total/average power draw for the top performing graphics cards. Especially given the soaring energy prices now in effect from today.

Yeah, power draw isn't going to keep increasing. AMD are already better than Nvidia in that regard; if the next gen of cards has ridiculous power draw while AMD's is lower and offers better performance, Nvidia will take notice.

The issue currently is that Nvidia offer better ray tracing performance and better performance at high resolutions. They are also better for mining etc, so anyone with those priorities in mind goes to Nvidia regardless of power draw.

If (or rather when) AMD offer similar or even better performance, the market will change the way the CPU market did, with fewer people looking towards Intel.
 
Soldato
Joined
18 Feb 2015
Posts
6,485
I think the absolute maximum it could go is around 750W, just because of how difficult it becomes to actually cool, particularly given the insane density of the chips. Unlikely to see it go past 600W though, imo.

As for should: it should go all the way to the max; if the market bears it, why not? Environmental concerns are so irrelevant it's laughable; there's no mystery as to how to get enough energy for our needs in a sustainable way, it's just a matter of will. Besides, such cards are generally luxury products and don't sell in high numbers at all, so their effect on anything is not going to even register. The personal responsibility angle for the climate conscious is just pure propaganda anyway; if you study the real numbers it's very clear that reductions by the key major industrial and military players matter most. The rest is just hokum.
 
Soldato
Joined
18 Oct 2002
Posts
7,847
Location
Scun'orp
It certainly has put the kybosh on me wanting to eke out performance by overclocking any more, now that I have a 3080. It simply is not necessary given the base performance, and not worth the extra leccy. Before, banging up the core and memory would be the first thing I would do with previous cards.
 
Soldato
Joined
6 Feb 2019
Posts
17,617
Yeah, power draw isn't going to keep increasing. AMD are already better than Nvidia in that regard; if the next gen of cards has ridiculous power draw while AMD's is lower and offers better performance, Nvidia will take notice.

The issue currently is that Nvidia offer better ray tracing performance and better performance at high resolutions. They are also better for mining etc, so anyone with those priorities in mind goes to Nvidia regardless of power draw.

If (or rather when) AMD offer similar or even better performance, the market will change the way the CPU market did, with fewer people looking towards Intel.


And better streaming by far. Saw some benchmarks today on Reddit testing across a range of bitrates and settings, and an RTX 2060, which is a last-gen entry-level card, beats the best AMD has, the 6900 XT.


The situation with streaming is even worse than with ray tracing; if you do any streaming then you are just about forced to go Nvidia.
 
Associate
Joined
4 Oct 2017
Posts
1,221
And better streaming by far. Saw some benchmarks today on Reddit testing across a range of bitrates and settings, and an RTX 2060, which is a last-gen card, beats the best AMD has, the 6900 XT.


The situation with streaming is even worse than with ray tracing; if you do any streaming then you are just about forced to go Nvidia.

Ah yes, I forgot about streaming, so yet another reason why Nvidia can, to a degree, do what they want, especially at the high end.

People who just want the absolute best don't tend to care about those factors, although I appreciate they make up a small percentage of the market.
 