How far could and should power consumption go?

Don | Joined: 20 Feb 2006 | Posts: 5,218 | Location: Leeds
I heard the top SKU was only 450W. With 3x the RT over RDNA2 it's looking a good option if Nvidia don't pull out all the stops this round. :)

You heard? A bloke down the pub?

Do you have a link to info on this? I've had a search and can't find that much. Three times the RT performance, and Nvidia wouldn't have to improve that greatly to stay in front, would they? I've not looked at the figures in great detail.
 
Soldato | Joined: 6 Jun 2008 | Posts: 11,618 | Location: Finland
I doubt it. I, for example, don't see how a GPU running at 60°C would live significantly longer than a GPU running at 80°C if you repeat this test over a sufficient sample size. As long as components run under their maximum temperature rating it doesn't matter. We've been doing this with CPUs for a long time, and people whose CPUs run at 40°C haven't been outliving those at 90°C, as both temperatures are within the design limits of the components.

What shortens the life span is high voltages. And yes, if components ran above their maximum rated temperature then that would affect life span, but no one will release a product that comes from the factory overheating (on purpose), because that would mean a giant lawsuit: they would get sued and buyers would get refunds for a faulty product.
Electromigration, for example, is temperature dependent and doesn't have an on/off threshold temperature:
https://www.synopsys.com/glossary/what-is-electromigration.html
And those higher temperatures mostly occur when GPUs are pushed with high voltages and currents.

Nor are the polymer capacitors used in VRMs immune to temperature. Their lifetime's dependence on temperature is just different from that of liquid electrolyte caps, but it is still shorter at higher temperatures.
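The point is that both effects scale continuously with temperature rather than switching on at some threshold. As a rough illustration only (the activation energy and the 10°C doubling step below are commonly quoted rule-of-thumb values, not figures for any specific card):

```python
# Rough sketch of why component lifetime keeps scaling with temperature rather
# than switching on/off at a threshold. Constants are illustrative rule-of-thumb
# assumptions, not measured values for any particular GPU.
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def electromigration_mttf_ratio(t_cool_c, t_hot_c, activation_energy_ev=0.9):
    """Relative MTTF from Black's equation, assuming equal current density."""
    t_cool_k, t_hot_k = t_cool_c + 273.15, t_hot_c + 273.15
    return math.exp(activation_energy_ev / BOLTZMANN_EV * (1 / t_cool_k - 1 / t_hot_k))

def capacitor_life_ratio(t_cool_c, t_hot_c, doubling_step_c=10.0):
    """'Lifetime roughly doubles for every ~10 C cooler' capacitor rule of thumb."""
    return 2 ** ((t_hot_c - t_cool_c) / doubling_step_c)

print(electromigration_mttf_ratio(60, 80))  # ~6x longer at 60 C than 80 C with these assumptions
print(capacitor_life_ratio(60, 80))         # ~4x longer under the 10 C rule
```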


So over a statistically sufficient sample size there should definitely be signs of cooler-running cards surviving longer.
The manufacturer/brander only needs them to survive the warranty period, and has zero interest in whether a card lives, say, four years instead of five.
 
Soldato | Joined: 31 Oct 2002 | Posts: 9,860
+1

I agree, the top end of the graphics card stack is completely losing its way on power consumption, heat generation and consumer price.
They have become niche halo products that should be mostly avoided and certainly not recommended.

This generation our top-end recommendation stopped at the 3080, but our most popular sellers by far have been the 3070, followed by the 3060 Ti and then the previous-gen 1660 Super.
We have sold next to no 3090s, and very few people asked after them; those that did were put off by the price.

Times change. When I was a child in 1999, I remember the absolute top-end GPUs being the Voodoo3 3000 (which I was lucky enough to own) and then the GeForce 256. Neither needed a power connector, and the TDP of the Voodoo3 3000 was just 15W. If you'd told people then that the mainstream 3080 in 2022 would draw 320W, they'd have thought that was absolutely crazy.

The same thing is happening now, made plainly apparent by the introduction of the new 600W PCIe 5.0 power connector.


IMO PC gaming has been heading towards the high-end, enthusiast, exclusive and expensive domain for a few years. This will be exacerbated by rising energy prices, making high-end PC gaming exclusively for high-end wallets. I don't think this is a good thing, but that's what's happening.

There'll be some (of the older generation) who will stick to older, dated resolutions such as 720p, 1080p and 1440p, as these don't require anywhere near as much power and still deliver a 'good' experience to them. Remember there were many who kept using black-and-white TVs long after colour was available. This isn't quite black and white (LOL), though it is a similar affair.

Of course that's nowhere near the detail/IQ that a much more expensive, power-hungry 4K HDR experience can offer, though that is undeniably what the industry is pushing towards.

Consoles aren't immune to this either; the gaming power consumption of the PlayStation has also been ramping up:


PlayStation 1: 3 Watts
PlayStation 2: 46 Watts
PlayStation 3: 120 Watts
PlayStation 4: 140 Watts
PlayStation 5: 200 Watts
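For a rough sense of scale, here is a quick sum over the figures above (these are the approximate gaming-load numbers from the list, not official spec maximums):

```python
# Generation-over-generation growth in PlayStation gaming power draw,
# using the approximate wattages quoted above.
ps_watts = {"PS1": 3, "PS2": 46, "PS3": 120, "PS4": 140, "PS5": 200}

gens = list(ps_watts.items())
for (prev_name, prev_w), (name, w) in zip(gens, gens[1:]):
    print(f"{prev_name} -> {name}: {w / prev_w:.1f}x")
print(f"PS1 -> PS5 overall: {ps_watts['PS5'] / ps_watts['PS1']:.0f}x")
```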
 
Soldato | Joined: 14 Aug 2009 | Posts: 2,752
The recent boom in electricity cost per unit only exacerbates this issue. What was once an afterthought when you were paying something like 12p a unit suddenly warrants a double take now that it's more like 29p a unit (and only looking like going up again).

Someone paying north of 1-2k/GPU doesn't care that much about the power consumption.

@Kaapstad , I may be remembering wrong, but weren't you buying top end cards in SLI at some point? :)
 
Soldato | Joined: 21 Jul 2005 | Posts: 20,018 | Location: Officially least sunny location -Ronskistats
Someone paying north of 1-2k/GPU doesn't care that much about the power consumption.

@Kaapstad , I may be remembering wrong, but weren't you buying top end cards in SLI at some point? :)

Whilst that is true, let's say for example the worst guzzler so far, a 3090 Ti in a jacked-up build, draws say 600W at the wall while gaming plus 100W for the display; you're looking at around 75p for a four-hour session. That would only have cost you maybe 30p last year.

Enough for most normal folk to sit up and take note anyhow.
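The arithmetic behind that estimate is simple enough to sketch. Assuming roughly 700W at the wall (GPU-heavy build plus display), a four-hour session, and the unit prices mentioned earlier in the thread (about 29p/kWh now versus roughly 12p/kWh before), the result lands in the same ballpark as the figures quoted above; the exact pence depend on the tariff and the real draw of the system.

```python
# Back-of-the-envelope cost of a gaming session measured at the wall.
# Assumed figures from the discussion above: ~700 W total draw, 4-hour
# session, ~29p/kWh now versus ~12p/kWh previously.
def session_cost_pence(watts_at_wall: float, hours: float, pence_per_kwh: float) -> float:
    kwh = watts_at_wall / 1000 * hours
    return kwh * pence_per_kwh

print(session_cost_pence(700, 4, 29))  # ~81p per session at current prices
print(session_cost_pence(700, 4, 12))  # ~34p at last year's prices
```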
 
Soldato | Joined: 31 Oct 2002 | Posts: 9,860
Someone paying north of 1-2k/GPU doesn't care that much about the power consumption.

@Kaapstad , I may be remembering wrong, but weren't you buying top end cards in SLI at some point? :)

Kaap's monitor has been a 60Hz model for many years (unless he's upgraded recently?), so he's not using the full TDP of the cards in gaming, unless vsync/gsync is off and he's getting tearing galore :p

Think he used to do it mostly for benchmarks and just for fun to play with them, which I can fully appreciate :)
 
Man of Honour | Joined: 21 May 2012 | Posts: 31,940 | Location: Dalek flagship
Someone paying north of 1-2k/GPU doesn't care that much about the power consumption.

@Kaapstad , I may be remembering wrong, but weren't you buying top end cards in SLI at some point? :)

Indeed I was and would do so again if the conditions were right.

The cards I ran in 4-way SLI and CrossFire were around 300W TDP. This meant they gave near-maximum performance with a reasonable cooling solution, and they did not cost ridiculous sums of money. When run in mGPU setups they gave a lot of performance at a reasonable cost.

Unfortunately Nvidia have got themselves into a situation of diminishing returns by building cards which need huge TDPs, huge dies and ridiculous triple-slot coolers.

Nvidia's upcoming cards are a bit like doing your shopping in a tank: you will get the job done and you will be pretty safe from accidents (unless you are passing through Ukraine), but the petrol consumption will be awful. I think most people would rather use something more appropriate, like a small car.

We need to get back to seeing cards rated at no more than 300W TDP which give 85-90% of the performance that these upcoming cards will offer.
 
Man of Honour | Joined: 21 May 2012 | Posts: 31,940 | Location: Dalek flagship
Kaap's monitor has been a 60Hz model for many years (unless he's upgraded recently?), so he's not using the full TDP of the cards in gaming, unless vsync/gsync is off and he's getting tearing galore :p

Think he used to do it mostly for benchmarks and just for fun to play with them, which I can fully appreciate :)

Still 60Hz, but I have changed to an LG one, as the extra room it gives me on my desktop is great.

Benchmarks were indeed the reason for doing it.

Unfortunately it is hard to beat some of the old benchmark scores with modern GPUs, as two 3090s in SLI are just not enough (provided you can even find a motherboard with enough room).

:)
 
Associate | Joined: 4 Feb 2009 | Posts: 1,368
8nm 3080 ~300W @ 47 FPS = 0.156 frames per watt
8nm 3080 ~340W @ 47 FPS = 0.138
8nm 3060 Ti ~200W @ 27 FPS = 0.135
8nm 3090 Ti ~500W @ 62 FPS = 0.124
7nm 6900 XT ~300W @ 29 FPS = 0.097
7nm 6800X ~300W @ 26 FPS = 0.087
I'm glad to see that you've now started to include actual frames per watt. But for some balance, here are some numbers for a pure raster example. I've pulled them from Guru3D like you, for The Witcher 3 (selection criteria: it's the one I have installed; suggest an alternative and I'll do the maths). Note that I'm assuming your identical 3080s are a 3080 and a 3080 Ti. Source: https://www.guru3d.com/articles_pages/asus_geforce_rtx_3090_ti_tuf_gaming_review,14.html

3080 ~300W @ 115 fps = 0.383
3080 Ti? ~340W @ 120 fps = 0.353
3060 Ti ~200W @ 68 fps = 0.340
6900 XT ~300W @ 91 fps = 0.303
6800X ~300W @ 83 fps = 0.277
3090 Ti ~500W @ 133 fps = 0.266

This goes from, at worst, half the efficiency, to a much more mixed and balanced picture. More interesting is that for raster all of the cards provide >60fps at a sane power level (or could do, if limited to 60fps). For RT, nothing (sane) manages that.
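For anyone who wants to reproduce the table, the frames-per-watt figures are just average fps divided by approximate board power. The sketch below uses the raster numbers quoted above and also makes a very crude estimate of the power each card would need if capped at 60fps, assuming efficiency stays constant; real power/fps curves are not linear, so treat that column as a rough illustration only.

```python
# Frames-per-watt from the Witcher 3 raster figures quoted above, plus a crude
# estimate of the power each card would need at a 60 fps cap (naive linear
# scaling -- real power/fps behaviour is not linear).
cards = {  # name: (approx. board power in W, average fps)
    "3080":    (300, 115),
    "3080 Ti": (340, 120),
    "3060 Ti": (200, 68),
    "6900 XT": (300, 91),
    "6800X":   (300, 83),
    "3090 Ti": (500, 133),
}

for name, (watts, fps) in cards.items():
    fpw = fps / watts
    est_watts_at_60 = 60 / fpw  # assumes constant frames-per-watt, which it won't be
    print(f"{name:8s} {fpw:.3f} frames/W, ~{est_watts_at_60:.0f} W for 60 fps")
```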
 
Soldato | Joined: 1 Apr 2014 | Posts: 18,610 | Location: Aberdeen
Cards have gone from a bare chip, to heatsinks, to heatsinks and fans, to double-slot, to triple-slot coolers.

Have you looked at CPU coolers recently? I've seen them go from bare chips to huge coolers, never mind fluid cooling. Really, I don't care about power usage while gaming. Even after the recent price increases, the cost of electricity for a gaming session is trivial.
 