How far could and should power consumption go?

I think the absolute maximum it could go is around 750W, just because of how difficult it is to actually cool at that point, particularly due to the insane density of the chips. Unlikely to see it go past 600W though, imo.

I'm not even sure that's possible - just look at the EVGA card Gamers Nexus reviewed - it's a four-slot chonky beast of a GPU that is 90% fin stack for 500W of cooling - that heat's gotta go somewhere and we're reaching the limits of what's possible (liquid cooling notwithstanding).
 
Yeah, power draw isn't going to keep increasing. AMD are already better than Nvidia in that regard; if the next gen of Nvidia cards have ridiculous power draw while AMD draw less and offer better performance, Nvidia will take notice.

The issue currently is that Nvidia offer better ray tracing performance and better performance at high resolutions. They are also better for mining etc, so anyone with those priorities in mind goes to Nvidia regardless of power draw.

If (or rather when) AMD offer similar or even better performance, the market will change the way it did in the CPU market, with fewer people looking towards Intel.

The only reason that AMD seems better at the moment is that they are using a smaller, much more power-efficient node. If both companies were on the same node, I doubt there would be much, if anything, between them.

And you are forgetting the real problem: die shrinks no longer offer the performance increases they used to. The only way to increase performance going forward is to push more power. There is no way around it, and it isn't just an Nvidia problem - next-gen GPUs from both companies are rumoured to be over 400W.

Who here would buy a next-gen GPU with the same performance as last gen, but with reduced power consumption?

So be prepared for increased power use, at least until they come up with a new way of designing GPUs or push us all into cloud gaming.
 
Not many would change if all that was offered was lower power consumption, as you say, but I'm OK with increased power consumption as long as I can keep the temps in check.

I tend to undervolt mine anyway, as running core clocks of over 2000MHz offers minimal gains in games, so I run at 1800MHz at around 60°C and it's fine.

I think that would be my approach on newer cards too as there would still be gains to be had.
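As a rough back-of-the-envelope sketch of why that works (nothing here is from the thread - the clock and voltage points are made-up illustrative numbers, and it assumes dynamic power scales roughly with frequency times voltage squared):

```python
# Rough illustration of why undervolting/downclocking saves so much power.
# Assumes dynamic power scales roughly as P ~ f * V^2. The clocks and
# voltages below are made-up example points, not measurements of any card.

def relative_dynamic_power(freq_mhz: float, volts: float) -> float:
    """Unitless number proportional to dynamic power (f * V^2)."""
    return freq_mhz * volts ** 2

stock = relative_dynamic_power(2000, 1.05)        # hypothetical stock point
undervolted = relative_dynamic_power(1800, 0.85)  # hypothetical undervolt point

print(f"Clock drop: {100 * (1 - 1800 / 2000):.0f}%")
print(f"Estimated dynamic power drop: {100 * (1 - undervolted / stock):.0f}%")
```

Under those assumed numbers, a ~10% clock drop paired with a lower voltage point cuts dynamic power by roughly 40%, which is why the FPS cost of undervolting is usually far smaller than the power saving.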
 
That's something I wonder about too. Will there be any savings from undervolting, especially on an MCM GPU?
 
With very high power consumption cards I'd be worried about their longevity - I'm thinking they probably wouldn't be able to cool all the components efficiently, so their lifespan would be shortened.
 
Great news for Nvidia - as long as it lasts the length of the warranty and then you have to buy another one! They don't care about long product life as long as they don't have to replace it within warranty.
 
I doubt it. I, for example, don't see how a GPU running at 60°C would live significantly longer than a GPU running at 80°C if you repeat this test over a sufficient sample size. As long as components run under their maximum rated temperature, it doesn't matter. We've been doing this with CPUs for a long time, and people whose CPUs run at 40°C haven't had them outlive ones running at 90°C, as both temps are within the design limits of the components.

What shortens the lifespan is high voltage. And yes, if components ran above their maximum rated temperature, that would affect lifespan, but no one will release a product that comes from the factory overheating (on purpose), because that means a giant lawsuit - they would get sued and buyers would get refunds for a faulty product.
 
If one company or the other can lay claim to being 2fps faster at the cost of 50+ more watts, power is gonna continue to increase. Pretty much where we are with the 3090 Ti.
 
I'm not sure I'd go so far as illegal, but personally I wouldn't buy a card over 350W.


And what footprint? Some countries get nearly 100% of their power generation from zero-carbon sources. Just because Europe can't fix its carbon BS doesn't mean the rest of the world should suffer.
 
They still have to pay their electricity bill.

The real problem here is Nvidia selling poorly designed power hungry cards.

If they stuck to a TDP of around 300W - 350W, we would still get most of the performance and a decent cooling solution at a fraction of the cost.

The truth of the matter is Nvidia are planning to sell us very poorly designed cards at ridiculous prices just to try and stay 2fps ahead of anything AMD can produce.
 
I don't think it matters at the top end tbh, as I can't imagine you're buying with kWh cost whilst gaming in mind. The 3060 Ti is very efficient until it's going flat out, and even then it's only around 200W - can't see the 4060 Ti being any different.
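For what it's worth, the running cost is easy to ballpark. A minimal sketch (all the inputs - 200W vs 450W cards, 20 hours of gaming a week, 0.30 per kWh - are assumed illustrative numbers, not figures from this thread):

```python
# Ballpark the annual electricity cost of a GPU while gaming.
# Every input below is an illustrative assumption, not a figure from the thread.

def annual_cost(gpu_watts: float, hours_per_week: float, price_per_kwh: float) -> float:
    """Annual electricity cost attributable to the GPU alone while gaming."""
    kwh_per_year = gpu_watts / 1000 * hours_per_week * 52
    return kwh_per_year * price_per_kwh

HOURS_PER_WEEK = 20    # assumed gaming time
PRICE_PER_KWH = 0.30   # assumed unit price, in your local currency

for watts in (200, 450):
    print(f"{watts}W card: ~{annual_cost(watts, HOURS_PER_WEEK, PRICE_PER_KWH):.0f} per year")
```

At those assumptions it works out to roughly 62 a year for a 200W card versus about 140 for a 450W one - a real gap, but small next to the purchase price of a high-end card.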
 
I've said this many times already: Nvidia are using a far less efficient node to compete, hence power usage is higher. Was this a poor design choice or just good business sense? It certainly helped keep costs down and supply up. There is nothing stopping you, an end user, from restricting the power a modern Nvidia GPU uses; indeed, my own 3080 maxes out at 300W using a custom curve. I've also shown in another thread that Ampere, when running modern RT workloads, is far more efficient than the competition. Here I'll add a few more entries:

8nm 3080 ~300W @ 47FPS = 0.157 frames per watt
8nm 3080 ~340W @ 47FPS = 0.138
8nm 3060Ti ~200W @ 27FPS = 0.135
8nm 3090Ti ~500W @ 62FPS = 0.124
7nm 6900XT ~300W @ 29FPS = 0.097
7nm 6800X ~300W @ 26FPS = 0.087

I'd like to see how well a tamed, ~400W 3090Ti performs.
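For anyone who wants to sanity-check or extend those numbers, a trivial sketch that just redoes the frames-per-watt division from the FPS and wattage figures quoted above:

```python
# Reproduce the frames-per-watt arithmetic (FPS / watts) from the post above.
cards = [
    ("8nm 3080 ~300W", 300, 47),
    ("8nm 3080 ~340W", 340, 47),
    ("8nm 3060Ti",     200, 27),
    ("8nm 3090Ti",     500, 62),
    ("7nm 6900XT",     300, 29),
    ("7nm 6800X",      300, 26),
]

for name, watts, fps in cards:
    print(f"{name:16s} {fps / watts:.3f} frames per watt")
```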
 
+1 to the point about power-hungry cards at silly prices.

I agree, the top end of the graphics card stack is completely losing its way on power consumption, heat generation and the consumer price tag.
They have become niche halo products that should mostly be avoided and certainly not recommended.

This generation our top-end recommendation stopped at the 3080, but our most popular sellers by far have been the 3070, followed by the 3060 Ti and then the previous-gen 1660 Super.
We have sold next to no 3090s, and very few people asked after them; those that did were put off by the price.
 