
3080 + 3090 - Most power hungry cards in recent memory

Does anyone else think it's ironic that the 3080 (320W) and 3090 (350W) are the most power-hungry, highest heat-generating cards in recent memory, yet this is glossed over by the vast majority of fans, influencers and reviewers?

Does anyone else remember all the negative comments, reviews etc. when the 290X, 390X, Fury X, Vega 64 and Radeon VII launched, due to their insane power draw and heat? These GPUs were competitively priced at the time, yet were widely scorned for it.

290X = 290W TDP
390X = 275W TDP
Fury X = 275W TDP
Vega 64 = 295W TDP
Radeon VII = 300W TDP

While it's true that AMD's reference cards for most of the above (excepting the Fury X and the liquid-cooled Vega 64) had terrible STOCK coolers, there were AIB cards available with perfectly good coolers that tamed the high heat output and made for cool-running cards with great noise levels.

None of this mattered to the massive Nvidia fan army, influencers on YouTube etc., or the majority of official reviews. These cards were pounced upon and torn apart for their unacceptable power draw and heat output. As I mentioned earlier, most of these cards were competitively priced and were valid options for many, yet this didn't matter.

The point I'm making is that power draw, heat, price etc. don't matter when it's an Nvidia card; they only matter when it's an AMD card. I find this quite pathetic - it's like watching sheep follow their 'leader', the leader being the people most financially or emotionally invested in Nvidia. It's a sad state of affairs, because even if AMD were to launch a competitively priced card with similar TDP and similar performance, the Nvidia alternative would outsell it 10 to 1, due to the insane mindshare Nvidia commands. This has led us to the current price climate and the sad state of PC gaming. Consoles will continue to gain market share from PC, simply because the majority are being priced out of the market.

I think a great many people are going to be disappointed when they feel the heat these cards generate. Many buyers will underestimate the PSU requirements, as these cards draw over 100W more than previous Nvidia flagships. I see rough times ahead for many owners!

Personally, I'm going to wait for the AMD RX 6000 cards to launch, then make my decision. I'm hoping for a card performing between a 3080 and 3090, at around 295W or less. AMD have the process advantage this time around, so I think this is a real opportunity. Either way, I need an HDMI 2.1 card to enable 4K 120Hz on my CX48, so may the best vendor win!
 
I had the same thoughts about this series. Makes perfect sense why Jensen Huang was holding one in his kitchen for the reveal - he just used it to fry his eggs for lunch.
 
Highly recommend watching this if you haven't already... interesting analysis from AdoredTV. Hold on to your hard-earned until all the chips are on the table, for sure.

 
The problem I felt with AMD's power draw was that even on their lesser cards it seemed too high compared with the actual performance you got from them.
 
If it runs cool, the power draw is fine with me. Cards like the 290X had great price/performance but ran at 90°C+ with very loud fans, and no AIB cards for like six months.

If we find out the 3080 runs at 85°C+ with 3,000rpm fans then it would get similarly blasted in reviews.
 
yet this is glossed over by the vast majority of fans, influencers and reviewers?
I'm finding it interesting that some cards have a 4th little fan to attach! You know if some AMD designs did that, it would be an instant meme - a funny contrast to all the howls on release over the X570 chipset fan, which no-one even hears.
 
If it runs cool, the power draw is fine with me. Cards like the 290X had great price/performance but ran at 90°C+ with very loud fans, and no AIB cards for like six months.

If we find out the 3080 runs at 85°C+ with 3,000rpm fans then it would get similarly blasted in reviews.

:P
 
Coincidentally, we also have the most power-hungry desktop CPUs. Pretty sure the i9-10900K is (or can be, when limits are lifted) hotter than the Pentium 4-based abominations.
And Threadripper, while not strictly desktop, is up there at 250W+ actual consumption.

The problem is factory overclocking. Manufacturers prefer cards/chips pushed to the limit out of the box, not leaving much free performance on the table for us overclockers. Enter fancy boost algorithms and single-core boost voltages at what used to be LN2-level overclocks.
Yet modern silicon is much happier at reduced voltage; the sweet spot is around 0.95V.

I owned a Vega 64 and will never buy a 250W+ video card again. Yet even the Vega 64 was almost a good card when undervolted. I think AMD had a performance target of the 1080 Ti, and Raja forced them to push the power envelope to the limit.

Nvidia going above 300W is curious. There was no real need for it.
Is it due to Samsung 8nm being disappointing? GDDR6X too power hungry? Hoping to fix these things in Ti versions?
Or being genuinely scared of Big Navi?
 
Haha, yeah, looks like they run under 80°C... worth noting that Nvidia are taking a big price hit with the FE coolers. How will the AIB ones fare?
 
Haha, yeah, looks like they run under 80°C... worth noting that Nvidia are taking a big price hit with the FE coolers. How will the AIB ones fare?

I think the AIB cards will be much louder and run hotter. Nvidia used their £££ to create a fantastic stock cooler, one that's beautiful, and obviously very costly. Got to admit, it's a fantastic bit of engineering.

Obviously the 320W of heat will still be dumped into your room, though as mentioned in the OP, I just find it funny that everyone seems to shrug off the heat just because it's Nvidia!
 
It's fine, it's Nvidia. Of course if it was AMD you'd never hear the end of it, but that's par for the course.

Nvidia going above 300W is curious. There was no real need for it.
Is it due to Samsung 8nm being disappointing? GDDR6X too power hungry? Hoping to fix these things in Ti versions?
Or being genuinely scared of Big Navi?

They needed it to push the cards to the limit, presumably because they would otherwise have been underwhelming compared to previous generations. Or maybe they're worried about the opposition, or at least want to trump it before it comes out of the starting gate. Either way, they pushed the cards to the limit, hence the lack of overclocking headroom as well as the power consumption. Ampere: well named, it seems, for all the current they're drawing...
 
None of this mattered to the massive Nvidia fan army, influencers on YouTube etc., or the majority of official reviews. These cards were pounced upon and torn apart for their unacceptable power draw and heat output. As I mentioned earlier, most of these cards were competitively priced and were valid options for many, yet this didn't matter.

Conveniently glossing over the more important bit: they weren't market leaders in performance. It's acceptable to be more forgiving of a GPU that uses a lot of power (and hence generates a lot of heat) if it does something useful with it. In many cases these cards simply didn't, or were significantly less efficient than the equivalent cards from the competition.

If AMD comes along with cards that beat the 3000 series on performance while using the same or less power then that is another story.
 
Seems Nvidia wanted to save some money versus TSMC 7nm, so went with Samsung 8nm, and can't get the performance increase without massively increasing power draw.

I foresee a complete **** show once the faithful realise that 320W+ (AdoredTV expects upwards of 400W once you account for power supply efficiency) of heat being pumped into your room is rather uncomfortable.
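For anyone sizing a PSU or estimating room heat, the efficiency maths is easy to sketch. A minimal illustration in Python, assuming a 320W GPU, a ~150W rest-of-system load and ~87% PSU efficiency (all illustrative figures, not measurements):

```python
# Rough wall-socket draw for a gaming PC.
# All figures below are illustrative assumptions, not measured values.

def wall_draw(dc_watts: float, psu_efficiency: float) -> float:
    """AC power pulled from the socket to deliver dc_watts to the components.
    Nearly all of it ends up as heat in the room."""
    return dc_watts / psu_efficiency

gpu_dc = 320.0      # 3080 board power (Nvidia's spec)
rest_dc = 150.0     # assumed CPU + motherboard + drives under gaming load
efficiency = 0.87   # assumed 80 Plus Gold-class PSU at ~50-70% load

total_ac = wall_draw(gpu_dc + rest_dc, efficiency)
print(f"Estimated wall draw: {total_ac:.0f}W")  # ~540W with these assumptions
```

With different assumed efficiencies and system loads the number moves around a fair bit, which is presumably how estimates north of 400W for the GPU side alone arise.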
 
Wow, a more powerful card uses more power!

Thanks to efficiency advances, at least the power draw doesn't scale linearly with the performance, and as long as the card can be cooled, who cares?

If you're not happy, don't buy one. I'm probably sitting it out again (depends on the value/devaluation of my current card soon).

Why not put this in an existing thread instead of starting yet another?


Near the end of the Digital Foundry video.

Power usage to render a given frame (no Vega 7 on there).


 
Nvidia going above 300W is curious. There was no real need for it.

Nvidia spent a lot of time evaluating cost, availability etc. before choosing where to produce Ampere. It will be interesting to see whether this move is a commentary on TSMC 7nm - i.e. that it would have involved some delay, extra cost or poor availability.
 