Nvidia 480GTX's power usage lie...

Anyone else starting to think Nvidia is just full of BS? You'd need to run a power station in your house.

I'm still looking towards the 5850 or 5870.

Although, the name of the site I quoted from doesn't help...

IF YOU WANT to learn about Nvidia's Tesla and GTX480 cards at GDC, don't ask Nvidia; it has problems with the truth. The real story is found with the users, and they have interesting things to say about the upcoming card's upward-bound TDP.

If you recall, the official story is that the card, in its cut-down and underclocked version, pulls 225W. That number, along with stunningly poor performance, has led to some notable backpedaling. If that isn't bad enough, some sources at GDC told SemiAccurate that Nvidia jacked up the TDP by 50W last week without warning.

We will be the first to admit we were wrong about the TDPs of the cards. At CES we said the GTX480s shown there were pulling 280W, something Nvidia vehemently denied. Engineers beavering away at the things Dear Leader thinks are important, like the style of the wheels on his Ferrari, have been pulled off to work on cards for some unfathomable reason. Working hard, they have managed to reduce the TDP of the cards by 5W, to 275W. Yeah, Nvidia has finally admitted that the card is the burning pig anyone who has used one knows it is.

There are two problems here, one internal and one external. The internal one is that this is a big flag saying Nvidia admits defeat and has no hope of fixing the problems that plague the chip. Nvidia can't get the power down to a reasonable level, and that is the end of it. The only way to get chip yields up to salable quantities is to jack the power through the roof to mask the broken architecture, so that is what it is doing.

More problematic is the external one: what about the OEMs? Officially raising the TDP by a very substantial 50W three weeks before launch is massively stupid. Nvidia just can't do this to OEMs without causing them lots of pain. For high-end desktops with lots of space, it can be worked around, but if the system is a little closer to the edge, 20+ percent more TDP can have a profound and negative effect on cooling.

Even worse, think about all the companies that make Fermi based Tesla cards. If you put four in a system, and Nvidia jacks TDP 50W per card, that is 200W more you have to dissipate. Three weeks before launch, your cases are built and in a warehouse, your cooling system is finished, and you don't have time to change things, much less test them. 200W is a lot in a 2U server case. 21 of these in a 42U rack is an added 4.2kW that you need to dissipate, roughly 3 hair dryers on full blast.

Then there are the laptops. I feel bad for those guys: first Bumpgate, now this. There is no way you can redesign a laptop cooling system in six months, much less three weeks. Silly ODMs, no cookie, but you will probably be blamed by Nvidia PR for 'screwing up' so badly.

In the end, Fermi is turning into one long bad joke. You have to wonder how many high-margin orders will be shown the door when word of this leaks out. Nvidia might be "Oak Ridged" a few more times yet.

http://www.semiaccurate.com/2010/03/12/semiaccurate-wrong-about-nvidia-480gtx-power-use/
 
Seriously though:

480GTX at 275W compared to 190W on a 5870, and the only area where the 480GTX can outperform it is tessellation.
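For anyone who wants to sanity-check the numbers in that article, here's a quick back-of-the-envelope sketch. The TDPs are the rumoured/leaked figures from the thread, not confirmed specs, and the ~1500W hair dryer draw is my own assumption:

```python
# Rough check of the power figures in this thread.
# All TDPs are rumoured/leaked numbers, not confirmed specs,
# and the ~1500 W hair dryer draw is an assumption of mine.

GTX480_TDP_W = 275    # rumoured TDP after the 50 W bump
HD5870_TDP_W = 190    # ATI Radeon HD 5870 board power
OFFICIAL_TDP_W = 225  # the originally quoted official figure
TDP_BUMP_W = GTX480_TDP_W - OFFICIAL_TDP_W   # the last-minute 50 W increase

CARDS_PER_2U_SERVER = 4          # four Tesla cards per 2U box, per the article
SERVERS_PER_42U_RACK = 42 // 2   # 21 two-U servers fill a 42U rack
HAIR_DRYER_W = 1500              # assumed typical hair dryer on full blast

extra_per_server_w = TDP_BUMP_W * CARDS_PER_2U_SERVER         # 200 W
extra_per_rack_w = extra_per_server_w * SERVERS_PER_42U_RACK  # 4200 W

print(f"TDP bump over the official figure: {TDP_BUMP_W / OFFICIAL_TDP_W:.0%}")
print(f"480 vs 5870: {GTX480_TDP_W - HD5870_TDP_W} W more "
      f"({GTX480_TDP_W / HD5870_TDP_W - 1:.0%} higher)")
print(f"Extra heat per 2U Tesla box: {extra_per_server_w} W")
print(f"Extra heat per rack: {extra_per_rack_w / 1000:.1f} kW "
      f"(~{extra_per_rack_w / HAIR_DRYER_W:.0f} hair dryers flat out)")
```

Run it and you get 22% over the official TDP, 85W (45%) more than a 5870, 200W extra per server and 4.2kW (roughly 3 hair dryers) extra per rack, which matches the article's figures.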
 
the only area where the 480GTX can outperform it is tessellation
Well, if some crazy game developer suddenly decides to make a very good PC game (not the multi-platform console-port crap) that employs very heavy use of tessellation, then the Nvidia cards would be much more desirable.

I think the mistake with the GTX470/480 is that it is a 'future-focused' card released too soon, as the games industry is not yet ready to make games with such high tessellation levels. And because it is 'future-focused', its performance in DX9/10 games may not be as good as the ATI 5000 series.

Just curious though... how many people still ACTUALLY play Crysis instead of using it for testing graphics cards?
 
Well, it's not quite the same as the HD2900 scenario: ATi will be going over to a new architecture on the next generation, whereas nVidia kept the 8800 generation and just respun it.
 
To be honest, until the cards are actually in proper reviewing websites' hands, take everything you read with a global sea level of salt, as most of it is pure drivel as usual.

This is coming from a happy 5870 user, before I get accused of being a fanboy, lol.
 
Well, if some crazy game developer suddenly decides to make a very good PC game (not the multi-platform console-port crap) that employs very heavy use of tessellation, then the Nvidia cards would be much more desirable.

Only the 480 has the tessellation advantage, so I don't think any game dev will make a game for the 50 GTX480s out there.


I think the mistake with the GTX470/480 is that it is a 'future-focused' card released too soon, as the games industry is not yet ready to make games with such high tessellation levels. And because it is 'future-focused', its performance in DX9/10 games may not be as good as the ATI 5000 series.

Just curious though... how many people still ACTUALLY play Crysis instead of using it for testing graphics cards?

Not sure how you work that one out, when Fermi seems to suck at current DX11 games considering its die size...

Sorry mate, but we're not living in a fairy tale... :p
 
To be honest, until the cards are actually in proper reviewing websites' hands, take everything you read with a global sea level of salt, as most of it is pure drivel as usual.

This is coming from a happy 5870 user, before I get accused of being a fanboy, lol.

I would agree with you if the leaks contradicted each other...
 
I would agree with you if the leaks contradicted each other...

You don't know where the leaks are from, though. For all we know it's just another fanboy making up crap as usual. It's the same with these "benchmark videos", all of which could easily be fabricated by a monkey using movie software.

Seriously though, wait for the PROPER reviews of it and then form an opinion, instead of lobbing rather uneducated and explosive comments at each other.

Not directly aimed at the person I quoted, but at the whole forum...
 
^^^
The Nvidia-released Heaven benchmarks kinda validate the ballpark Fermi performance from the leaked benchmarks and what Charlie says his 'sources' are reporting.

Obviously they are not going to be 100% accurate, as Nvidia were still deciding specs at that stage, but they give a good indication.

Even Fud says it's gonna be hot and only 'sometimes' faster than a 5870, and we all know Fud isn't going to post anything even mildly negative unless he believes it to be true.
 
Man... I feel kinda bad for Nvidia...

Still... If true, it would keep the prices of my GTXs inflated long enough to sell.
 
Of course, to be sure we need these to get into the hands of decent, honest reviewers, not the first lot that will be doing reviews as per Nvidia's strict instructions. But let's be honest, the same stuff is coming out, and not just from SemiAccurate either, so there clearly is a problem; the only real question is how big a problem it will be for the enthusiast end user, and being honest, I don't think power and heat are massive problems for most. These cards will sink or swim on their performance in the games we're all playing, not synthetic benchmarks or the few games Nvidia has authorised for the reviewers it likes.

Being honest, this whole Fermi thing is getting really boring now. For the love of god, Nvidia, just release the damn things and put us all out of our misery, because a few weeks here and there is not going to polish a turd, if that's what it is. So just release it and let people judge for themselves.
 