
Nvidia 480GTX's power usage lie...

In a race it doesn't matter if you win by an inch or a mile. Winning is winning!! (thanks vin diesel).

I couldn't care less what the power consumption turns out to be. If the card is the fastest in the world (single chip) then it's the fastest in the world, and Nvidia have won regardless of power consumption. At this end of the GPU market it's expected that cards will be expensive and that they will use a lot of juice. I don't see what the fuss is and why there are so many people moaning on about it like little boys in the school playground who have just had a football launched into their face...

If Nvidia need twice the power for a 10% boost then great! Games will run quicker if the card's quicker, and when you're spending that sort of money on a card you should be able to foot the leccy bill.



Andy
 
The 480 is confirmed to have 480 cores, is it not... and that chart looks like a really bad MS Paint job.

Confirmed... laughing out loud. No, it hasn't been confirmed; none of these things have been confirmed. We will find out for sure on the 26th; until then it's all just rumour and speculation.
 
Yeah, it's hilarious. Nvidia are a bunch of comedians...


nvidiak.jpg
 
Got to love the way you're all out there for OpenCL up until the point when it's shown in favour of the other team.

Anyway.

gtx480lol.jpg
 
It's odd that they say the 5870 doesn't support OpenCL, as my 4870 sure does! In fact I had to go and run the AMD NBody sample again just to check! :p
 
I can't see Nvidia going out of their way to openly advertise the fact that the card has a 300-watt power requirement; it will put people off.
I also hear Nvidia have been hammering away trying to reduce the power draw of the card as much as possible. It doesn't make sense to point out a weakness like this.
 
I can't see Nvidia going out of their way to openly advertise the fact that the card has a 300-watt power requirement; it will put people off.
I also hear Nvidia have been hammering away trying to reduce the power draw of the card as much as possible. It doesn't make sense to point out a weakness like this.


How so? When I build a system, my only concern is that the PSU I buy for it is good enough for the job, regardless of what the actual consumption figures are - it's just not a concern.

It would be a concern if I was building a mini PC like an HTPC system, but then I wouldn't be looking at £300+ GPUs for that either.

Of course they want it to use less power if possible; it's common sense - part of product refinement - but it's not the be-all and end-all, certainly for a high-end specialist item, and it's certainly not a concern for its target audience.

When was the last time you saw a potential Ferrari customer whining about petrol consumption, or the price of road tax? lol :D
 
I can't see Nvidia going out of their way to openly advertise the fact that the card has a 300-watt power requirement; it will put people off.
I also hear Nvidia have been hammering away trying to reduce the power draw of the card as much as possible. It doesn't make sense to point out a weakness like this.

It was probably being shown to OEMs and system integrators who would have raised a massive collective eyebrow if the TDP wasn't disclosed, as they need TDP information to design systems to accommodate the graphics cards they use.
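That integrator sizing sum is simple enough to sketch. The figures below are purely illustrative assumptions for the sake of the arithmetic (the 300 W GPU TDP is the number from the thread; the other component draws and the 30% headroom margin are made up, not from any real spec sheet):

```python
# Rough sketch of the worst-case power budget an OEM or system
# integrator might total up when choosing a PSU.
# All component figures are illustrative assumptions.

def recommended_psu_watts(component_tdps, headroom=0.3):
    """Sum worst-case component draw (TDP) and add a safety
    margin so the PSU isn't run at its limit."""
    total_load = sum(component_tdps.values())
    return total_load * (1 + headroom)

build = {
    "gpu": 300,             # the 300 W TDP discussed in the thread
    "cpu": 130,             # assumed high-end quad-core
    "motherboard_ram": 60,  # assumed
    "drives_fans": 40,      # assumed
}

print(recommended_psu_watts(build))  # 530 W load -> 689.0 W PSU
```

This is why the TDP has to be disclosed to integrators even if it never appears in consumer marketing: without the GPU line in that dictionary, the whole sum is meaningless.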
 
How so? When I build a system, my only concern is that the PSU I buy for it is good enough for the job, regardless of what the actual consumption figures are - it's just not a concern.

It would be a concern if I was building a mini PC like an HTPC system, but then I wouldn't be looking at £300+ GPUs for that either.

Of course they want it to use less power if possible; it's common sense - part of product refinement - but it's not the be-all and end-all, certainly for a high-end specialist item, and it's certainly not a concern for its target audience.

When was the last time you saw a potential Ferrari customer whining about petrol consumption, or the price of road tax? lol :D

Advertising rule number one: point out the positives of your product. It just seems strange considering the fact that Nvidia have a very cute marketing department.

Power consumption and TDP are very important to the overclocker; you have a better chance of big clocks with a cool card.


It was probably being shown to OEMs and system integrators who would have raised a massive collective eyebrow if the TDP wasn't disclosed, as they need TDP information to design systems to accommodate the graphics cards they use.

That would make more sense.
 
300 W? Pffft. I'm glad it's that low, tbh, since I might end up with three if the water blocks from EK are single-slot (this is very doubtful; I will only consider it if it gives noticeable gains over my current set-up @ 2560 res, and by noticeable I mean more than 10% for a £600 cost after I've sold my 295s).

GTX 295 = 289 W TDP. Top card = top card. I'll start to care about the leccy bill when I can't afford it, although by then I should have converted my house to renewables so I won't have to pay no one shizzle.

edit: correcting words etc.
 
Advertising rule number one, point out the positives of your products. It just seems strange considering the fact Nvidia have a very cute marketing department.

Sorry, but I fail to see how Nvidia are being 'cute' by letting their customers know the power requirement of the product: A) it's not that high, and B) it's informative, and therefore good advertising.

Would you rather they kept it a secret?
 
300 watts is simply awful for a 40nm chip. The card needs to offer something like double the performance of a 5870 to justify its power usage.

Yes, I expect the Nvidia marketing people to keep very quiet about it, hence why I'm sceptical about this image being official/coming direct from Nvidia.
 