GTX 480

They're on their way. Released today.

Bit-Tech have done a gaming review on them and the news isn't brilliant. Only marginal gains over the HD5870 for a lot more money and a lot more wattage for the most part. Ahead of the 5870 in some tests but behind in others - it even got its arse handed to it by the GTX295 in a couple of games.

LegitReviews managed to get samples of the new GPU3 core to test the card with (it won't run GPU2 work). It scored up to 70% higher than a GTX285 but uses 100W more power and runs a damn sight hotter. It also needs nearly three times as much CPU to keep it fed.

All in all, I think I'm better off with my GTX295 - more points for fewer watts and less CPU used. Hopefully there will be a load of 295s going cheap once these finally hit the shelves.

Apparently the new GPU3 cores will be out sometime in April and will work with the 4xx and 2xx cards but nothing else at the moment, according to the LegitReviews article.

If these reviews are accurate, Nvidia have dropped the ball somewhat.
 
Some people have already commented on the massive differences in bench figures between sites. There's some suggestion that some reviews may have used older drivers, since nVidia released a new set yesterday, around the time the NDA came off the cards. If reviewers had already finished, they may have been reluctant to redo all the benches with the new drivers in order to be first out with their review. One review which does use the new drivers is this one. The summation seems to be: good, but uses a lot of electricity.


M
 
Those gaming results do look much better than the previous reviews. I wonder if the new drivers will improve the Folding performance as much. I'm sure someone will test it soon.
 
Power hungry, hot, noisy, expensive and apparently at best on par with the 295 and 5870.

Why does anyone want these cards again? :confused:

I am with Stan on this, they don't seem particularly suited to folding if you are in any way concerned about your already astronomical leccy bill, and from what I have seen they aren't anything special at gaming.

Still, if it means some cheap GTX295s are around I might be tempted to pick one up :D
 
The thing about the PCPerspective review is that it doesn't specify the drivers used on either card. What drivers did they use on the AMD cards? 10.2? 10.3? 10.3a?



From Ryan there:

Thanks, I got a message on Twitter about this. We'll be updating the piece later. 197.17 for NVIDIA, 10.1 for AMD.

But more tests to be run:

Yah, we are going back and retesting all the 5000-series cards with 10.3 this week in prep for the 2GB card release, so we'll have the latest results for them and updated comparisons to the GTX 480/470.



M
 
It's a bit carp though, testing with 10.1 - why would you do that, other than to try and make Nvidia look as good as possible in a bad situation?

And sorry, but I can't see how this is justified - it uses more power than a GTX295???

[attached image: capturezz.jpg]
 
Fermi is absolute pants. Even if it does come out ahead in some benchmarks compared to ATI or even NVidia's GTX2xx range, the amount of power it needs to do that is insane.

All been said before though, really. Crap architecture and the only way to make it work is to remortgage your house and buy a 2000W PSU :p
 
It's a bit carp though, testing with 10.1 - why would you do that, other than to try and make Nvidia look as good as possible in a bad situation?



I would guess because they are an old set of benchmarks from a while ago, done whilst testing something else? Believe it or not, most review sites do NOT have a bucket full of video cards they can pull out at a moment's notice, unless they've bought them themselves. The review ones are loaners, and the reviewer may only have them for 24 or 48 hours. This means the review has to be done quickly in order to get out on the first tide: miss that and you may as well not bother.


M
 
Sad fact is, most magazines and websites would rather be able to claim they were one of the first than worry about their credibility.
 
The amount of slagging off the GTX 480 gets is insane. It's nowhere near as bad as people make out; the only downsides are that it's expensive and uses a lot of power.

[hotlinked graph: 22218.png]
 
Meaningless graph imho, and best not to hotlink - the Dons get cross.

Best quote I have seen in a review is:

As far as expected points per day go, I will quote Nvidia: "a GTX280 does about 7700 points per day (PPD), so the PPD for the GTX480 would be based on the speed-up. So, if the GTX 480 is doing 70% better, then it would get 1.7 x 7700 = 13,090 PPD."

Which means it's more economical by far to run either the SMP Windows client on an i7 (around 15K PPD) or, even better, the -bigadv client for around 50K PPD.

Even a GTX295 gets 14K PPD without being overclocked, and that uses less power than a GTX480 :)

At the end of the day, let's wait a few weeks after this paper launch till real folders get them in their hot little hands, then we will know for sure :)
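
To put some rough numbers on the economics argument, here is a minimal sketch comparing PPD per watt for the options mentioned above. The PPD figures are the ones quoted in this thread; the wattages are purely my own ballpark assumptions for illustration, so swap in your own measured figures.

```python
# Rough PPD-per-watt comparison using the figures quoted above.
# PPD numbers come from the posts in this thread; the wattages are
# assumed ballpark loads for illustration only, not measured values.

options = {
    # name: (points per day, assumed load draw in watts)
    "GTX 480 (1.7 x 7700)": (1.7 * 7700, 300),  # assumed ~300 W while folding
    "GTX 295 (stock)":      (14_000,     290),  # assumed ~290 W while folding
    "i7 SMP client":        (15_000,     130),  # assumed ~130 W CPU-only
    "i7 -bigadv client":    (50_000,     130),  # assumed ~130 W CPU-only
}

for name, (ppd, watts) in options.items():
    print(f"{name:22s} {ppd:8,.0f} PPD at ~{watts} W -> {ppd / watts:5.1f} PPD/W")
```

Even with generous assumptions for the GTX480, the -bigadv client comes out miles ahead on points per watt, which is the gist of the post above.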
 
I think I still have F@H set up on my Win7 installation, so when they arrive I might let them waste (ahem... I meant it in a good way :D) a day or so to see what they are capable of.

:)
 
Meaningless graph imho, and best not to hotlink - the Dons get cross.

Best quote I have seen in a review is:

As far as expected points per day go, I will quote Nvidia: "a GTX280 does about 7700 points per day (PPD), so the PPD for the GTX480 would be based on the speed-up. So, if the GTX 480 is doing 70% better, then it would get 1.7 x 7700 = 13,090 PPD."

Which means it's more economical by far to run either the SMP Windows client on an i7 (around 15K PPD) or, even better, the -bigadv client for around 50K PPD.

Even a GTX295 gets 14K PPD without being overclocked, and that uses less power than a GTX480 :)

At the end of the day, let's wait a few weeks after this paper launch till real folders get them in their hot little hands, then we will know for sure :)

Again, I find myself in agreement with Biffa. As far as Folding is concerned, the GTX295 appears to give better PPD for roughly the same power consumption, a lot less heat (using the stock cooler) and less money. Also, it looks like the 480 uses more of the CPU than the 295, which would make a difference if running SMP alongside it. It will be interesting to see how it actually fares outside of the test environment.

On the subject of drivers used in testing, I revisited the Bit-Tech review and their tests were done using 10.3 for the 5870 and 197.17 for the 480, so both cards were on the latest drivers. Interestingly, the tests which included results from the GTX295 were done using the 196.21 drivers (presumably old tests). I would be interested to see a comparison between the 480 and the 295 using the latest drivers, as 197.13 supposedly gives a significant performance increase in some games. The 295 already beats it in some games using older drivers; I wonder how it would fare on level terms?
 
On the subject of drivers used in testing, I revisited the Bit-Tech review and their tests were done using 10.3 for the 5870 and 197.17 for the 480, so both cards were on the latest drivers. Interestingly, the tests which included results from the GTX295 were done using the 196.21 drivers (presumably old tests). I would be interested to see a comparison between the 480 and the 295 using the latest drivers, as 197.13 supposedly gives a significant performance increase in some games. The 295 already beats it in some games using older drivers; I wonder how it would fare on level terms?

Yes, the Bit-Tech ones were particularly thorough, I thought. Also the [H]ardOCP ones. I was pretty disappointed by Anand and Toms this time round, but thinking back a bit, they both seem to have declined somewhat imho.
 
Oh dear, the new Afterburner 1.6.0 Beta is out. It shows that on the 4xx series the core clock is locked to half the shader clock and you can't drop the memory speed below its default 3D clock. :(

We're happy to announce that we are starting the Afterburner 1.6.0 Beta process. In this version we have early support for the NVIDIA GeForce GTX 470/480 along with many other exciting features.

A few important notes on initial GTX 400 series support.

* It is no longer possible to adjust core and shader clocks independently on the GTX 400 series. The shader clock is the primary and only controllable GPU clock for this NVIDIA GPU architecture; the rest of the core is clocked at half the shader clock (the 1/2 ratio is hardwired). So only the shader clock slider is unlocked on GF100 GPUs; the core clock slider is no longer available on these cards. That is by design and must be so.
* It is no longer possible to downclock the memory for 3D mode, so the minimum memory clock matches the default 3D clock. This is also by design and must be so.
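
For what it's worth, here is a minimal sketch of what those two rules amount to in practice. The default 3D memory clock below is just a placeholder, not a real spec for any card, and this only illustrates the constraint, not Afterburner's actual code.

```python
# Illustration of the GTX 400 series clock rules from the Afterburner notes:
# core clock is hardwired to half the shader clock, and the memory clock
# cannot be set below its default 3D value. Placeholder numbers only.

DEFAULT_MEM_3D_MHZ = 900  # assumed placeholder for a card's stock 3D memory clock

def effective_gtx400_clocks(shader_mhz: int, requested_mem_mhz: int) -> dict:
    """Return the clocks a GF100 card would actually end up running at."""
    core_mhz = shader_mhz // 2                            # fixed 1/2 ratio
    mem_mhz = max(requested_mem_mhz, DEFAULT_MEM_3D_MHZ)  # no 3D memory downclock
    return {"shader": shader_mhz, "core": core_mhz, "memory": mem_mhz}

print(effective_gtx400_clocks(shader_mhz=1400, requested_mem_mhz=800))
# -> {'shader': 1400, 'core': 700, 'memory': 900}
```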
 