
New Dual Nvidia 5X0 card coming THIS year! GTX580 15-20% faster than GTX480! GTX560 possible release

Raven is a little like Charlie Demerjian. He posts some good info and benchmarks, but comes across as anti-AMD when he has an NV card, and vice versa when he has an AMD card. You just need to read his posts, take the info, and ignore the anti-AMD feel of some of them.
 

What he said. He sometimes reminds me of Justin Bieber: you either love him or hate him with all your might. But let's give it a rest now.
 
11%, and he benches DiRT 2 in DX9, which, apart from being plain stupid, gives AMD cards better performance figures than DX11, where Nvidia dominates.


Sorry to pick on your post, but the numbers are COMPLETELY wrong; the 5970 is WAY faster than a 480GTX (and 580GTX).

Want to know why? The BS reviews I've been having a go at for so long; here's an example.

http://www.techpowerup.com/reviews/HIS/Radeon_HD_6870/11.html

Here you have your comparison of speeds... wait, what's that? I see no 5970 in this particular benchmark. Oh well, they've left it out; it probably doesn't make a difference, right?

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/8.html

Here's the same site, with the same cards, and the 5970 included. The Fermi is at about the same speed, so how much faster is that 5970?

Oh yes, it's OVER 45% faster. But more recently they just leave out the best results.

You will also notice several 5870 scores with two results, a slower and a faster one; judging by later reviews, he tends to take the lower 5870 result.

Basically, you can in no way have fair benchmarks when you leave out all the games that aren't CPU limited and that also show the 5970 WAY ahead, but they often do.

There's ZERO consistency in benchmarking, certainly between sites, but even within the same site: using the latest, or oldest, or often seemingly the worst AMD driver possible, and just randomly excluding results whenever it seems to suit them.

Basically, add in the results they missed:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/32.html

38% ahead on average at the highest res. Slightly different picture, no?

http://www.techpowerup.com/reviews/HIS/Radeon_HD_6870/29.html

26% ahead at the top res here; that's a pretty huge swing when you remove some of the best results.

Likewise, that even includes a bunch of CPU-limited results, which only dilutes the scores further (a faster card is held back more by a CPU limitation than a slower card is).
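To put numbers on that, here's a quick sketch of how dropping the best games shifts the average lead. The per-game ratios below are invented for illustration, not the actual TechPowerUp data:

```python
# Hypothetical per-game performance ratios of a 5970 vs a 480GTX
# (1.0 = equal speed). Made-up figures for illustration only.
all_games = {
    "Prey": 1.70,              # huge 5970 lead, later dropped from the suite
    "Quake Wars": 1.55,        # also dropped
    "Game A": 1.40,
    "Game B": 1.25,
    "CPU-limited game": 1.02,  # near-tie because both cards hit a CPU wall
}

def avg_lead(games):
    """Average percent lead across the suite (simple arithmetic mean)."""
    ratios = list(games.values())
    return (sum(ratios) / len(ratios) - 1) * 100

full = avg_lead(all_games)
trimmed = avg_lead({name: r for name, r in all_games.items()
                    if name not in ("Prey", "Quake Wars")})
print(f"full suite: +{full:.0f}%")             # +38%
print(f"best games removed: +{trimmed:.0f}%")  # +22%
```

With these invented numbers, the average lead falls from 38% to 22% just by pruning two titles. The exact figures don't matter; the direction of the swing does.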

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/22.html

Those kinds of results, at low-end resolutions, are used in the "overall performance summary", so anything but 1920x1200/2560x1600 is literally worthless; even then there are some heavily CPU-limited results. Notice how 3DMark 03 is actually WAY less CPU limited than 05, which I didn't even know.

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/17.html

There's the 5970 OVER 70% faster than a 480GTX. Notice how Prey is no longer used (because, you know, it's one of the games where it utterly destroys Nvidia). Quake Wars, over 50% faster: that's gone as well. Riddick, 50% faster again, though that's one of the few they've left in AND kept the 5970 results for.

http://www.techpowerup.com/reviews/HIS/Radeon_HD_6870/20.html

Supreme Commander is heavily CPU limited. Remember, a faster card can still be faster but simply not show its full benefit; the closeness at all resolutions should make it obvious it's CPU limited, yet it's left in and drags down the fastest cards' results the most.
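A rough way to see why a CPU-limited game compresses the gap (the fps numbers here are assumptions, not benchmarks): the frame rate you measure is the lower of what the GPU could render and what the CPU can feed it.

```python
def measured_fps(gpu_fps, cpu_cap):
    # The observed frame rate is capped by whichever is slower.
    return min(gpu_fps, cpu_cap)

gpu_5970, gpu_480 = 140.0, 90.0  # hypothetical uncapped GPU throughput
cpu_cap = 95.0                   # the CPU can only prepare ~95 frames/sec

uncapped_lead = gpu_5970 / gpu_480 - 1              # ~0.56, a 56% lead
capped_lead = (measured_fps(gpu_5970, cpu_cap) /
               measured_fps(gpu_480, cpu_cap)) - 1  # ~0.06, a 6% lead
print(f"uncapped: +{uncapped_lead:.0%}, CPU limited: +{capped_lead:.0%}")
```

Both cards get squeezed under the same ceiling, so the faster one loses far more of its measured lead, which is exactly how a CPU-limited title drags down the average for the fastest cards.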

Add that all up and realistically the 5970 is averaging more than the 38% faster shown in the initial review.

EDIT:- Also note how little the 480GTX has "sped up" between launch and now. Some people on here said they'd had the hardware for six months, with a very basic/similar architecture to the previous gen, and that drivers were almost as good as they were going to get, while others insisted there would be huge gains because it's "new". Wonder who was right on that one.

Also worth noting that Fud insisted, almost daily from September last year when Fermi "launched" until around June this year, I think, that there would be a dual GF100-based card...
 
lol, Prey, very relevant to today's DX11 cards. Wizzard is focusing more on recent titles that push hardware; he's one of the best reviewers out there.
 

Yes, there is a SLIGHT argument for taking out the oldest games, but the very obvious reason those games were chosen is clear to all but the most biased (currently Nvidia) fanboy.

Let's add some things up: the single game that shows the single best 5970 result is removed; the second-best result for the 5970 is removed; neither is CPU limited, and both show the 480GTX ahead of many other cards.

One game is LEFT IN but ONLY the 5970 result is removed; it's over 45% ahead there, and this drags down its scores.

Seriously, he leaves out a 5970 result in a game he LEFT IN. That's where your argument, and his motives, fall apart. I believe it's also the single best result left in the line-up, which he removes for a single card, and it makes the 480GTX look much better than it is.

He also leaves in 3DMark 03, 05 and 06; none are DX11, none are DX10. But your entire argument as to why he left out old games is that they aren't relevant? Even if they were DX11, 3DMark isn't remotely relevant to anything whatsoever, and Prey came out after 3DMark 06.

He's also since added a game like Supreme Commander 2, which is completely CPU limited and, again, hurts the 5970 MUCH more than it hurts the 480GTX result.

Sorry, but removing old games would be a valid excuse if he wasn't also leaving out 5970 results in games he's still using. TechPowerUp are being INCREDIBLY dodgy.

EDIT:- Hmm, left in UT3, took out Prey, took out DX10 Company of Heroes where the 5970 had a massive lead, added a completely CPU-limited Supreme Commander 2, took out the 5970 result in one of the best games he left in, and left in old DX9 benchmarks that are seven years old.

Also do try to remember: these benchmarks (which all show AMD in the best possible light, while, I might add, MANY people were talking up AMD lacking in DX9 performance...) were fine for the release of Fermi, the fastest single-GPU DX11 core and the most recent DX11 architecture. He has since removed all the ones where AMD did its best and massively altered the apparent speed difference between the cards.

Seriously, a benchmark that shows the difference in GPU speed is FAR more relevant than Sup Com 2, which shows no difference across the board in most of his tests. You might also remember he forgot to use Cat 10.3 for the review when Fermi came out, instead using Cat 9.12, which, by the way, gave the 5870 a marked performance INCREASE:

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/10.html

and he was lambasted the day after he did the review. I just remembered why he added the "new" results.

He's really not one of the best reviewers out there. He writes some nice software; that's where it ends.

Also, he removed the "irrelevant" games AMD had a 50%+ lead in, excludes a 70%-faster result, and adds in the technologically advanced... World of Warcraft, as well as some CPU-limited games.

EDIT 2:- Sorry, I didn't notice he LEFT out the 5970 results, again, for World of Warcraft, which, based on the 5870 getting so close to the 480GTX, would destroy the 480GTX. Oops. Clear fairness: removed AMD's best results, added new games, left out the best results of the new games he added.
 
Going through a few recent Nvidia threads, a lot of people go on about Nvidia cards using a lot more power; I've seen the quote "it's like having a XX watt lightbulb in the system" quite a few times now.

Serious question here, as I never really gave this aspect of a graphics card any consideration whatsoever: is it a lot more expensive to run, say, a 480 over a 5870 in terms of electricity costs over a year?

I've seen it come up an awful lot, how bad Nvidia are compared to ATI in this regard and how much extra power they need, and I was wondering what the implications are.
 

I'm no expert, but in money terms, for 12 months running a 480 vs a 5870, I don't think there is a lot of difference; maybe £20-30. Somebody will no doubt correct me if I'm wrong :)

As for the heat, well, that's an easy answer: the 480 will heat up your case and components a lot more than a 5870. That is a fact.
 

£20 a year. It's very awkward to say, because both sides' idle power usage goes up when you use two screens (not sure if that's true of the 6870), and it depends how much you game and pull the 300W of your 480GTX versus how often it sits at 40W (or whatever its idle draw is). It's very difficult to calculate.

It's not the deciding factor; that's performance, price, size, noise, availability, whose drivers you like, and then how much power it uses.

If you can't decide based on everything else, it might make up your mind for you, but it's not a huge deal. Reviews make a big deal of it because people like Dell, who will sell 99% of the cards rather than the 1% to us lot, DO care about the power. It is a big deal in the grand scheme of the design/success of the 480GTX altogether; to the individual user, it's meh.
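For anyone who wants a rough figure, here's the back-of-the-envelope calculation. Every input is an assumption: the wattage deltas, the daily hours, and the tariff:

```python
# Assumed extra power draw of a 480GTX over a 5870 (watts)
extra_load_w = 130    # under gaming load, roughly
extra_idle_w = 20     # at idle, roughly

gaming_hours_per_day = 2
idle_hours_per_day = 6
price_per_kwh = 0.12  # £/kWh, assumed UK-ish tariff

extra_kwh_per_year = ((extra_load_w * gaming_hours_per_day +
                       extra_idle_w * idle_hours_per_day) / 1000) * 365
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"~£{extra_cost:.0f} a year extra")  # roughly £17 with these inputs
```

Double the gaming hours and it's still only around £28 a year, which lines up with the £20-30 guess above.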
 
Rumour has it that reviewers benching these upcoming GPUs put the 580 at 6950 performance!

Looks like the 580 needs to be a good clocker...

I wouldn't be surprised, though the question becomes: how cut down is the 6950, and at what price? If AMD are going to put it at £220-250 and Nvidia want to sell their 580GTX at £400...

I also said this before: an overclocked 5870 isn't going to come within 30% of a 6970; there's no comparison. If the 580GTX and 480GTX both hit a max overclock of, say, 900MHz, then the clock speed advantage goes away and you're left with about 6.7% extra shaders, which will probably mean 4-7% faster depending on the game. If it doesn't overclock further than a 480GTX it will be a joke, but it "should" overclock further. It's a similar situation to a 5850/5870: with both at max overclocks, which are basically the same (with the same volts and cooler), it comes down to the shaders alone, not the different default clocks, and the gap goes from 20% slower to 5-8% in general.
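The shader arithmetic in that paragraph, assuming the rumoured 512 shaders for the 580GTX against the 480GTX's 480 (and equal max clocks):

```python
shaders_480, shaders_580 = 480, 512  # known / rumoured shader counts

extra_shaders = shaders_580 / shaders_480 - 1
print(f"{extra_shaders:.1%} more shaders")  # 6.7% more shaders

# Games rarely scale linearly with shader count, so the realistic gain at
# matched clocks is some fraction of that, hence the 4-7% guess above.
```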
 
If anyone honestly thinks they will notice the extra juice a 480 uses (cost-wise), I would suggest looking at your own home first, as many things eat the kWs well before your humble little gfx card. If it's your PSU or the heat you're worried about, then fair enough, but if it's the electricity bill then don't worry.
 