[LegionHW] CPU scaling with the Radeon 5970

Originally Posted by Conclusion
The Phenom II X4 results were quite different to those recorded when testing with the Core i7 processors, though this was not necessarily a bad thing. When operating at lower clock speeds, the Phenom II X4 did not fare all that well, as we saw a sharp decline in performance. However, when clocked at 3.0GHz and beyond, the Phenom II X4 really picked up the pace, and in many cases was able to outclass the Core i7.

In games such as Wolfenstein, Call of Duty: Modern Warfare 2, Tom Clancy's H.A.W.X, BattleForge and Far Cry 2 the Phenom II X4 processors were actually faster when clocked up near 4GHz! This is quite amazing, as out of the 9 games tested, the Phenom II X4 series was faster than the Core i7s in 5 of them. Although the margins were very limited, the Phenom II X4 was found to be faster; had it merely managed to match the Core i7 series with the Radeon HD 5970, we would still have been impressed.

While the Phenom II X4 matched the Core i7 in Crysis Warhead, the only games where it fell behind were Company of Heroes: Opposing Fronts, Left 4 Dead 2 and Batman: Arkham Asylum. The Phenom II X4 was noticeably slower in these games, making the Core i7 the better choice here. Still, for the most part we found the Phenom II X4 to be every bit as good as the Core i7 processors when gaming with the new Radeon HD 5970.

Having said that, we recommend that AMD users looking at buying this powerful graphics card make sure that they have a Phenom II X4 processor that is clocked at 3.0GHz or greater. Most Phenom II X4 processors are capable of overclocking to 3.0GHz and beyond, while the more high-end options, such as the Phenom II X4 955 and 965 processors, come clocked at 3.2GHz and 3.4GHz respectively.

While we hardly expect there will be many users trying to pair a $600 US graphics card, such as the Radeon HD 5970, with a budget processor, it is nice to see that the sub-$200 US processors are up to the task. The Intel Core i7 920 proved to be more than powerful enough at $280 US, while the AMD Phenom II X4 955 will work just as well at $165 US, giving users plenty of great options.

Reviewed By Steven Walton
http://www.legionhardware.com/document.php?id=869&p=0
 
Phenom II doesn't actually beat the i7 in any of those tests...

The times it "does" are all cases where its obviously GPU limited rather than CPU limited and the fps difference comes down to margin of error - within 1-2fps which can be due to slight variations in the benchmark. In tests that don't show a GPU limit the i7 is the clear winner - with the 2gig result often exceeding or matching upto 2.6-2.7gig on the PII.

Extremely flawed conclusion.

The only real conclusion you can get out of this is that an overclocked PII is quite capable of pushing a single 5970 in most cases - you won't quite get the max out of it, as a few tests show, but whether that's worth the extra money is debatable.

EDIT: To give a typical example, the BattleForge results (fps):

2.0GHz i7: 46

2.6GHz PII: 44
3.0GHz PII: 45
3.6GHz PII: 47
3.8GHz PII: 47
4.0GHz PII: 47

4.0GHz i7: 46

You can't really give the win to any CPU with those results - it's clearly not CPU limited... if anything the i7 wins anyway, because even at 2GHz it's within margin of error of the top-end PII overclock.
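To put the margin-of-error point in concrete terms, here's a minimal sketch of that reasoning applied to the BattleForge numbers quoted above. The 1.5fps tolerance is an assumed run-to-run variation for illustration, not anything measured in the review:

```python
# If average fps barely moves as the CPU clock scales, the game is GPU
# limited and small gaps are just benchmark noise. Figures are the
# BattleForge results quoted in this post; MARGIN_FPS is an assumption.

battleforge = {
    "i7 @ 2.0GHz": 46,
    "PII @ 2.6GHz": 44,
    "PII @ 3.0GHz": 45,
    "PII @ 3.6GHz": 47,
    "PII @ 3.8GHz": 47,
    "PII @ 4.0GHz": 47,
    "i7 @ 4.0GHz": 46,
}

MARGIN_FPS = 1.5  # assumed run-to-run variation per result

spread = max(battleforge.values()) - min(battleforge.values())
if spread <= 2 * MARGIN_FPS:
    print(f"Spread of {spread}fps across all clocks: GPU limited, no winner.")
else:
    print(f"Spread of {spread}fps: real CPU scaling is visible.")
```

With only a 3fps spread across a 2GHz range of CPU clocks, no CPU can sensibly be declared the winner.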
 
Not to mention they are all 2560x results - meaning they're more likely to hit GPU limits before CPU limits. If you're going to compare CPUs you have to at least do one lot of benchmarks at a lower res.
 
Not to mention they are all 2560x results - meaning they're more likely to hit GPU limits before CPU limits. If you're going to compare CPUs you have to at least do one lot of benchmarks at a lower res.
Exactly. Just because graphics cards are brought to their knees at 2560x1600 doesn't mean that the same is true at more common resolutions like 1680x1050 or 1920x1200, where it's possible that the CPU is the bottleneck. These results are interesting but hardly definitive.
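To put rough numbers on the resolution argument, here's a quick sketch of the per-frame pixel arithmetic; the resolutions are the ones named in this thread:

```python
# Per-frame pixel load scales with the pixel count, so dropping the
# resolution shifts work off the GPU and exposes the CPU instead.

resolutions = [(2560, 1600), (1920, 1200), (1680, 1050)]
base = 2560 * 1600

for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels / 1e6:.2f}MP ({pixels / base:.0%} of 2560x1600)")
```

2560x1600 pushes roughly 78% more pixels per frame than 1920x1200 and about 2.3x as many as 1680x1050, which is why a card that is flat out at 2560 can leave the CPU as the bottleneck at the lower, more common resolutions.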
 
All this proves is that those who throw the phrase "CPU bottleneck" around are fools.

Not really; some 1680/1920 results might've shown us something interesting.

Choosing 2560x1600 as the sole resolution in a CPU test destroys the entire point of the endeavour. They also used anti-aliasing in a lot of the tests, which on its own would be deeply lolsome, but placed on top of that res it borders on depressing.

It's as if they wanted to guarantee a performance flatline.

No minimum frame results either. Ho hum, hopefully other sites will do something similar but in a much less stupid way.
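On the minimum-frame point, averages can hide stutter entirely, which is why their absence matters. A small illustrative sketch, using made-up frame times rather than anything from the review:

```python
# Two runs can share the same average fps while one stutters badly;
# only a minimum-fps (worst frame time) figure tells them apart.

def fps_stats(frame_times_ms):
    """Average and worst-case fps from per-frame render times."""
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    worst_ms = max(frame_times_ms)
    return 1000.0 / avg_ms, 1000.0 / worst_ms

smooth = [20.0] * 10             # a steady 50fps
spiky = [15.0] * 9 + [65.0]      # same average, one long stall

for name, run in [("smooth", smooth), ("spiky", spiky)]:
    avg_fps, min_fps = fps_stats(run)
    print(f"{name}: avg {avg_fps:.0f}fps, min {min_fps:.0f}fps")
```

Both runs average 50fps, but the spiky one dips to about 15fps on its worst frame, which is exactly what an average-only benchmark hides.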

Also, hi unbiased forum contributor Final8y, again. Again, and endlessly again. Whenever something pops up showing AMD in an okay light, no matter how tenuous the claim, no matter how flawed the method, I know that you will be there to tell us of it.

Because you are unbiased forum contributor Final8y, and that is what unbiased forum contributor Final8y does *solemnly salutes*.
 
Also, hi unbiased forum contributor Final8y, again. Again, and endlessly again. Whenever something pops up showing AMD in an okay light, no matter how tenuous the claim, no matter how flawed the method, I know that you will be there to tell us of it.

Because you are unbiased forum contributor Final8y, and that is what unbiased forum contributor Final8y does *solemnly salutes*.

Come on man, let it go. Rroff is no stranger to selective "hearing" - are you going to deride him for it too?

No offence Rroff, no malice intended.
 
No offence taken, but I'd love someone to give examples of it...

I don't pretend to be unbias - but I don't go out of my way to selectively show nvidia in a good light or ATI in a bad...

I get pretty sick of the whole nVidia fanboy thing - if anyone took 2 seconds to actually look at the other posts I make, rather than hating because they know in their hearts I'm correct, they'd see I quite often have positive things to say about ATI and quite often have negative things to say about nVidia.
 
No offence taken, but I'd love someone to give examples of it...

I don't pretend to be unbias - but I don't go out of my way to selectively show nvidia in a good light or ATI in a bad...

I get pretty sick of the whole nVidia fanboy thing - if anyone took 2 seconds to actually look at the other posts I make, rather than hating because they know in their hearts I'm correct, they'd see I quite often have positive things to say about ATI and quite often have negative things to say about nVidia.

Unbiased, perhaps? I see people around here saying they're not "bias" - it doesn't make any sense.

Also, come on Rroff, you can hardly ask for evidence - you never give any yourself, and then come up with weak excuses for why you can't or don't need to give evidence.

We all know you are biased. I wouldn't call you an out-and-out fanboy like the resident nVidia trolls, but you are, even by your own admission, biased.
 
Exactly. Just because graphics cards are brought to their knees at 2560x1600 doesn't mean that the same is true at more common resolutions like 1680x1050 or 1920x1200, where it's possible that the CPU is the bottleneck. These results are interesting but hardly definitive.

This
 
I didn't ask for evidence as such; I just don't remember doing it myself, and I was open to someone showing me otherwise...
 
Not really; some 1680/1920 results might've shown us something interesting.

Choosing 2560x1600 as the sole resolution in a CPU test destroys the entire point of the endeavour. They also used anti-aliasing in a lot of the tests, which on its own would be deeply lolsome, but placed on top of that res it borders on depressing.

It's as if they wanted to guarantee a performance flatline.

No minimum frame results either. Ho hum, hopefully other sites will do something similar but in a much less stupid way.

Also, hi unbiased forum contributor Final8y, again. Again, and endlessly again. Whenever something pops up showing AMD in an okay light, no matter how tenuous the claim, no matter how flawed the method, I know that you will be there to tell us of it.

Because you are unbiased forum contributor Final8y, and that is what unbiased forum contributor Final8y does *solemnly salutes*.

I would have thought a lot of 5970 users would be running such a card at 2560x1600 anyway, but I do agree there should have been some 1920 tests as well - any lower wouldn't show a real-world scenario imo.
 
Pointless test imo; if you want to do this kind of CPU test then you need to use low resolutions.

I would be interested to see this test done again by someone with some common sense, and would like to see CPUs like the Q6600 included so I can see how it holds up against the i7.
 
Choosing 2560 as the only res was deliberate.

Just because someone has top-end VCs doesn't automatically mean they have 30-inch Dells running at 2560.
Some won't want that high a res.

Some game engines are heavy at 1920, and the extra grunt of the latest GPUs could be used to maintain a higher average framerate.
Maybe even attain the goal of being able to sync on refresh without FPS drops.
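On the sync-on-refresh goal, here's a minimal sketch of the frame-time budget involved; the refresh rates are assumed examples (60Hz being typical for the panels discussed in this thread):

```python
# With traditional double-buffered vsync, a frame that misses the refresh
# interval waits for the next one, halving the effective frame rate.

for hz in (60, 75):
    budget_ms = 1000.0 / hz
    print(f"{hz}Hz: every frame must render within {budget_ms:.2f}ms; "
          f"a miss drops output to {hz / 2:g}fps")
```

So "sync on refresh without FPS drops" at 60Hz means every single frame finishing inside 16.67ms, which is a much stricter requirement than a 60fps average.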

Personally I find a desktop at 2560 very hard on the eyes and much prefer 1920 on a 27-inch.
 
Please quit bitching about the 'bias' of forum users and discuss the results; rather than picking each other apart, pick apart and discuss the arguments presented. Jumping on the "He's biased!!!111" bandwagon gives you less credibility in the long term.

Thanks for the post final8y :) And good points put forward about the GPU limitations. Shows that you can't take things at face value.
 
I would have thought a lot of 5970 users would be running such a card at 2560x1600 anyway
Not necessarily. You've got to remember that even top-end cards often struggle at that high a resolution, especially with more recent titles, and that there aren't any panels that large with decent response times. There are quite a few 24" S-IPS monitors with decent response times, while 24" TN panels offer even better response times (while sacrificing viewing angles). Currently 1920x1200 is for the high-end gamer and 1680x1050 for the average gamer, though I'm sure that will change going forward. It's good to see titles performing well at higher resolutions, and cards that are better able to handle them.
 
Surely you wouldn't need a 5970 unless you were running at 2560x1600 (or 3 monitors in Eyefinity) anyway, so it wouldn't be relevant to do this test at a lower res.
 
I bet that the majority of people who have a 5970 are running 1920x1200 because they are silly.

I have seen idiots running tri-SLI GTX 285s at 720p... see, they need all that power for their "HD" gaming.
 