
CPU Limitations @ 1920x1200 - Results

Thought I'd post my findings. People have been worrying about CPU speed being a bottleneck for the GPU's performance at high res, so here are my results - 3870 X2 AT STOCK CLOCKS!!!!

These tests were run at 1920x1200 with 4AA and 16AF with:

my Q6600 @ 2.4GHz to get - 10373 - http://www.ste0803.pwp.blueyonder.co.uk/06/3d0619202.44aa16af.jpg

SM02 - 4409
SM03 - 4231
CPU - 3389
OVERALL - 10373

and then @ 3.69GHz to get - 11361 - http://www.ste0803.pwp.blueyonder.co.uk/06/3d0619203.694aa16af.jpg

SM02 - 4696 (+287)
SM03 - 4249 (+18)
CPU - 4999 (+1610)
OVERALL - 11361

As you can see, the 1.29GHz increase makes very little difference to the actual graphics tests - almost all of the gain comes from the CPU test.

These tests were run at 1920x1200 with 0AA and 0AF with:

my Q6600 @ 2.4GHz to get - 12712 - http://www.ste0803.pwp.blueyonder.co.uk/06/3d0619202.40aa0af.jpg

SM02 - 5063
SM03 - 6020
CPU - 3463
OVERALL - 12712

and then @ 3.69GHz to get - 16485 - http://www.ste0803.pwp.blueyonder.co.uk/06/3d0619203.690aa0af.jpg

SM02 - 6865 (+1802)
SM03 - 6967 (+947)
CPU - 5217 (+1754)
OVERALL - 16485

These tests were run at 1280x1024 with 0AA and 0AF with:

my Q6600 @ 2.4GHz to get - 12815 - http://www.ste0803.pwp.blueyonder.co.uk/06/3d0612802.44aa16af.jpg

SM02 - 4785
SM03 - 6639
CPU - 3241
OVERALL - 12815

and then @ 3.69GHz to get - 18317 - http://service.futuremark.com/compare?3dm06=5245054

SM02 - 7454 (+2669)
SM03 - 8457 (+1818)
CPU - 5061 (+1820)
OVERALL - 18317

As you can see, at the lower resolution the GPU tests gain far more when the CPU is overclocked.
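To put numbers on that, here's a quick sketch (my own script, not part of the original tests) using the 3DMark06 scores posted above. It compares each sub-test's gain against the ~53.8% clock increase, showing the CPU test scales almost linearly with clock while the graphics tests only scale once the GPU load is reduced:

```python
# Quantify how much of the 2.40 -> 3.69 GHz overclock shows up in each
# 3DMark06 sub-test. Scores copied from the results above.
results = {
    "1920x1200 4AA/16AF": {"SM2": (4409, 4696), "SM3": (4231, 4249), "CPU": (3389, 4999)},
    "1920x1200 0AA/0AF":  {"SM2": (5063, 6865), "SM3": (6020, 6967), "CPU": (3463, 5217)},
    "1280x1024 0AA/0AF":  {"SM2": (4785, 7454), "SM3": (6639, 8457), "CPU": (3241, 5061)},
}

clock_gain = (3.69 - 2.40) / 2.40  # 53.75% clock increase

for config, tests in results.items():
    print(config)
    for name, (stock, oc) in tests.items():
        score_gain = (oc - stock) / stock
        print(f"  {name}: {score_gain:+.1%} (vs clock {clock_gain:+.1%})")
```

At 1920x1200 with AA the SM3 score moves just +0.4% while the CPU test moves +47.5%; at 1280x1024 even the graphics tests gain 50%+.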

Extras:

COD4: http://www.ste0803.pwp.blueyonder.co.uk/06/cod4results.jpg
4AA and 16AF with MAX in-game settings were used to get these results.
I recorded 3 minutes of online gameplay and then re-ran it with FRAPS to get the results.

Resolution   Clock     Frames   Time (ms)   Min   Max   Avg
1280x1024    2.40GHz   23161    180000      70    252   128.672
1280x1024    3.69GHz   24038    180000      72    260   133.544
1920x1200    2.40GHz   16527    180000      48    166   91.817
1920x1200    3.69GHz   16640    180000      48    167   92.444
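The FRAPS averages can be sanity-checked from the raw frame counts: average FPS is just total frames divided by run length in seconds. A quick sketch using the numbers above:

```python
# Sanity-check the FRAPS averages: avg FPS = total frames / run length (s).
runs = [
    ("1280x1024 @ 2.40GHz", 23161, 180000),
    ("1280x1024 @ 3.69GHz", 24038, 180000),
    ("1920x1200 @ 2.40GHz", 16527, 180000),
    ("1920x1200 @ 3.69GHz", 16640, 180000),
]
for label, frames, time_ms in runs:
    print(f"{label}: {frames / (time_ms / 1000):.3f} fps")
```

All four computed averages match the FRAPS output to three decimal places.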

Crysis:

2.40GHz http://www.ste0803.pwp.blueyonder.co.uk/06/crysis2.4.jpg
1280x1024 Average FPS = 44.04
1920x1200 Average FPS = 34.385

3.69GHz http://www.ste0803.pwp.blueyonder.co.uk/06/crysis3.69.jpg
1280x1024 Average FPS = 52.55
1920x1200 Average FPS = 35.425
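Put as percentages (my own quick calculation from the Crysis averages above), the overclock is worth a decent chunk at 1280x1024 but almost nothing at 1920x1200:

```python
# Crysis benchmark: FPS gain from the 2.40 -> 3.69 GHz overclock.
for res, stock, oc in [("1280x1024", 44.04, 52.55), ("1920x1200", 34.385, 35.425)]:
    print(f"{res}: {oc / stock - 1:+.1%}")
```

That's roughly +19% at the lower res versus +3% at the higher res, from a 54% clock increase.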

Conclusion:
The extra CPU speed makes next to NO difference at higher res in REAL-WORLD tests. So all you people worrying about CPU limitations at high res: there's no need.

At lower res, people are still getting decent(ish) gains from more CPU clocks.

So... all of you with overclocked CPUs... why bother, if all you use your PC for is gaming?

StevenG
 
must you be so condescending


Thanks for the contribution StevenG - it would have been nice to also see the difference in FPS in some games, to show some real-world performance tho :)

Well, I only own COD4 so I can't really do many tests, which is why I didn't.

But if people want to send me their games I can run some tests LOL

StevenG
 
Why do you have a top end system if you don't game?

I said I had COD4 :) I'm in a clan atm so don't have time to play anything else.

I'll get that Crysis benchmark and give it a go. Is there anything in COD4 I can use to get some frame info or an average frame rate?

StevenG
 
I don't mean to sound mean... but this is very well-known stuff. It's been like this since GPUs existed, and so many sites have done tests like this over the years.

This stuff really should be common sense, even - but from reading these forums it isn't, to quite a few people. So I hope your results at least educate some of them.
 
Unfortunately none of these results really means a thing, as 3DMark06 puts far more emphasis on CPU performance than the majority of games do.

Swap a dual for a quad of equal clock speed and your 3DMark06 score will go up a significant amount, whereas virtually all your games will be totally unaffected.

3DMark06 is an outdated piece of crap whose relevance to actual games was debatable when it launched, let alone now.
 
Benchmarks show the majority of games being quicker with equal-clock duals (i.e. a 3GHz E8400 vs a Q6600), and in the case of the AMD 9600 it gets destroyed by all processors around the same price or cheaper. Although the C2D 8000 range is newer tech than the quads, so it probably has some advantage.
 
Common knowledge doesn't tell you how much of a difference specific step-ups make for specific configs :p
 
Well... we all have to start somewhere, and if they use the 'ocuk forums' to find out, then all the better for us to pass on the knowledge.

Common sense?? When someone starts out learning about computers they know diddly-squat, so I don't agree with that statement. We all have to learn somewhere, and we all at some point in the past knew little or nothing about computers - let alone about 'resolutions' and knowing what they are... geeesh.

:p
 
So you all think that if you're running a recent game at 1920x1200, the CPU doesn't really come into play, because your average frame rate over a given time is going to be similar no matter what CPU you have, as you're bottlenecked by the graphics card at that high resolution?

Total ********

Actually....

Let's take a graphics test run in 3DMark, for example: there are some parts that run at 90+ fps, and some parts that run at around 25-30fps... yeah?

The high-fps sections are much more common than the low-fps ones, so you still end up with a similar-ish overall score - even though, if you actually watched the run, you'd see the frame rate bottom out to unplayable levels. The dips are just too infrequent to drag the average down.

A prime example of this is the game FEAR.

I've seen review after review of graphics cards showing an average frame rate that is pretty much identical for a given CPU compared to a much faster CPU - basically saying the game is completely bottlenecked by the graphics card at any resolution over 1024x768 (assuming the latest 8800-series card is being used)...

Well, these reviews aren't taking into account the LOWEST FRAME RATE DURING THAT RUN...

...which is far more important than any AVERAGE over the whole run.

What would you rather have: 50fps constantly, or 90fps but with the odd drop to 20-25fps?

I know you all *know* this stuff....

But sometimes you seem to conveniently forget!
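The point about averages hiding dips is easy to illustrate with synthetic data (these are made-up traces for illustration, not measured numbers):

```python
# Why average FPS hides dips: two synthetic per-second FPS traces.
steady = [50] * 60              # constant 50 fps for a 60-second run
spiky  = [90] * 54 + [22] * 6   # mostly 90 fps, but six 1-second dips to 22

for name, trace in [("steady", steady), ("spiky", spiky)]:
    avg = sum(trace) / len(trace)
    print(f"{name}: avg {avg:.1f} fps, min {min(trace)} fps")
```

The spiky trace averages 83.2 fps versus 50.0 fps for the steady one, yet it's the spiky run that feels unplayable during its dips to 22 fps - exactly the information an average-only review throws away.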
 
You're absolutely right: bottlenecks need to be evened out, and CPU speed plays a very important part in keeping a system balanced. The annoying thing for me in Crysis is the low dips, not the average FPS, which at around 35fps is just fine for me.
 
Cheers StevenG :)


Would definitely be interested in how well COD4 fares at higher res, as I've got a "low"-spec CPU and am thinking of going up to a 24" from my present 19". Any benchies would be greatly appreciated :)
 
Well, I think people have taken my results (which I took my time over, to help people out) the wrong way.

I was simply pointing out that in 3DMark06 the CPU doesn't make as much difference at higher res as it does at lower res.

I will, however, run the Crysis benchmark at the stock and overclocked speeds if people want me to. Testing COD4 would be uneven as it doesn't have a built-in benchmark, but I can run it for a few minutes in the same place in the game and record my results (if people are interested).

At the end of the day I'm going to lose free time and gaming time doing this, and if people are just going to criticise my results then I may as well not bother.

What I post in this thread is supposed to help people who want higher-res monitors and are worrying about CPU limitations, so I'm testing at the highest res I can and at the *normal* res, at stock and at a 1.29GHz overclock, to give an example of how the CPU contributes to the game action you'll see.

Thanks for reading, to all those who found this useful.

StevenG
 