
Conroe above 3.6GHz is USELESS for games on current tech

Controversial topic time. :D

Disclaimer: This doesn't necessarily apply to 8800GTX...

Compare these 3DMark06 results:

Conroe @ 3.6GHz, X1900 Crossfire @ 680/800, Catalyst 6.8
Code:
3DMark Score	11335 3DMarks
SM 2.0 Score	4713 Marks
SM 3.0 Score	5145 Marks
CPU Score	3117 Marks

Detailed Test Results

Graphics Tests
1 - Return to Proxycon	36.314 FPS
2 - Firefly Forest	42.243 FPS

CPU Tests
CPU1 - Red Valley	0.982 FPS
CPU2 - Red Valley	1.583 FPS

HDR Tests
1 - Canyon Flight (SM 3.0)	49.688 FPS
2 - Deep Freeze (SM 3.0)	53.222 FPS
Conroe @ 4.3GHz, X1900 Crossfire @ 680/864, Catalyst 6.9
Code:
3DMark Score	11672 3DMarks
SM 2.0 Score	4684 Marks
SM 3.0 Score	5066 Marks
CPU Score	3766 Marks

Detailed Test Results

Graphics Tests
1 - Return to Proxycon	36.29 FPS
2 - Firefly Forest	41.769 FPS

CPU Tests
CPU1 - Red Valley	1.193 FPS
CPU2 - Red Valley	1.902 FPS

HDR Tests
1 - Canyon Flight (SM 3.0)	49.025 FPS
2 - Deep Freeze (SM 3.0)	52.297 FPS
Gained about 650 points on the CPU tests going from 3.6GHz to 4.3GHz, but lost FPS and points on the other tests - presumably because the 3.6GHz run was done on a totally fresh install of XP with nothing installed except updates, whereas the 4.3GHz run is on my everyday XP install with all sorts of things on it.
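
A quick sanity check of those deltas (a minimal Python sketch - the scores are copied straight from the two result dumps above):

Code:
# Work out what actually moved between the 3.6GHz and 4.3GHz runs.
runs = {
    "3.6GHz": {"total": 11335, "sm2": 4713, "sm3": 5145, "cpu": 3117},
    "4.3GHz": {"total": 11672, "sm2": 4684, "sm3": 5066, "cpu": 3766},
}
for key in ("total", "sm2", "sm3", "cpu"):
    before, after = runs["3.6GHz"][key], runs["4.3GHz"][key]
    print(f"{key}: {after - before:+d} points ({(after - before) / before:+.1%})")
# cpu comes out at +649 (+20.8%); sm2/sm3 actually drop slightly.

The CPU score tracks the clock bump, but everything else is flat or slightly down.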

Sobering reading though; it just goes to show you don't really need exotic cooling to get the best out of the current cards on the market.
 
Kesnel said:
Try running the cards at 680/800 in both tests. The X1900 memory gets really weird with scores in 3DMark when you're pushing it.

It just doesn't add up that the shader scores have gone down; I think this is the only explanation.
675/792 was the closest I could get without going over 680/800 (strange frequency multipliers, I guess).

Conroe @ 4.3GHz, X1900 Crossfire @ 675/792, Catalyst 6.9
Code:
3DMark Score	11301 3DMarks
SM 2.0 Score	4494 Marks
SM 3.0 Score	4873 Marks
CPU Score	3776 Marks

Detailed Test Results

Graphics Tests
1 - Return to Proxycon	34.641 FPS
2 - Firefly Forest	40.254 FPS

CPU Tests
CPU1 - Red Valley	1.197 FPS
CPU2 - Red Valley	1.906 FPS

HDR Tests
1 - Canyon Flight (SM 3.0)	47.149 FPS
2 - Deep Freeze (SM 3.0)	50.312 FPS
Pretty consistent results really.

I'm killing all non-essential processes, systray programs, etc. I'm also setting 3DMark06.exe to "High" priority.
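
If you'd rather script the priority bit than do it by hand in Task Manager each run, here's a minimal sketch (assuming Python 3.7+ on Windows - the install path is a guess, so point it at wherever your 3DMark06.exe actually lives):

Code:
import subprocess

# Hypothetical install path - adjust to suit your machine.
exe = r"C:\Program Files\Futuremark\3DMark06\3DMark06.exe"

# Launch with the High priority class up front instead of raising it
# afterwards in Task Manager (Windows-only constant, Python 3.7+).
subprocess.Popen([exe], creationflags=subprocess.HIGH_PRIORITY_CLASS)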

With the exception of the XP install, which is now a "proper" everyday install rather than a bare one, and the fact I'm now using a 3007WFP monitor instead of a CRT (could this drag the score down at all?), nothing has really changed.
 
Still doesn't really explain why my actual graphics tests are now lower than they were with the same cards, same memory (at the same timings), same everything really... unless my Windows install is bodged somehow (it hasn't been installed for very long though).
 
I'll admit the thread title is a bit contentious, but the results seem to be consistent.

Regardless of whether 3DMark is doing real-time physics, AI, and everything else regular games have to do - the bottom line is that whilst the CPU is running 700MHz faster, the score isn't changing (or rather, it's only changing with the graphics card clocks).

This is the very definition of being GPU limited.

By extension, if a Conroe at 3.6GHz is already giving the same results as one clocked to 4.3GHz, my point about Conroe above 3.6GHz being useless for current technology (7950GX2/X19x0) is true, isn't it? The 8800GTX may change these results slightly, but I can't see it making a dramatic difference.
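
To put rough numbers on that argument (same sketch style as above, figures taken from the first two runs):

Code:
# The CPU clock went up ~19% and the CPU score followed it almost 1:1,
# yet the overall score barely moved - the classic GPU-limited pattern.
changes = {
    "core clock": (4.3 - 3.6) / 3.6,         # +19.4%
    "cpu score": (3766 - 3117) / 3117,       # +20.8%
    "total score": (11672 - 11335) / 11335,  # +3.0%
}
for name, change in changes.items():
    print(f"{name}: {change:+.1%}")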

I haven't tried running it again at 3.6GHz on this install of XP yet, but I suspect I'd get slightly less than 11335, given that this install of XP is consistently slower than the one I originally ran 3DMark at 3.6GHz on.

This is quite significant when you consider that 3.6GHz is achievable by most people on air cooling. If my 3DMark results are to be believed, overclocking my Conroe to 4.3GHz with my X1900 Crossfire setup is a waste of time, for games at least.
 
james.miller said:
In which case, how can you draw any conclusion from the test? In fact, how can you even compare the two? Run the test again at the same clocks as the first test, otherwise you won't have achieved anything at all.
K I'll do it now. :)
 
Ok here we go...

Conroe @ 3.6GHz, X1900 CF @ 681.75/801.00 (closest I could get to 680/800)
Code:
3DMark Score	11014 3DMarks
SM 2.0 Score	4540 Marks
SM 3.0 Score	4917 Marks
CPU Score	3177 Marks

Detailed Test Results

Graphics Tests
1 - Return to Proxycon	34.943 FPS
2 - Firefly Forest	40.724 FPS

CPU Tests
CPU1 - Red Valley	1.003 FPS
CPU2 - Red Valley	1.61 FPS

HDR Tests
1 - Canyon Flight (SM 3.0)	47.704 FPS
2 - Deep Freeze (SM 3.0)	50.629 FPS
Same OS install, same procedure for overclocking, etc. Same procedure for running 3DMark06 (killing non-essential systray apps, 3DMark06.exe set to High priority).

Only thing changed in the BIOS was the FSB, from 430MHz to 360MHz (core clock = FSB × multiplier, so 430 × 10 = 4.3GHz and 360 × 10 = 3.6GHz). Nothing else changed.

Pretty conclusive, no?
 
Two things I know to be the case:

1) This XP install is "slower" than the one I originally ran 3DMark on. This isn't too surprising given the original install was a completely bare "drivers and updates only" install, whereas this one is my everyday "apps & utilities installed" one. It could also be down to differences between Catalyst 6.8 and 6.9.

2) The CPU score is the only thing that changes significantly from 3.6GHz to 4.3GHz; the graphics scores stay roughly the same. Therefore you can assume, at least as far as 3DMark06 is concerned, that the difference in "gaming speed" between 3.6GHz and 4.3GHz is non-existent.

Obviously, as others have said, I don't know how directly 3DMark translates to actual games - but seeing as it's usually quite easy to see where hardware changes make a significant difference, I'm more inclined to believe that I am GPU-limited in bleeding-edge games whether I'm running at 3.6GHz or 4.3GHz.

You could argue that the extra CPU speed might make games slightly quicker, but that would assume your PC was having to wait for the CPU to calculate "something" before displaying the results on screen. If 3DMark is rendering the scenes in real time (which obviously it is, even if it's a fixed camera path) then increasing the speed of my CPU isn't making any difference to it at all.

I guess it doesn't matter a great deal at the end of the day, but it's food for thought for people looking at similar setups (phase or water).
 
Core @ 800? Not sure where that's come from. :) The highest I can get on stock voltages is around the 685 mark; any more and lockups become a factor.

Vegeta said:
Don't use 3DMark scores to prove this; the CPU can only power a game up to a point, it's dependent on the GPU. If the GPU is fully powered when the Conroe is at 3.6GHz, then why should it run any faster at 4GHz?
That's my point though? If games are going to be progressively more like the sort of graphics 3DMark06 is rendering (e.g. Crysis, Alan Wake, etc.) then it stands to reason that I'm going to be GPU-limited. I might as well just run the CPU at 3.6GHz then?
 
"Better" is a state of mind though really.

If phase cooling were as quiet, risk-free and cheap as air cooling then we'd all have it.

My point really was that I was expecting a significant jump in my 3DMark score commensurate with the 700MHz increase in clock speed, when in reality it's practically unchanged.

All things considered, would you spend a few hundred pounds on a cooling solution that is quite noisy by most air/water-cooling standards, when as far as you can see there is no tangible benefit in the latest games?

I'm going to try running the HL2: Lost Coast timedemos tonight at 4.3GHz and 3.6GHz, and if there's no real difference then I think I'm going to be looking more seriously into water cooling!
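
For comparing the two timedemo runs, something along these lines should do (a sketch - the FPS values are placeholders, and the 2% threshold is just an assumption about run-to-run variance):

Code:
# Placeholder averages - substitute the real Lost Coast timedemo results.
fps_36ghz = 85.0   # hypothetical 3.6GHz average
fps_43ghz = 86.2   # hypothetical 4.3GHz average
NOISE = 0.02       # assumed ~2% run-to-run variance

diff = (fps_43ghz - fps_36ghz) / fps_36ghz
verdict = "a real difference" if abs(diff) > NOISE else "within noise"
print(f"{diff:+.1%} -> {verdict}")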
 