
Conroe above 3.6GHz is USELESS for games on current tech

CR

Just because Task Manager shows 100% CPU does not mean the CPU is maxed out doing useful work. A simple loop doing nothing will show 100% CPU usage. The same loop processing the Windows message queue every 10 iterations will then show close to 0% CPU usage.
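The Task Manager point is easy to demonstrate with a minimal sketch (Python here for brevity; the busy loop stands in for a game's spin loop and the sleep stands in for blocking on the message queue, so the function names are purely illustrative):

```python
import time

def busy_loop(seconds):
    # Spins flat out without yielding: Task Manager would report
    # ~100% usage of one core even though no useful work is done.
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        pass

def yielding_loop(seconds):
    # Sleeps each iteration (a stand-in for pumping the message queue):
    # the same wall time passes, but almost no CPU time is consumed.
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        time.sleep(0.01)

def cpu_fraction(loop, seconds=0.5):
    # Ratio of CPU time to wall time - roughly what Task Manager shows.
    cpu0, wall0 = time.process_time(), time.monotonic()
    loop(seconds)
    return (time.process_time() - cpu0) / (time.monotonic() - wall0)
```

`cpu_fraction(busy_loop)` comes out near 1.0 and `cpu_fraction(yielding_loop)` near 0.0, even though both "run" for the same wall time.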
 
Concorde Rules said:
Sorry, but my CPU in EVERY game I own is at 100% ALL THE TIME.

I game at 1600x1200, X1900XTX.

Therefore it's using all the CPU.
Since processors have had pipelining and out-of-order execution since the Pentium Pro, no, it is not using all of the CPU ;)

And because most games use a do {} while(); loop as their main control block, of course they will try to suck up all the available CPU time.
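That main control block is typically just an uncapped loop; a minimal sketch (Python, with the per-frame update/render work elided as hypothetical comments):

```python
import time

def run_game(max_frames=5):
    # Classic uncapped main loop: simulate, render, repeat. With no
    # sleep or frame cap anywhere, it eats 100% of a core no matter
    # how little work each individual frame actually does.
    frame_times = []
    running = True
    while running:  # the do { ... } while (running); of a typical engine
        start = time.monotonic()
        # update_world(); render_frame()  # hypothetical stand-ins
        frame_times.append(time.monotonic() - start)
        if len(frame_times) >= max_frames:
            running = False
    return frame_times
```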

Richdog said:
I would hazard a guess that all of your 23 years of software coding and playing with multi-threaded apps
What is to say he doesn't also play games and has done so since he was a kid? :p
 
NathanE said:
What is to say he doesn't also play games and has done so since he was a kid? :p

Because modern games haven't been around for 23 years, and it's only fairly recently that, as a whole, they started to become more GPU- and less CPU-dependent. :p
 
3DMark is a crummy test for measuring the gaming performance of a computer. It's an 'OK' test of the potential of your graphics card, nothing more, as it's designed to be GPU-capped, not CPU-capped.

In games the CPU is often doing a lot more work: running AI, physics, preloading new areas, networking, acting as a game server, etc.

In the real world, some games have been shown to be CPU-limited in the worst possible places, giving very low minimum FPS, while at other times rocketing ahead to astronomically high maximum FPS. A good processor can often help boost the minimum FPS; even when the effect on average and maximum FPS doesn't show, the computer with the better CPU often gives the better 'gaming experience'.
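The minimum-vs-average effect is easy to see with some made-up frame times: a single CPU spike barely moves the average FPS but craters the minimum, which is what you feel as a stutter. A quick sketch (all numbers illustrative):

```python
def fps_stats(frame_times_ms):
    # Convert per-frame times (ms) into (min, avg, max) FPS.
    fps = [1000.0 / t for t in frame_times_ms]
    return min(fps), sum(fps) / len(fps), max(fps)

# 60 frames at a steady 16 ms, vs. the same trace with one 100 ms
# CPU spike (e.g. an AI/physics burst in the worst possible place).
smooth = [16.0] * 60
spiky = [16.0] * 59 + [100.0]
```

The smooth trace gives 62.5 FPS across the board; the spiky one still averages about 61.6 FPS, but the minimum drops to 10 FPS.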

That said, for gaming you need a pretty insane graphics system to even make full use of a 2.66GHz Conroe. But that's not a bad thing: more CPU power for the developers to utilise in future titles. Can't see much future in Ageia's physics card though; I would imagine multi-core CPUs will be able to handle advanced physics without too much effort.
 
What you're all missing, and what I find most puzzling, is that his CPU score is higher in the 2nd test on all CPU tests, therefore it is NOT the CPU causing the problem.


How could you guys not see this? Too busy arguing? Graphics card drivers or the card clocks - it'll be something to do with one of them, I could almost bet on it. If the CPU were throttling it would affect the CPU scores, those being almost 100% CPU-driven. Think about it.
 
I'll admit the thread title is a bit contentious, but the results seem to be consistent.

Regardless of whether 3DMark is doing realtime physics, AI, and everything else regular games have to do - the bottom line is that whilst the CPU is running 700MHz faster, the score isn't changing (or rather, it's only changing with the graphics card clocks).

This is the very definition of being GPU limited.

By extension, if a Conroe at 3.6GHz is already giving the same results as one clocked to 4.3GHz, my point about Conroe above 3.6GHz being useless for current technology (7950GX2/X19x0) is true, isn't it? The 8800GTX may change these results slightly, but I can't see it making a dramatic difference.

I haven't tried running it again at 3.6Ghz on this install of XP yet, but I suspect I would get slightly less than 11335 based on the results showing that this install of XP is consistently slower than the one I ran 3DMark at 3.6Ghz on.

This is quite significant when you consider that 3.6GHz is achievable by most people on air cooling. If my 3DMark results are to be believed, overclocking my Conroe to 4.3GHz with my X1900 Crossfire setup is a waste of time, for games at least.
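The GPU-limited argument can be put as a toy model: if CPU and GPU work overlap each frame, the frame takes as long as the slower of the two, so extra CPU clock buys nothing once the GPU is the long pole. A sketch with purely illustrative numbers (10 ms of CPU work and 20 ms of GPU work per frame at the 3.6GHz baseline):

```python
def frame_time_ms(cpu_ms_at_base, gpu_ms, cpu_clock_ghz, base_clock_ghz=3.6):
    # Toy pipelined-frame model: CPU work scales inversely with clock,
    # GPU work is fixed, and the frame takes as long as the slower side.
    cpu_ms = cpu_ms_at_base * base_clock_ghz / cpu_clock_ghz
    return max(cpu_ms, gpu_ms)

# GPU-bound case: 3.6GHz and 4.3GHz both give the identical 20 ms
# frame (50 FPS), matching the unchanged 3DMark graphics scores.
```

Only when the CPU side is the long pole (say, 30 ms of CPU work against 20 ms of GPU work) does the overclock show up in the frame time.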
 
Durzel said:
I haven't tried running it again at 3.6Ghz on this install of XP yet, but I suspect I would get slightly less than 11335 based on the results showing that this install of XP is consistently slower than the one I ran 3DMark at 3.6Ghz on.

In which case how can you draw any conclusion from the test? In fact, how can you even compare the two? Run the test again at the same clocks as the first test, otherwise you won't have achieved anything at all.
 
james.miller said:
In which case how can you draw any conclusion from the test? In fact, how can you even compare the two? Run the test again at the same clocks as the first test, otherwise you won't have achieved anything at all.
K I'll do it now. :)
 
Ok here we go...

Conroe @ 3.6Ghz, X1900 CF @ 681.75/801.00 (closest I could get to 680/800)
Code:
3DMark Score	11014 3DMarks
SM 2.0 Score	4540 Marks
SM 3.0 Score	4917 Marks
CPU Score	3177 Marks

Detailed Test Results

Graphics Tests
1 - Return to Proxycon	34.943 FPS
2 - Firefly Forest	40.724 FPS

CPU Tests
CPU1 - Red Valley	1.003 FPS
CPU2 - Red Valley	1.61 FPS

HDR Tests
1 - Canyon Flight (SM 3.0)	47.704 FPS
2 - Deep Freeze (SM 3.0)	50.629 FPS
Same OS install, same procedure for overclocking, etc. Same procedure for running 3DMark06 (killing non-essential systray apps, 3DMark06.exe set to High priority).

Only thing changed in the BIOS was to set the FSB from 430(x10) to 360(x10). Nothing else changed.

Pretty conclusive no?
 
Yeah. That says your CPU scores are right on the button; in fact they are better than the results displayed in the first test on the old install, while the GPU results are slower. Was that what you were expecting? What it does show is that different drivers and installs account for a 321-point difference, which is actually less than the difference between your first 3.6GHz run and the 4.2GHz run (337 points).

We all know already that '06 is very GPU-limited, and perhaps you are just right at the cards' limits. However, I've never found '03 onwards to be a fair representative of gaming in general, and the only way to find out is to play those games.
 
Two things I know to be the case:

1) This XP install is "slower" than the one I originally ran 3DMark on. This isn't too surprising given the original install was a completely bare "driver and updates only" install, whereas this one is my everyday "apps & utilities installed" one. This could also be because of differences between Catalyst 6.8 and 6.9.

2) The CPU score is the only thing that changes significantly from 3.6GHz to 4.3GHz; the graphics scores stay roughly the same. Therefore you can assume, by extension, that at least as far as 3DMark06 is concerned, the difference in "gaming speed" between 3.6GHz and 4.3GHz is non-existent.

Obviously, as others have said, I don't know how directly 3DMark translates to actual games - but seeing as it is usually quite easy to see where hardware changes make a significant difference, I'm more inclined to believe that I am GPU-limited in bleeding-edge games whether I'm running at 3.6GHz or 4.3GHz.

You could argue that the extra CPU speed might make games slightly quicker, but that would assume your PC was having to wait for the CPU to calculate "something" before displaying the results on screen. If 3DMark is rendering the scenes in real time (which obviously it is, even if it's a fixed camera path), then increasing the speed of my CPU isn't making any difference to it at all.

I guess it doesn't matter a great deal at the end of the day, but it's food for thought for people looking to similar setups (phase or water).
 
Durzel said:
Pretty conclusive no?

Yes, it is conclusive that a faster CPU is useless in 3DMark 06, a GPU-limited application designed specifically for testing GPUs, with current tech. But then this has been common knowledge for a long time.

As for games, no, it's not really evidence of anything, I'm afraid, as multiple people have pointed out.

No offence intended. :)
 
Not even bothered reading it; no doubt the topic will get people pouncing on the OP, so I will too, slightly :D.
Don't use 3DMark scores to prove this. The CPU can only power a game up to a point; it is dependent on the GPU, and if the GPU is fully loaded when the Conroe is at 3.6GHz, then why should it run any faster at 4GHz?
 
james.miller said:
What you're all missing, and what I find most puzzling, is that his CPU score is higher in the 2nd test on all CPU tests, therefore it is NOT the CPU causing the problem.


How could you guys not see this? Too busy arguing? Graphics card drivers or the card clocks - it'll be something to do with one of them, I could almost bet on it. If the CPU were throttling it would affect the CPU scores, those being almost 100% CPU-driven. Think about it.

Well, I actually mentioned GPU mem throttling on the first page before you entered this thread. It's not going to be the core throttling @ 800, but the memory @ 864 is a likely candidate.

How could you not see this? ;)
 
Core @ 800? Not sure where that's come from? :) The highest I can get on stock voltages is around the 685 mark; any more and lockups are a factor.

Vegeta said:
Don't use 3DMark scores to prove this. The CPU can only power a game up to a point; it is dependent on the GPU, and if the GPU is fully loaded when the Conroe is at 3.6GHz, then why should it run any faster at 4GHz?
That's my point though? If games are going to be progressively more like the sort of graphics 3DMark06 is rendering (e.g. Crysis, Alan Wake, etc.), then it stands to reason that I'm going to be GPU-limited. I might as well just run the CPU at 3.6GHz then?
 
marscay said:
Well, I actually mentioned GPU mem throttling on the first page before you entered this thread. It's not going to be the core throttling @ 800, but the memory @ 864 is a likely candidate.

How could you not see this? ;)

I believe I did:
james.miller said:
How could you guys not see this? Too busy arguing? Graphics card drivers or the card clocks - it'll be something to do with one of them, I could almost bet on it.

Thank you and goodbye. Take your wink with you.
 