Will OCing the CPU or the GPU result in better FPS in Crysis?

Associate · Joined: 3 Feb 2008 · Posts: 256 · Location: Bristol
Hi,

I have a QX9650 and 2 x 8800 GTX OC2 and am wondering whether OCing the CPU or GPU will result in more FPS?
I have got my QX9650 up to 3.67GHz by increasing the multi from 9 to 11, though this is not entirely stable; I probably need to bump up the VCore a smidgen.
I have, however, got the QX9650 fully stable at 3.33GHz with a 10x multi.
My next job is to push slightly further, perhaps getting the CPU stable at the 11x multi, or maybe upping the FSB.
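For anyone following along, the multi maths here is just FSB x multiplier. A quick sketch (the 333MHz FSB is my assumption, inferred from the QX9650's stock 9x3.0GHz; check your own BIOS):

```python
# Rough sketch of the multi/FSB arithmetic above. The 333MHz FSB is an
# assumption (stock QX9650 = 333MHz x 9 = ~3.0GHz), not a measured value.
def cpu_clock_ghz(fsb_mhz, multiplier):
    """Core clock in GHz = front-side bus speed x multiplier."""
    return fsb_mhz * multiplier / 1000.0

for multi in (9, 10, 11):
    print(f"{multi}x -> {cpu_clock_ghz(333, multi):.2f}GHz")
# prints 9x -> 3.00GHz, 10x -> 3.33GHz, 11x -> 3.66GHz
```

Raising the multi moves only the CPU clock; raising the FSB moves the CPU, memory, and chipset together, which is why the multi route tends to be the easier one to stabilise.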

Where would you guys start? From what I gather OCing the CPU is quicker and cooler, and potentially results in more stability?


Thanks
 
I saw some interesting tests somewhere (I think on xtremesystems) that showed it to be both CPU & GPU bound.

Certainly I saw a healthy increase when going from an AMD 4000+ @ 3.0GHz to a Q6600 at 3.2GHz, both with the same HD 3870.

Simon.
 
Well mate, not sure this answers your question, but I've spent the last two days trying to get better performance out of Crysis... for info, I run...

1920x1200 resolution with no AA or AF on... when I run the Crysis benchmark I get 29.5fps...

In fact, here's a list just for you... bear in mind I can't be bothered clocking the GPU lower than my 24/7 settings of 725/1798/2200.

CPU Clocking:
Q6600 @ 2.8GHz and the 8800GT at 725/1798/2200 = 28.4fps
Q6600 @ 3.0GHz and the 8800GT at 725/1798/2200 = 29.4fps
Q6600 @ 3.2GHz and the 8800GT at 725/1798/2200 = 29.5fps
Q6600 @ 3.3GHz and the 8800GT at 725/1798/2200 = 29.6fps
Q6600 @ 3.4GHz and the 8800GT at 725/1798/2200 = 29.6fps

GPU Core Clocking:
Q6600 @ 3.2GHz and the 8800GT at 725/1798/2200 = 29.5fps
Q6600 @ 3.2GHz and the 8800GT at 735/1798/2200 = 29.5fps
Q6600 @ 3.2GHz and the 8800GT at 745/1798/2200 = 29.6fps
Q6600 @ 3.2GHz and the 8800GT at 755/1798/2200 = 29.6fps
Q6600 @ 3.2GHz and the 8800GT at 765/1798/2200 = 29.6fps
Q6600 @ 3.2GHz and the 8800GT at 775/1798/2200 = 29.8fps

GPU Shader Clocking:
Q6600 @ 3.2GHz and the 8800GT at 725/1798/2200 = 29.5fps
Q6600 @ 3.2GHz and the 8800GT at 725/1800/2200 = 29.6fps
Q6600 @ 3.2GHz and the 8800GT at 725/1810/2200 = 29.8fps
Q6600 @ 3.2GHz and the 8800GT at 725/1820/2200 = 29.9fps
Q6600 @ 3.2GHz and the 8800GT at 725/1830/2200 = 30.0fps (becomes unstable)
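To put those runs in perspective, here's a quick sketch comparing percentage clock increase against percentage fps increase, using just the endpoint figures from the lists above (illustrative only, numbers as posted):

```python
# Illustrative: fps gain vs. clock gain from the benchmark runs above.
# Each dict maps a clock value to the fps measured at that clock.
cpu_runs = {2.8: 28.4, 3.4: 29.6}       # CPU GHz -> fps (GPU fixed)
core_runs = {725: 29.5, 775: 29.8}      # GPU core MHz -> fps
shader_runs = {1798: 29.5, 1820: 29.9}  # shader MHz -> fps (stable max)

def gains(runs):
    """Return (% clock increase, % fps increase) between the two runs."""
    lo, hi = min(runs), max(runs)
    clock_pct = (hi - lo) / lo * 100
    fps_pct = (runs[hi] - runs[lo]) / runs[lo] * 100
    return round(clock_pct, 1), round(fps_pct, 1)

print("CPU:   ", gains(cpu_runs))     # ~21.4% more clock -> ~4.2% more fps
print("Core:  ", gains(core_runs))    # ~6.9% more clock -> ~1.0% more fps
print("Shader:", gains(shader_runs))  # ~1.2% more clock -> ~1.4% more fps
```

Which backs up the observation below: per MHz gained, the shader clock is the only lever here that pays back more fps than it costs in clock.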

Now, from what I've checked on the CPU side, you could probably run the Q6600 at 2.4GHz and not notice much drop at all, as there's NO difference between 2.8GHz and 3.4GHz... really, I mean 1fps? It's just not worth it... it's like this game is both CPU and GPU bound now, and it's going to need a major change in hardware to move to the next level.

As for the core clock on the GPU... well, with a 50MHz increase the fps just didn't go up, although I'm sure if I dropped the core to, say, 625MHz, there would be a drop, methinks...

The biggest increase in performance, if you can call a few tenths of a frame per second an increase, comes from raising the shader clock...

So, what has all this proved? Not a lot, and sure, I'm waffling now... however mate, it seems to me that Crysis hits a ceiling with current hardware and tech, and simply clocking up doesn't really make that much of a difference... I don't think we're going to see much of an increase in Crysis until I go SLI in 3 weeks or the new 9900 series comes out, I'm afraid :mad:

Cheers
Pug
 
I'd say the CPU...

You've got an overclocker's CPU, so that really ought to be running quite a bit higher than it is. I'd concentrate on that right now, because with 8800 GTXs in SLI:

a) you might struggle to clock them (is that still the case? I recall the last time I ran SLI with 7900 GTXs, they didn't like being clocked as a pair)

and b) in properly configured games, SLI scales quite well with CPU speed.

You're looking at potentially 40%+ gains in SLI; you just need the CPU to be able to chuck instructions at the cards fast enough, and a well-coded game that can take advantage of two cards, which Crysis seems to do, apart from these annoying minimum frame rates: it seems to drop into the 20s no matter what gfx setup you've got, even if the average is in the high 40s/50s.
 
Yeah, this is what I figured. Do you reckon just upping the multi from 9x to 10x to 11x is good enough, or do you think raising the FSB is a better option?
I'm after ease of getting stable vs. performance improvement.

many thanks
 
Not sure; I've never had an Extreme Edition CPU, so I'm always stuck with the pauper's FSB route.

I would imagine there's a sweet spot where you'd get both clocked.

Especially considering you're on DDR3, there's no reason not to go for a good FSB clock with a lower multi. Then again, I'm not familiar with your board; it might not be up to the task, as current SLI boards aren't known for being great clockers, IIRC.

I'd be aiming to have that CPU at or around 4GHz, though, otherwise you might as well have just bought a Q6600.
 
Yeah, I probably have a lot of tweaking ahead of me if I want to ensure an awesome OC'd system.
The problem I can see with OC'ing the FSB is that I will effectively be tweaking the memory settings too, so I'd have two things to worry about, thus multiplying the potential scope for problems.
Arhhhh - the joys of overclocking!

Another thing: overclocking my CPU to around the 4GHz level - is this alright on air? Or will I just pay the price in increased CPU fan speed, which I really don't want! Would this be the time for H2O? --- More money!! :( :( :(
 
A system like yours, I would say, needs cooling to match, especially the CPU. I doubt you'd get to 4GHz on air, but you never know... water ought to get you there and beyond. I agree it's a lot of money, but it's not as if you've built your PC on a budget, is it!!

Plus its lots of fun :)

Also, I would expect your memory to cope with any FSB you manage; you'd be looking at something like 400-500 FSB, which is PC6400/PC8000 territory, so PC3-10666 is easily capable of keeping up. Just look for the setting in your BIOS that sets the memory speed and make sure it's tied to your FSB 1:1 to start off with, then worry about using a divider to up the memory speed once you've got a decent CPU clock.
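A quick sketch of that 1:1 maths, assuming the usual DDR naming convention where the effective (rated) speed is twice the FSB base clock at a 1:1 divider:

```python
# Sketch of the 1:1 memory divider maths above. At a 1:1 ratio,
# DDR effective speed = 2 x FSB base clock (two transfers per cycle).
def ddr_effective_mhz(fsb_mhz, divider=1.0):
    """Memory effective speed for a given FSB and memory:FSB ratio."""
    return fsb_mhz * 2 * divider

for fsb in (400, 450, 500):
    print(f"{fsb}MHz FSB at 1:1 -> DDR-{ddr_effective_mhz(fsb):.0f}")
# 400 FSB needs DDR-800-class memory (PC2-6400); 500 FSB needs DDR-1000
# (PC-8000), so DDR3-1333 (PC3-10666) keeps up with headroom to spare.
```

So the memory only becomes the limit once you start pushing the FSB or the divider well past what the sticks are rated for.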
 
I haven't benchmarked it in any depth, but just playing, I see a massive difference between running my Q6600 at stock vs. a good overclock. It seems like the cards (tri-SLI) need a CPU clock of at least 3.2GHz or so to be fed well; I can't tell much difference beyond about 3.5GHz. I can only get relatively minor GPU overclocks with SLI, but it doesn't seem to have that great an impact in the Crysis benchmark, maybe 1fps or so.
 
Not knocking Pug's results, as at 1920x1200 the results will probably be totally different to mine. I suspect with only a GT he will be very GPU limited, hence the very small gains from CPU overclocking.

I went from a 3GHz 4200+ X2 to a 3.4GHz Q6600 and my Crysis fps went up by 11fps, from 38fps to 49fps, in the 1280x1024 benchmark with everything on High.

So at that res, my cpu certainly was the limiting factor and made a big difference in my results.

If you look here:

http://forums.overclockers.co.uk/showthread.php?t=17858884&highlight=ocuk+crysis+dx10

You will see the following from the 1680x1050 DX10 all Very High results:

Q6600 @ 3.7GHz, GTX at 646/1000/1566: 22.96fps
Q6600 @ 3.3GHz, GTX at 576/900/1350: 19.26fps

Not a great gain, 3.7fps, from overclocking both CPU and GPU, but still 19.2% from roughly a 12% overclock of both.
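Sanity-checking those percentages from the two runs above:

```python
# Checking the arithmetic on the two linked-thread results above.
fps_lo, fps_hi = 19.26, 22.96  # fps at 3.3GHz/576 and 3.7GHz/646
clk_lo, clk_hi = 3.3, 3.7      # CPU GHz (the GPU core scaled similarly)

fps_gain = (fps_hi - fps_lo) / fps_lo * 100
clk_gain = (clk_hi - clk_lo) / clk_lo * 100
print(f"{fps_gain:.1f}% more fps from a {clk_gain:.1f}% overclock")
# prints 19.2% more fps from a 12.1% overclock
```

So in that particular test the fps actually scaled slightly better than the clocks did, which suggests it wasn't sitting hard against either limit.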

You will get different results from different games. I suspect with SLI GTX and depending on res, you will see a big gain from overclocking your cpu as you may well be cpu limited in a lot of games.

Overall, I'd just aim to get the best overclock out of both.
 
I'm shocked that you bought a QX9650 and haven't clocked the nuts off it, and you seem to have no idea how to.

Honestly, if you're not overclocking it, you've wasted £450 by not just buying a pre-overclocked Q6600.

Sort it out, to be honest: read the guides and get a good overclock on that chip, then on the cards, and see how much better Crysis (and everything else) runs.

P.S. If not, I'll trade you a Q6600 for your chip.
 
hahahahaha.

Yeah, the reasons for not overclocking are quite complicated, but essentially it was down to loads and loads of problems with my old mobo (the POS Striker II Formula), which shook the confidence out of me.
Also, I'm too busy at the moment to spend hours on end seriously overclocking and finding a stable system.

So far, by spending about 20 minutes, I have got the multi up to 11x, so my CPU is running at 3.66GHz, and I'm getting over 18k in 3DMark06 under Vista.
I do appreciate that I can go way further, but I haven't the time or the inclination at the moment.

The reason for the post is to prepare for some serious overclocking efforts during the summer. Need to start the preparation! lol.

thanks
 