What's the fps effect of 1/5 of a GHz?

I have my E8500 overclocked to 3.804GHz. What increase in fps would I see if I overclocked it to 4GHz? I've heard that once CPUs get over 3.6GHz further overclocking is almost pointless. Is this true, or just an urban myth created by the manufacturers to stop people damaging their chips?
 
If the speed gain gets pointless over 3.6GHz, why do people clock to 5GHz+ for benchmarking?

E-peen.

As per the OP's post, can someone maybe do some kind of benchmark at stock, 3.5, 3.8, 4.0, 4.2 & 4.5GHz?

I've seen a few stable 4500MHz screenies in this forum lately, so it would be nice to see.
 
At high res you'll find that in most games you gain very, very little from going over 3.2GHz, and it actually starts to level off at about 2.8GHz (I'll try and find the article in a mo). The sad fact is that once you start going to 1680x1050/1600x1200+ resolutions you are far more GPU limited than CPU limited.
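A rough way to picture that GPU limit, as a minimal sketch (the frame times and the estimated_fps function below are made-up for illustration, not measurements from the thread): each frame takes roughly the longer of the CPU's work and the GPU's work, so once the GPU side dominates, shaving milliseconds off the CPU side barely moves the fps.

# Rough bottleneck model: fps is limited by whichever of CPU or GPU
# takes longer per frame. All numbers are invented for illustration.

def estimated_fps(cpu_ms_per_frame, gpu_ms_per_frame):
    frame_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)
    return 1000.0 / frame_ms

# Say the CPU needs 10 ms per frame at 3.2GHz and the GPU needs 16 ms
# at a high resolution (hypothetical figures).
cpu_ms_at_3_2 = 10.0
gpu_ms = 16.0

# Overclocking 3.2GHz -> 4.0GHz scales CPU time down by 3.2/4.0 at best.
cpu_ms_at_4_0 = cpu_ms_at_3_2 * (3.2 / 4.0)

print(estimated_fps(cpu_ms_at_3_2, gpu_ms))   # 62.5 fps
print(estimated_fps(cpu_ms_at_4_0, gpu_ms))   # still 62.5 fps - GPU bound

With those invented numbers the fps doesn't move at all after the overclock, which is the "level off" effect described above; drop the resolution (and with it the GPU time) and the CPU clock starts to matter again.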
 
It depends a lot on the game as well. UT3, CoD4 and a few other games benefit significantly from faster clock speed.

That said, a lot of people also just like a nice even 4GHz as a clock... but the thing is, once you get there you want more, or I do at least. I wouldn't mind a 5GHz 24/7 clock :D
 
Little real-world difference, and as already said it depends a lot on your graphics card and resolution.

If you are CPU limited in a game you will see a boost. CoH is an example.

You are only likely to see a gain in more than the odd game if you have a GTX280 or 4870X2 and game at 1680x1050.

At higher resolutions and/or with lesser graphics cards, the extra MHz will show you no difference.

I did notice, when trying to get the best benchmark out of Crysis, that increasing my CPU speed increased my fps, but it was hardly worth the effort.
 
I have my E8500 overclocked to 3.804GHz. What increase in fps would I see if I overclocked it to 4GHz? I've heard that once CPUs get over 3.6GHz further overclocking is almost pointless. Is this true, or just an urban myth created by the manufacturers to stop people damaging their chips?

4000 / 3800 works out to roughly a 5% increase. That's theoretical; real-life gains (e.g. fps) will be lower, as other components need to be taken into account. There's absolutely no reason overclocking past 3.6GHz should be pointless as long as you're not GPU limited, but again we're talking very small gains in most games. Of course 4GHz sounds much better than 3.XGHz, which is probably why half the people do it.
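To put rough numbers on that (a sketch only; the 30% CPU-bound share below is an assumed figure for illustration, not a measurement): if only part of each frame actually waits on the CPU, the clock bump only speeds up that part, which is why the real fps gain lands well under the theoretical 5%.

# Theoretical clock speedup from 3.8GHz -> 4.0GHz.
clock_speedup = 4000 / 3800                        # ~1.053, i.e. roughly +5%

# Amdahl-style estimate: only the CPU-bound share of each frame benefits.
# The 30% share is an assumption for illustration, not a measured figure.
cpu_bound_share = 0.30
overall = 1 / ((1 - cpu_bound_share) + cpu_bound_share / clock_speedup)

print(f"clock gain:        {clock_speedup - 1:.1%}")   # ~5.3%
print(f"rough fps gain:    {overall - 1:.1%}")          # ~1.5%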

That aside, pushing PC components to their limit is a hobby like any other, and I don't see why people need to repeat this oh-so-popular "e-peen" term. What about racecar drivers, runners, swimmers, any competitive activity really, not to mention the size of someone's house, breed of dog, or salary? It only becomes an issue when people keep blowing their own trumpets and when we let it bother us. Personally I'm very happy to get free performance by overclocking, but I really couldn't care less whether it's the fastest system around. That was certainly the driving force 10 years ago when I started with that Celeron 300A, but now, nah. Besides, those who don't yell at the top of their lungs obviously have nothing to prove ;)
 
At high res you'll find that in most games you gain very, very little from going over 3.2GHz, and it actually starts to level off at about 2.8GHz (I'll try and find the article in a mo). The sad fact is that once you start going to 1680x1050/1600x1200+ resolutions you are far more GPU limited than CPU limited.

Would it be worth my while clocking my Q6600 from 2.4GHz to 2.8GHz then?

Would a stock cooler take that?
 