
what would be better for games?

Soldato · Joined 22 Mar 2009 · Posts 7,754 · Location Cornwall
Hi, now that I have a computer desk I'll start using my PC more for games and the like. So my question is: would I be better off overclocking it to a 4-4.2GHz dual core or a 3.5-3.8GHz tri core?
Cheers.
 
Depends what you're playing; most games like a higher clock rate, so personally I would go for that. What processor do you have?

Edit: Just re-read the clocks you were talking about. I don't think you'd have a problem either way at those speeds; personally I'd still go dual at 4GHz though.
 
It's an Athlon X2 555. I had it running fine at 3.8GHz as a tri core, but since moving it, it doesn't seem to like that any more. I've now had it running Prime for 30 minutes or so as a dual core at 4GHz, which I've never managed to get working before, maxing out at 48°C, which I think is pretty safe. I'll let that run overnight to be sure, but I can't see why, if I can now hit 4GHz, I couldn't aim a tad higher. Failing that, I should easily be able to find a stable 3.8GHz tri-core setup, as it worked before.
But if nothing is gained by having the extra core unlocked, then maybe I'd be better off concentrating on getting the highest stable clock on the two factory-enabled cores.
 
Right, the 4.1GHz failed :( but it was stable at 4GHz, and 3DMark 11 gave a score of around 4500. I got the 3.8GHz tri core stable again (it needed a little voltage bump) and 3DMark 11 gives it around 5500, so I guess this means the tri core wins overall?
The next thing is the HT bus speed and HT bus width. What do these do, what should they be set to (is higher better?), and how do I test whether upping them is stable? What do they relate to in terms of stressing (CPU, RAM, ...)?

Cheers.
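For what it's worth, the two results above can be put side by side. A quick back-of-the-envelope comparison (using the rounded scores quoted in the post, so only a rough figure):

```python
# Rounded 3DMark 11 scores from the two runs above
dual_4ghz = 4500   # 2 cores @ 4.0 GHz
tri_38ghz = 5500   # 3 cores @ 3.8 GHz

gain = (tri_38ghz - dual_4ghz) / dual_4ghz * 100
clock_drop = (4.0 - 3.8) / 4.0 * 100
print(f"tri-core: {gain:.0f}% higher score for a {clock_drop:.0f}% clock drop")
# -> tri-core: 22% higher score for a 5% clock drop
```

So the third core buys roughly a 22% higher score at the cost of a 5% clock drop, at least in this benchmark.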
 
200MHz is a worthy drop for another core; at 3.8 you should be fine in most things.

To put it into perspective, I play at 1920x1080 and get 30-35fps outside in The Witcher 2 with my [email protected] + stock 5850 on ultra settings minus uber sampling.

So you'll have much more power behind you, especially since your CPU wipes the floor with mine in games. Just have a look at the bottom of this chart (link below) to see; yours is light blue. You're getting nearly double the framerates when the GPU is not a bottleneck.

http://www.anandtech.com/bench/Product/120?vs=66
 
ok thanks :)

What about the HT settings? If I use ASRock's auto overclocking tool it ups the HT as well, but it overclocks by raising the CPU FSB rather than the multiplier, so I don't know if that's why it has to change the HT too. I just up the multiplier and voltage and that's it. So am I missing something by having a lower HT bus speed and bus width?
 
Sorry mate, can't help you on that one as I don't have any experience with HT. If I were you I'd try a couple of newer games now and see what framerates I'm getting; you might be going to a lot of trouble for an extra 1 or 2% when games are perfectly playable now.
 
Games always played well before I moved the PC, but I'd get a 5-minute stutter every now and then, which I can only put down to something running in the background.
 
Well, I got round to playing BF:BC2 for the first time (bought it ages ago in a Steam sale), but it's terrible. I get really bad screen tearing :( Any ideas? Is this a common fault, or could it be down to my OC?
 
Screen tearing is down to V-sync not being turned on. Turning it on will look like it killed your framerate, but don't worry about that; it's a ******* to explain, but go have a quick read of a V-sync FAQ.
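On the "killed your framerate" bit: with plain double buffering, a frame that misses the vertical blank has to wait for the next one, so the effective framerate snaps down to refresh/1, refresh/2, refresh/3, and so on. A toy sketch of that idea (assuming a 60Hz screen and no triple buffering; `vsync_fps` is just an illustrative name, not a real API):

```python
import math

def vsync_fps(render_fps, refresh=60):
    """Effective framerate with double-buffered v-sync: each finished
    frame is held until a vertical blank, so a frame that takes longer
    than one refresh interval waits for the next, snapping the rate
    to refresh / n for some whole number n."""
    refreshes_per_frame = math.ceil(refresh / render_fps)
    return refresh / refreshes_per_frame

print(vsync_fps(100))  # 60.0 -> GPU faster than the screen, capped at refresh
print(vsync_fps(55))   # 30.0 -> just misses 60Hz, drops straight to half
print(vsync_fps(25))   # 20.0 -> refresh/3
```

That cliff from 55fps to 30fps is why v-sync can feel like a big performance loss even when the card is nearly keeping up.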
 
Yeah, turned V-sync on and the tearing has gone, although it can now be laggy in places it wasn't before (mainly the cutscenes), so I lowered the AA and it's better now :)
If V-sync is required, then why is it not on by default? Or do some monitors/graphics cards handle things better so they don't need it :confused:
 
Here is a brief description of V-sync and why you get screen tearing, taken from another website. It's actually a bit more complicated than this, but you'll get the idea. I found an article a while back that explained it perfectly and in quite some depth, but this will do for a general idea.

"Tearing

It is an unfortunate fact that if you disable VSync, your graphics card and monitor will inevitably go out of synch. Whenever your FPS exceeds the refresh rate (e.g. 120 FPS on a 60Hz screen), or in general at any point during which your graphics card is working faster than your monitor, the graphics card produces more frames in the frame buffer than the monitor can actually display at any one time. The end result is that when the monitor goes to get a new frame from the primary buffer of the graphics card during VBI, the frame may be made up of two or more different frames overlapping each other. This results in the onscreen image appearing to be slightly out of alignment or 'torn' in parts whenever there is any movement - and thus it is referred to as Tearing. An example of this is provided in the simulated screenshot below. Look closely at the urinals and the sink - portions of them are out of alignment due to tearing"
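The quote above can be boiled down to a few lines. This is a toy model, not how real hardware works: the monitor reads the buffer top to bottom, and if the graphics card swaps in a new frame partway through the scanout, the displayed image ends up being a mix of two frames.

```python
def scanout(frame_writes, lines=4):
    """Simulate one monitor refresh. frame_writes is a list of
    (scanline, frame_id) pairs: frame_id becomes the buffer contents
    from that scanline onward. Returns the frame id visible per line."""
    return [max(f for line_written, f in frame_writes if line_written <= line)
            for line in range(lines)]

# Without v-sync: frame 2 arrives halfway through the scanout -> torn image
print(scanout([(0, 1), (2, 2)]))  # [1, 1, 2, 2]

# With v-sync the swap waits for the vertical blank -> one whole frame shown
print(scanout([(0, 1)]))          # [1, 1, 1, 1]
```

The `[1, 1, 2, 2]` refresh is the tear: the top half of the screen is one frame and the bottom half another, which is exactly the misaligned urinals in the screenshot the quote mentions.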
 
Would going to a 990FX mobo from an 890FX chipset give a possibility of a greater OC, better stability, and maybe getting the 4th core to work? Or is that down to the CPU hitting its limit, and all upgrading to the 990FX chipset will do is let me drop in a BD if/when they ever come out?
 