System Bottleneck

I'm currently running:

Asus Sabertooth 990FX
AMD FX-6200 @ 4.7GHz
16GB Crucial Ballistix 1866MHz
Corsair Force SSD, 550MB/s read / 525MB/s write
Gainward GTX 570 1280 GLH @ 790MHz
Gainward GTX 570 1280 GS @ 790MHz

Is my processor likely to cause a bottleneck because it can't match the GPU power?

I've always regretted buying an AMD Bulldozer processor... I expected everything to run at 60fps at 1080p all maxed out, but Sleeping Dogs seems to run at around 40.
 
No way.

I'm currently running 2x 570s in SLI with a Phenom II X6 1075T @ 4.0GHz and it's getting the same benchmark results as Intel owners with 2x 570s.

Saying that, my 670 is arriving today :)
 
Is my processor likely to cause a bottleneck because it can't match the GPU power?

I've always regretted buying an AMD Bulldozer processor... I expected everything to run at 60fps at 1080p all maxed out, but Sleeping Dogs seems to run at around 40.
It could be, but if you are using Extreme settings in Sleeping Dogs, that preset uses supersampling, which can cut an Nvidia card's frame rate by around 55%. The best thing to do is check the GPU usage on both GPUs when you are hitting the low frame rates in the 40s. If the usage on both GPUs is at 99%, the cards are performing as they should and it's the supersampling (Extreme AA) that's making your frame rate suffer; but if the GPU usage is well below 99% on both GPUs, then the lower frame rate is more likely caused by a CPU bottleneck.
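If you'd rather log it than sit watching an overlay, a rough Python sketch like the one below prints per-GPU usage once a second. Purely an illustration, assuming the NVIDIA driver and the pynvml bindings are installed.

```python
# Rough sketch: poll per-GPU utilisation once a second via NVML.
# Assumes the NVIDIA driver and pynvml are installed (pip install pynvml).
# Run it alongside the game: both cards near 99% when the frame rate drops
# points at the supersampling; much lower usage points at a CPU bottleneck.
import time
import pynvml

pynvml.nvmlInit()
try:
    handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
               for i in range(pynvml.nvmlDeviceGetCount())]
    while True:
        readings = []
        for i, h in enumerate(handles):
            util = pynvml.nvmlDeviceGetUtilizationRates(h)
            readings.append(f"GPU{i}: {util.gpu:3d}%")
        print("  ".join(readings))
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```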

No way.

I'm currently running 2x 570s in SLI with a Phenom II X6 1075T @ 4.0GHz and it's getting the same benchmark results as Intel owners with 2x 570s.
Problem is, not all games use 6 cores like the BF3 that you play... there are games that are taxing on the CPU where AMD chips simply can't hold a constant 60fps at all times.
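One quick way to see whether a game is actually spreading its load across all six cores is to watch per-core CPU usage while it runs: a couple of pegged cores with the rest idle usually means the CPU side is the limit, not the GPUs. A minimal sketch, assuming the psutil package is installed (purely illustrative):

```python
# Rough sketch: print per-core CPU load once a second while the game runs.
# Assumes the psutil package is installed (pip install psutil).
# One or two cores pinned near 100% while the rest idle suggests the game
# is only using a couple of threads, i.e. a likely CPU-side bottleneck.
import psutil

while True:
    # cpu_percent with an interval blocks for that long and returns
    # the average load of each logical core over the interval.
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print("  ".join(f"core{i}: {load:5.1f}%" for i, load in enumerate(per_core)))
```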

As much as Bulldozer sucks, I still cannot see it being a bottleneck at those speeds.
"At those speeds", the FX-6000 series is only around on par with a Phenom II X6 at 4.0GHz.
 
Cheers, I'll keep my eye on MSI Afterburner during gameplay, and I'll try removing supersampling.
I also assume 1280MB graphics cards can handle all the Extreme settings, or will it likely spill over into system memory, causing the drop in frame rate?
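One way to find out is to log how much of the 1280MB is actually in use while playing: the same sort of rough pynvml sketch as above, this time reading nvmlDeviceGetMemoryInfo (again just an illustration, assuming pynvml is installed).

```python
# Rough sketch: log per-GPU memory use once a second via NVML.
# Assumes the NVIDIA driver and pynvml are installed; purely illustrative.
# If 'used' sits at the card's full 1280MB when the frame rate drops,
# textures are probably spilling past the card's local memory.
import time
import pynvml

pynvml.nvmlInit()
try:
    handles = [pynvml.nvmlDeviceGetHandleByIndex(i)
               for i in range(pynvml.nvmlDeviceGetCount())]
    while True:
        line = []
        for i, h in enumerate(handles):
            mem = pynvml.nvmlDeviceGetMemoryInfo(h)
            line.append(f"GPU{i}: {mem.used / 2**20:4.0f} / {mem.total / 2**20:4.0f} MB")
        print("  ".join(line))
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```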
 
"At those speed", the FX-6000 is only around on par with a Phenom II X6 on 4.0GHz.

Don't these generations of processor have a turbo mode where they turn off some of the cores and 'turbo' up the remaining ones?
 
Don't these generations of processor have a turbo mode where they turn off some of the cores and 'turbo' up the remaining ones?
Not when you've already overclocked the CPU to near its max. I think it's only relevant if the CPU is left at stock clocks, and only if the application/system uses just ONE thread (but most games nowadays use at least 2 threads). And even if another 300MHz were added to the FX at 4.7GHz to bump it up to 5.0GHz, the real-world performance increase would be very small, considering the FX's IPC is worse than even Core 2/Phenom II.
 
OK, did a benchmark of Sleeping Dogs with everything maxed except the anti-aliasing, set that to normal, and got 172fps max and 54 minimum with an average of 139. This supersampling business seems to be a little BS, but at least my system crunches through games at 60+. Just kind of expected that with this much investment in a computer it might run everything at top notch.
Yeah, I tried a 5GHz run on my 6200, it just didn't work; it was marginally unstable on one of the cores (or should I say modules...) and even down at 4.78GHz it got the same calculation error, so I left it at 4.7 and it's stayed smooth.

Currently praying for AMD to release a processor that doesn't have two cores sharing a single floating-point unit. Sadly Vishera will not answer my prayers!
 
Currently praying for AMD to release a processor that doesn't have two cores sharing a single floating-point unit. Sadly Vishera will not answer my prayers!

The 3570K would answer your prayers.
 
Yeah, I tried a 5GHz run on my 6200, it just didn't work; it was marginally unstable on one of the cores (or should I say modules...) and even down at 4.78GHz it got the same calculation error, so I left it at 4.7 and it's stayed smooth.
I think you need a bit more voltage to truly make the higher overclock stable... and above 4.6GHz the FX draws much more power (probably due to the higher vcore needed) and needs a good CPU cooler to keep the temps in check because of the extra waste heat it gives off. There's not much noticeable difference between an FX chip at 4.7GHz and 4.78GHz anyway, so yeah, if you can run 4.7GHz stable but not 4.78GHz, keeping it at 4.7GHz is the sensible thing to do.
 
The 3570K would answer your prayers.


Purchasing a new processor and an SLI-capable motherboard isn't a financial option at the moment :D
It would be nice to use the same AM3+ socket but for something good, though I guess the architecture after Vishera will likely be on a completely different socket type.
 