CPU Bottleneck

Associate · Joined 18 Apr 2012 · Posts: 77 · Location: Isle of Wight
Hey y'all, I've been pondering novel ways of keeping my GPUs quiet and balancing the temps in another thread, but this issue has come up a couple of times: people are suggesting my CPU is bottlenecking my GPUs. My full specs are below, but the short version is I have an OCed AMD FX-6300 @ 4.2GHz and two GTX 780 GHz Editions.

Full system specs:

ASRock 990FX Extreme3
AMD FX-6300 six-core processor OCed to 4.2GHz
8GB dual-channel DDR3 1866MHz
Corsair H55 cooler
128GB Samsung 840 Pro
2x Seagate 1TB hybrid drives (SSHDs) in RAID 0
Windows 7 Home Premium 64-bit
2x Gigabyte GTX 780 GHz Edition GPUs in SLI (same BIOS, rev etc. despite differences in decal and heat pipes)
Super Flower Leadex 750W Gold
Inateck PCI-E to USB 3.0 2-port PCI Express card (for the USB 3 front panel, as the mobo didn't have a USB 3 header)
Asus PB287Q 28" 4K 60Hz 1ms monitor

So the thing is, one chap suggested my CPU was probably bottlenecking my GPUs, so I did a bit of research. The classic symptom of a CPU bottleneck is high CPU usage with low GPU utilisation. My CPU doesn't seem to show high usage, but the chap said that was probably because the games weren't using all the cores: if a game maxes out 4 of my 6 cores, what looks like 66% overall usage is actually a 100% bottleneck on the cores that matter. So I posted a screenshot from Roccat Power Grid on my phone showing that, while playing Metro 2033 Redux in 4K, I was getting 50% usage spread across all 6 cores:

[screenshot: Roccat Power Grid showing ~50% usage across all 6 cores]
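(A quick aside on the arithmetic behind that "66% is really 100%" claim, as a minimal sketch; the per-core numbers below are made up, not real readings.)

```python
# Made-up illustration: a game pinning 4 of 6 cores reads as ~66.7% aggregate
# usage even though the cores actually doing the work are maxed out.
per_core = [100, 100, 100, 100, 0, 0]  # hypothetical per-core loads, %
aggregate = sum(per_core) / len(per_core)
print(f"aggregate CPU usage: {aggregate:.1f}%")  # 66.7%
print(f"busiest core: {max(per_core)}%")         # 100% -> could still bottleneck
```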

In response, Threepwood later said this:

You are looking at the wrong usage to know if your CPU is a bottleneck.

Your CPU load % does not matter; what matters is your GPU load %.

If, for example, your 780s were running at 100% load each, then your CPU is not bottlenecking your GPUs.

It would not matter a jot if your CPU load was 10% or 100%.

You have already had the correct answer: sell your board and CPU and put that and your £150 towards a better CPU, one that will not bottleneck your 780s.

You should monitor your GPU load.
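(If anyone wants to log GPU load to the console instead of eyeballing an overlay, here's a minimal sketch that polls the NVIDIA driver's nvidia-smi tool; it assumes nvidia-smi is on your PATH, and the one-second interval is just an example.)

```python
# Minimal GPU load logger: polls nvidia-smi once a second and prints the
# utilisation of each GPU. Assumes the NVIDIA driver's nvidia-smi utility
# is installed and on the PATH.
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=index,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    for line in out.strip().splitlines():
        idx, util = (field.strip() for field in line.split(","))
        print(f"GPU {idx}: {util}% load")
    time.sleep(1)
```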

So I've just run the Valley benchmark in 4K alongside some other system monitors, and it shows high usage of the GPUs, 98% and 74% in this shot, and quite low utilisation of the CPU, spread over 4 cores for the most part.

[screenshot: Valley benchmark at 4K with CPU/GPU monitors visible]

That screen cap has been halved from my native resolution, but you can still just about read it. Ultra settings, no AA and no v-sync: almost 60fps.

So my question is: is my CPU bottlenecking me? What other tests should I run? Don't get me wrong, I'm no AMD fanboy; I just figured the OCed 6300 would do the job two years ago when I bought it. I worry that behind the scenes something to do with PCI-E lanes is slowing me down. I'd like to go Intel, but I really don't want to upgrade until I can afford something DDR4-based.

Any advice or suggestions, please? I'm happy to run more tests and post results. :)
 
GPU benchmark programs are designed to stress the GPU with the minimum amount of CPU work needed.

Because it is a GPU-stressing program, it removes as much potential bottlenecking from the CPU as possible.

It is also designed to get as much load out of your GPU as possible, because it is a stress-testing program.

The best way to know if your CPU is bottlenecking your GPU is to play the games you play, with CPU and GPU load overlaid on your screen.

In some games you won't see bottlenecking because of how they are designed (typically those on a newer graphics engine), but in others you will (typically older, single-thread-heavy games).

So yeah, your best test is to test while you play whatever you play...
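(To pair per-core CPU readings with those GPU numbers while you play, a sketch along these lines would do. It uses the third-party psutil package (pip install psutil), and the cpu_log.csv file name is just a placeholder.)

```python
# Per-core CPU logger to run alongside a game session; writes one CSV row
# per second. Requires the third-party psutil package (pip install psutil).
import time
import psutil

cores = psutil.cpu_count()
with open("cpu_log.csv", "w") as log:  # hypothetical output file
    log.write("timestamp," + ",".join(f"core{i}" for i in range(cores)) + "\n")
    while True:
        loads = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks for 1s
        log.write(f"{time.time():.0f}," + ",".join(map(str, loads)) + "\n")
        log.flush()
```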
 
Once again, thanks for your replies everyone.

So yeah, your best test is to test while you play whatever you play...

Sure, sounds like a good next step. I hope you take this in the way it's intended, in a spirit of open-minded scientific investigation. If I can make my GPUs perform even better by upgrading my CPU, then that's a good thing, right?

Right so, time for some !!Science!! as we used to say in the Dwarf Fortress community. Firstly, I've plugged in an old 1024×768 monitor from the scrap yard to run my system monitors on whilst playing fullscreen 4K on the main monitor. I've switched to Windows Basic mode and removed my wallpaper for the purposes of testing. I've also activated the SLI and PhysX indicators.

First test: State of Decay, ultra settings, 4K, with a 2K textures mod, the SweetFX lighting mod and a gameplay overhaul mod that shouldn't have much effect. I drove around the trailer park honking my horn, then got out and started hitting zombies with a crowbar (just to make sure the CPU had something to do).

[screenshot: State of Decay at 4K with monitors visible]

Result: CPU 52%, GPU1 29%, GPU2 20%. I'd say it was probably too light a load to really push things; neither the GPUs nor the CPU were fully utilised. Note that all 6 CPU cores were in use.

More testing later, Rome 2: Total War up next?
 
Play around with your CPU overclock to see how that impacts performance: try 3.5, 4.0 and 4.5GHz, and run in-game benchmarks as well as normal gameplay.

Then do the same with GPU clocks.

At lower resolutions an FX-6300 will bottleneck SLI 780s, but at 4K it's not so clear-cut.
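(One hedged way to read the results of that clock sweep once you have them; the FPS figures below are placeholders, not measurements. If FPS gains track clock gains, the CPU is the limiter; if FPS stays flat, it isn't.)

```python
# Placeholder numbers, not real results: compare FPS gain against clock gain
# for each step of the overclock sweep. FPS scaling with clock = CPU-limited;
# flat FPS = the CPU is not the limiter at that resolution/settings.
runs = [  # (cpu_clock_ghz, average_fps) from your own benchmark passes
    (3.5, 52.0),
    (4.0, 58.5),
    (4.5, 64.0),
]
base_ghz, base_fps = runs[0]
for ghz, fps in runs[1:]:
    clock_gain = ghz / base_ghz - 1
    fps_gain = fps / base_fps - 1
    print(f"{ghz:.1f} GHz: clock +{clock_gain:.0%}, FPS +{fps_gain:.0%}")
```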
 
run in-game benchmarks as well as normal gameplay

Good point; to be really scientific I'd want to run benchmarks. I'll be glad to run that series of tests after a few more games, though. I think Rome 2 has a benchmark mode...
 
Quick update: dropped the desktop to 1080p, added the Shadowplay FPS counter, and ran State of Decay at 1080p.

[screenshot: State of Decay at 1080p with FPS counter]

CPU 47%, GPU1 47%, GPU2 42%. Odd result, eh? FPS was around 30, but I think Shadowplay is halving the reported figure. 4K shows the same FPS, 30-ish, but I think it's actually 60-ish.
 
That is one odd setup.
At 1080p, I'd say you'd have a huge bottleneck.

You're at 4K though.

Play a game at 4K, just using the MSI Afterburner OSD to show GPU usage.
Leave Shadowplay off; literally just play the game with MSI Afterburner showing GPU usage on the OSD.

But 50% on both GPUs at 1080p doesn't sound far-fetched in a game like State of Decay.
Ignore the CPU usage.
 
Martini said:
That is one odd setup.
At 1080p, I'd say you'd have a huge bottleneck.

Odd how? :) Everybody keeps saying I should have a bottleneck, but I can't see any evidence of it, as you can see from my 1080p results. The FX-6300 benchmarks about the same as an i5 4440 even without the 20% overclock; is it really that poor?

Martini said:
Play a game at 4K, just using the MSI Afterburner OSD to show GPU usage.
Leave Shadowplay off; literally just play the game with MSI Afterburner showing GPU usage on the OSD.

That sounds like a good idea; I think Shadowplay and the 2nd monitor are just confusing things. I'll have a play with MSI Afterburner tonight.

Martini said:
Ignore the CPU usage

Now I'm confused... the CPU usage isn't related to CPU bottlenecking? Or are you saying just not to monitor it for now? I can monitor the CPU fine on my phone with Roccat Power Grid, and since that doesn't go through the VGA it shouldn't affect the test results at all.

This article is a bit old, but it seems to suggest that AMD CPUs are actually better at SLI than Intel's: http://www.tomshardware.co.uk/crossfire-sli-scaling-bottleneck,review-32668-3.html

**edit** that's not a very good summary of the article on my part; it would be more correct to say that AMD CPUs do better at SLI than one might expect!
 
As in, overall CPU usage isn't useful for monitoring bottlenecking.

And an FX-6300 is a pretty low/mid-range CPU; pairing it with two 780s is the definition of top-heavy. That's why it's odd.

I can't comment on the results until you do some proper testing (and an i5 4440 with two 780s would be just as silly).

At 1080p you *would* have a bottleneck, it's that simple. Being at 4K, however, would probably change that: it's four times the pixel count (3840×2160 = 8,294,400 pixels vs 1920×1080 = 2,073,600). Whether that's enough to eliminate a CPU bottleneck is very much up in the air.
 
Cool, thanks for the feedback. I'll definitely post some more results tonight; I think Metro 2033 Redux will be a better test than State of Decay.
 
Sure thing, I'll get the methodology right and then run a series of tests on different games. I've got a few!
 
'Right so, time for some !!Science!!'
Hurrah, graph time! We want to see CPU speed against FPS at 1080p and 4K maxed, for a selection of games, both single- and multi-threaded. I think the ARMA series/DayZ SA are single, Skyrim and Tomb Raider are multi.
 
'Right so, time for some !!Science!!'
Hurrah, graph time! We want to see CPU speed against FPS at 1080p and 4K maxed, for a selection of games, both single- and multi-threaded. I think the ARMA series/DayZ SA are single, Skyrim and Tomb Raider are multi.

They're not "single-threaded"; they're perhaps single-thread heavy, but not single-threaded.

Skyrim isn't exactly a great example of multi-threading either.

It doesn't have to be specific titles.
Just 10 random games from 2013/2014 or something (not indie).
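(For the graphs themselves, a minimal matplotlib sketch (pip install matplotlib) along these lines would do; the game names and numbers here are placeholders, not real runs.)

```python
# Sketch of the requested graph: average FPS against CPU clock, one line per
# game. All data below is made up to show the plot shape; substitute real runs.
import matplotlib.pyplot as plt

clocks_ghz = [3.5, 4.0, 4.5]
results = {  # game -> average FPS at each clock (placeholder data)
    "Thread-heavy title": [40, 46, 52],
    "GPU-bound title": [58, 59, 59],
}
for game, fps in results.items():
    plt.plot(clocks_ghz, fps, marker="o", label=game)
plt.xlabel("CPU clock (GHz)")
plt.ylabel("Average FPS")
plt.title("CPU scaling, 4K max settings")
plt.legend()
plt.show()
```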
 
Now I'm confused... the CPU usage isn't related to CPU bottlenecking?

CPU % only matters if the cores that are running the game/feeding the GPUs are at 100% load while the GPU is at less than 98/99/100% load.

This is what would typically indicate a CPU bottlenecking a GPU.

While you are doing this proper testing, you are also getting to the nitty-gritty of tweaking your game settings.

With the load readouts you can raise the graphics settings until you are getting the most load out of your GPUs, until either your CPU begins to bottleneck your GPUs or your GPUs are running at max load (the former is a CPU bottleneck, the latter a GPU bottleneck).

If all settings are maxed and you are running at your highest resolution with CPU and GPU load still below 100%, then you have hit the game itself being the bottleneck.

But you have to do these things with games that can actually be pushed. Many games will never push a GPU (or CPU) to 100%, even on absolute max settings.
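(That decision rule can be written down as a tiny sketch; the 97%/98% thresholds are my own assumption for "close enough to maxed", not hard numbers.)

```python
# Sketch of the rule above: game cores pinned + GPU below ~98% = CPU bottleneck;
# GPU pinned = GPU bottleneck; neither pinned = the game itself is the limit.
def classify(core_loads, gpu_loads, cpu_max=97.0, gpu_max=98.0):
    cpu_pinned = any(load >= cpu_max for load in core_loads)
    gpu_pinned = any(load >= gpu_max for load in gpu_loads)
    if gpu_pinned:
        return "GPU bottleneck (GPU at max load)"
    if cpu_pinned:
        return "CPU bottleneck (core(s) pinned, GPU not maxed)"
    return "Neither maxed: the game itself is the bottleneck"

print(classify([100, 99, 98, 40, 20, 15], [74, 60]))  # -> CPU bottleneck
```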
 
The opposite side of the "bottleneck" coin is scaling, i.e. how performance varies with CPU or GPU speed. You could try increasing/decreasing CPU/GPU clock speeds (to keep it simple for now) by 10%, 20%, 50%... and see how performance changes.

For example, if you think your CPU is not the bottleneck, try reducing your 6300's multiplier and measuring performance. If performance starts going down then... ;)

Also be cautious about relying just on average FPS. Quite often a weak CPU results in a wider range of frame times, particularly in the long-frame-time (low FPS) tail. In other words, your min FPS suffers. At the very least, recording min FPS would be useful.
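(As a sketch of what recording beyond the average buys you, here's one way to compute average FPS and the 1% low from a list of frame times in milliseconds; the numbers are illustrative, and frame-time logs can be exported from tools like FRAPS.)

```python
# Illustrative frame-time analysis: average FPS vs the 1% low. A weak CPU
# tends to fatten the slow-frame tail, dragging the 1% low down even when
# the average looks fine. Frame times below are made-up milliseconds.
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 45.0, 16.6, 17.0, 16.7, 33.0, 16.8]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
n_worst = max(1, len(frame_times_ms) // 100)   # slowest 1% of frames
worst = sorted(frame_times_ms)[-n_worst:]
one_pct_low_fps = 1000 / (sum(worst) / len(worst))

print(f"average FPS: {avg_fps:.1f}")
print(f"1% low FPS:  {one_pct_low_fps:.1f}")
```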

A 6300 will DEFINITELY bottleneck two 780s.

Let's have no more posts like this please; facts only.
 
Minimum FPS is what you should be concerned about, and as long as you're comfortably above 30fps most games feel smooth, with 60fps being desirable as a minimum. I say that because, regardless of 120Hz+ monitors, some games cap at 60fps or just act up above it.

I was comfortable with an overclocked E8500 and a 7950 for some time, mainly playing World of Tanks, Crysis 3 and Battlefield titles. I just didn't max out every single setting, and I switched from 120fps to 60fps, which resulted in smoother games with a higher minimum FPS.

Not sure how much of a bottleneck an overclocked 6300 would introduce?
 