CPU Bottleneck (Associate · Joined 18 Apr 2012 · Posts: 77 · Location: Isle of Wight)
Hey y'all, I've been pondering novel ways of keeping my GPUs quiet and balancing their temps on another thread, but one issue has come up a couple of times: people are suggesting my CPU is bottlenecking my GPUs. My full specs are below, but the short version is I have an AMD FX-6300 OCed to 4.2GHz and two GTX 780 GHz Editions.

Full system specs:

ASRock 990FX Extreme3
AMD FX-6300 six-core processor OCed to 4.2GHz
8GB dual-channel DDR3 1866MHz
Corsair H55 cooler
128GB Samsung 840 Pro
2x Seagate 1TB solid state hybrid drives (SSHD) in RAID 0
Windows 7 Home Premium 64-bit
2x Gigabyte GTX 780 GHz Edition GPUs in SLI (same BIOS, revision etc. despite differences in decal and heat pipes)
Super Flower Leadex 750W Gold
Inateck PCI-E to USB 3.0 2-port PCI Express card (for the USB 3 front panel, as the mobo didn't have a USB 3 header)
Asus PB287Q 28" 4K 60Hz 1ms monitor

So the thing is, one chap suggested that my CPU was probably bottlenecking my GPUs, so I did a bit of research. The classic symptom of a CPU bottleneck is high CPU usage paired with low GPU utilisation. My CPU doesn't seem to have high usage, but the chap said that was probably because the games weren't using all the cores: if a game maxes out four of my six cores, what looks like ~66% overall usage is actually a 100% bottleneck on the cores that matter. So I posted a screenshot from Roccat Power Grid on my phone showing that while playing Metro 2033 Redux at 4K I was getting 50% usage spread across all six cores:

[Screenshot: Roccat Power Grid showing ~50% load spread across all six cores]
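
For anyone who wants to check the per-core picture without an overlay, a minimal sketch of the idea, assuming Python with the psutil package installed (not a tool anyone in this thread actually used), would be:

```python
# Logs per-core CPU load so that a "66% overall" reading can be told apart
# from four of six cores being pinned at 100%.
import psutil  # third-party package: pip install psutil

for _ in range(30):  # sample for ~30 seconds while the game runs
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    overall = sum(per_core) / len(per_core)
    pinned = sum(1 for load in per_core if load > 95)
    print(f"overall {overall:5.1f}% | cores pinned: {pinned} | {per_core}")
```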

In response to which Threepwood later said this:

You are looking at the wrong usage to know if your CPU is a bottleneck.

Your CPU load % does not matter; what matters is your GPU load %.

If, for example, your 780s were running at 100% load each, then your CPU is not bottlenecking your GPUs.

It would not matter a jot if your CPU load was 10% or 100%.

You have already had a correct answer: sell your board and CPU, and put that plus your £150 towards a better CPU, one that will not bottleneck your 780s.

You should monitor your GPU load.
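
GPU load can also be logged outside an overlay. A minimal sketch, assuming an NVIDIA driver install with the bundled nvidia-smi tool on the PATH (an Afterburner or Power Grid overlay shows the same numbers in-game):

```python
# Polls each card's load once per second via nvidia-smi.
import subprocess
import time

for _ in range(30):
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=index,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    print(out.strip().replace("\n", " | "))  # e.g. "0, 98 | 1, 74"
    time.sleep(1)
```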

So I've just run the Valley benchmark at 4K with some other system monitors, and it shows high GPU usage, 98% and 74% in this shot, and quite low CPU utilisation, spread over four cores for the most part.

[Screenshot: Valley benchmark at 4K with CPU and GPU monitors visible]

That screen cap has been halved from my native resolution, but you can still just about read it. Ultra settings, no AA and no v-sync: almost 60fps.

So my question is: is my CPU bottlenecking me? What other tests should I run? Don't get me wrong, I'm no AMD fanboy; I just figured the OCed 6300 would do the job two years ago when I bought it. I worry that behind the scenes something to do with PCI-E lanes is slowing me down. I'd like to go Intel, but I really don't want to upgrade until I can afford something DDR4.

Any advice or suggestions please? I'm happy to run more tests and post results. :)
 
Once again, thanks for your replies everyone.

So yeah, your best test is to test while you play whatever you play...

Sure, sounds like a good next step. I hope you take this in the way it's intended, in a spirit of open-minded scientific investigation. If I can make my GPUs even better by upgrading my CPU, then that's a good thing, right?

Right, so, time for some !!Science!! as we used to say in the Dwarf Fortress community. Firstly, I've plugged in an old 1024×768 monitor from the scrap yard to run my system monitors on whilst playing fullscreen 4K on the main monitor. I've switched to Windows Basic mode and removed my wallpaper for the purposes of the testing. I've also activated the SLI and PhysX indicators.

First test: State of Decay, ultra settings, 4K, with a 2K textures mod, SweetFX lighting mod and a gameplay overhaul mod that shouldn't have much effect. I drove around the trailer park honking my horn, then got out and started hitting zombies with a crowbar (just to make sure the CPU had something to do).

[Screenshot: State of Decay at 4K with monitoring overlays]

Result: CPU 52%, GPU 29%, GPU 20%. I'd say it was probably too easy a test to really push things; neither the GPUs nor the CPU were fully utilised. Note all six CPU cores were in use.

More testing later, Rome 2: Total War up next?
 
run in-game benchmarks as well as normal gameplay

Good point: to be really scientific I'd want to run benchmarks. I'll be glad to run that series of tests after a few more games, though. I think Rome 2 has a benchmark mode...
 
Quick update: dropped the desktop to 1080p, added the Shadowplay FPS counter, ran State of Decay at 1080p.

[Screenshot: State of Decay at 1080p with overlays and Shadowplay FPS counter]

CPU 47%, GPU 47%, GPU 42% - odd result, eh? The FPS read around 30, but I think Shadowplay is halving it. 4K shows the same FPS, 30ish, but I think it's actually 60ish.
 
Martini said:
That is one odd setup.
At 1080p, I'd say you'd have a huge bottleneck.

Odd how? :) Everybody keeps saying I should have a bottleneck, but I can't see any evidence of it, as you can see in my 1080p results. The FX-6300 benchmarks about the same as an i5 4440 even before the 20% overclock; is it really that poor?

Martini said:
Play a game at 4K, just using MSI Afterburner OSD on GPU usage.
Leave shadowplay off, literally just play the game with MSI Afterburner with GPU usage on the OSD.

That sounds like a good idea; I think the Shadowplay and the second monitor are just confusing things. I'll have a play with MSI Afterburner tonight.

Martini said:
Ignore the CPU usage

Now I'm confused... the CPU usage isn't related to CPU bottlenecking? Or are you saying just don't monitor it for now? I can monitor the CPU fine on my phone with Roccat Power Grid, and it doesn't go through the VGA, so it shouldn't affect the test results at all.

This article is a bit old, but it seems to suggest that AMD CPUs are actually better at SLI than Intel's are: http://www.tomshardware.co.uk/crossfire-sli-scaling-bottleneck,review-32668-3.html

**edit** that's not a very good summary of the article I've made; maybe it'd be more correct to say that AMD CPUs do better at SLI than one might expect!
 
Cool, thanks for the feedback, will definitely post some more results tonight. I think Metro 2033 Redux will be a better test than State of Decay.
 
Sure thing, I'll get the methodology right then run a series of tests on different games. I've got a few!
 
I'm about to run a batch of four games at 4K and 1080p and post GPU load, FPS and CPU load. I won't spam you with a bunch of screenshots; I'll watch each run for a while and take something average/representative. You'll just have to trust me, I guess, but I'm happy to back up anything I post with screenshots.

I'll just say though, in response to comments, I actually bought this FX-6300 because I was getting bottlenecks on my old Q6600 and HD 5870. Dwarf Fortress, a very poorly optimised retro sandbox game, only uses one core and ASCII graphics, so obviously it was CPU-bound. Shogun 2: Total War really bugged me because it maxed out one core before spilling onto other cores, which caused stutters and such. So I went for the FX-6300 because at the time it offered the best single-core/multi-core performance in the price range. My wife has an i7-3610QM @ 2.3GHz in her laptop which I'm confident would actually be worse than my CPU for most tasks, especially those two games!

Bottlenecks in general can be caused by a lot of factors, not just the GPU and CPU. Your I/O can slow you down, though that's not such an issue now we have SSDs. You can have 12GB of VRAM, but if you load your Skyrim mod list with more than ~3GB of textures it will crash, because it's a 32-bit application. Poorly written scripts can stall things too: your whole PC could be sat there waiting for some weak programming to resolve itself, and no amount of hardware tweaking will fix that. Old Pentiums are said to run Dwarf Fortress better than a modern computer because of their very low-latency RAM; your system may look busy, but it's mostly just a lot of pointless wheel-spinning. And to be honest, I thought that's what one of you was going to tell me: that my chip architecture on the 990FX couldn't keep up, not enough lanes, something like that.
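
On the 32-bit point, the ceiling is just pointer arithmetic; a quick illustrative calculation (the exact usable limit is lower and varies with the OS and the large-address-aware flag):

```python
# A 32-bit pointer caps the process address space no matter how much VRAM
# or system RAM is fitted; the OS reserves part of even this.
addressable_bytes = 2 ** 32
print(addressable_bytes / 2 ** 30, "GiB")  # 4.0 GiB hard ceiling
```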

Anyway, on with the benchmarking...
 
Martini1991 said:
I can't imagine any price point at which an FX6300 offers the best single-core performance :p

At the time I thought the higher the GHz, the better, and it was only £80.

Threepwood said:
Indeed, but the topic is "CPU bottlenecking my GPU?"

And my point is that something apart from the CPU might be creating the bottleneck, which would explain why neither of them seems to be maxing out. Like, if I have v-sync on, then it gets to 60 FPS and thinks 'job done'.

Regarding Arma and MMOs, you are almost certainly right; Battlefield 4 is another notorious CPU hog. I don't actually play those games, so maybe that's why it's never bothered me before.

4K testing done, about to start 1080p...
 
I already have the X99 and 5820K in my shopping basket!

I'm trying to understand why it wasn't obvious to me that there is a bottleneck; I was expecting to see 100% CPU if so. At the moment I'm looking at the Metro 2033 Redux data. Basically, 4K is 30 FPS and 1080p is 60 FPS, and the 1080p run may be hitting the v-sync cap. But the GPUs aren't maxed out at 4K, they're only at about 60%, so yeah, it suggests that something is holding them back; otherwise they would be giving 100% trying to get me to 60 FPS, surely?

My system is the result of a series of upgrades. I bought the FX-6300 when I had less disposable income, and I got the 780 just before the 780 Ti came out, as I got into the Skyrim RealVision ENB mod. The second one I picked up recently as a 'like new' B-grade for £230 after I learnt they weren't making the 7 series any more. So that's how I got where I am.

You know, the thing is, Max Payne 3, Borderlands 2 and Deus Ex: HRDC are all running at 60fps at 4K with maxed-out settings, except AA off and v-sync on. So maybe the CPU isn't totally crippling me so much as occasionally holding me back? Especially at 4K.

One thing though: in Borderlands 2, if I put PhysX on medium or high it goes to the CPU and CTDs. If I set one GPU to dedicated PhysX then it works fine. Another nail in the coffin of the FX series?

I'm actually glad, because I told myself I wasn't allowed to upgrade my CPU until it actually failed to do something I wanted it to do. And now maybe it has... ;)
 
OK, I think I get it now. Max Payne 3, 1080p: GPU 10%, CPU 50%, FPS 60. Take v-sync off and the FPS goes to 105 or so, but the GPU only goes up to 30%.

So why doesn't the GPU go up to 100% now the FPS is uncapped? Something must be holding it back. That's why people were telling me to try it at different CPU OCs: they knew that the CPU utilisation % didn't tell the whole story. That's where I went wrong; I was stuck in the past, where a CPU bottleneck meant 100% on the CPU.
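
Put as a rule of thumb, it might look like the sketch below; the 95% threshold and the function itself are illustrative assumptions, not numbers from any monitoring tool:

```python
# Rule of thumb this thread converges on: with caps off, a GPU well below
# full load is waiting on something upstream (usually the CPU).
def likely_limiter(gpu_load_pct, fps, fps_cap=None):
    if fps_cap is not None and fps >= fps_cap - 1:
        return "frame cap / v-sync (retest with it off)"
    if gpu_load_pct >= 95:
        return "GPU-bound: the CPU is keeping up"
    return "CPU/engine-bound: the GPU is waiting for work"

print(likely_limiter(gpu_load_pct=30, fps=105))  # the uncapped Max Payne 3 case
```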

However, in my defence, the kind of single-player FPS, RPG, RTS and turn-based games I play, especially at 4K with the FPS capped at 60, meant that the bottleneck wasn't particularly apparent and possibly didn't actually hold me back at all. Although there's no point in having GPUs that your CPU can't keep up with, I get that; I could have settled for something better matched and saved money.

Does that sound about right? Are we all in agreement now or is more science required?
 
Yes, well, thanks for being patient with me Threepwood (and others); if I had spent more time on the forum BEFORE buying things it might have been more logical! But I got the second 780 for a good price, and I can sell the AM3 set-up for 50% of what I paid for it (which to be fair wasn't much) and get a better CPU/motherboard to unlock the full potential of my MIGHTY 780s! Then I can actually play my Skyrim ENB mod at 4K at 60fps, have high PhysX in Borderlands 2 and get 60fps 4K Metro 2033.

I really do want a DDR4 quad channel motherboard but that's another thread :)
 
I can completely understand the urge to upgrade, and frankly, if I could come up with an excuse to get an X99 system I would do. However, I would say that even if you are currently bottlenecked, it may not necessarily be by that much, and the gains from upgrading to X99 won't be proportional to the cost. I'm not saying you shouldn't do it, but just don't expect to suddenly get a 50% boost in frame rates from it.

Actually, I think a few people may have facepalmed when I mentioned the X99, as I now realise that may just be crazily lurching from one imbalance to another!

I've been researching and budgeting and marketing; I've lined up a buyer for my old motherboard and CPU. He has a 750 Ti and an old Q6600 quad-core, so I think he's going to be happy with his upgrade for £125. Then I think I'm going for a Z97 + 4790K like your own, Nails. I'm not quite decided on the RAM yet, but I'm going to wait until everything is in my basket and I'm ready to buy before posting to ask for a sanity check.

A lot of what I do runs perfectly fine at 60fps with v-sync on at 4K. I'm not one of these people who likes to see the FPS as high as possible; it really doesn't bother me. But now I've noticed it, it does bug me that my GPUs can't even get above 45% in certain situations. AND what really bugs me is that I can't play Borderlands 2 with PhysX any higher than low without it crashing!

Nah, I think a 4790K will be a better fit, and then I can just let everything be for three years or so and get on and enjoy my gaming.
 
The CPU couldn't handle busy scenes. I'd imagine that's how your 6300 is feeling...

...That's the benchmark running at whatever the standard settings are at 1080p. That is what your graphics cards want to do! :D

Funnily enough, when the FX-6300 is fine, everything is super smooth; on things like Metro Redux it just feels a little underwater, chugging along at ~30fps, like it's OK but just not so 'light'.

When I ran the Valley benchmark I was getting 40fps and my GPUs were sat at 30°C chilling. Since the last NVIDIA driver update my cards seem to be doing less work than ever. They're achieving the exact same result, so I'm not complaining.

An 8350/8370 might be OK, but I'd hate to spend that dosh and find out it doesn't cut it. I like the 4790K because I like having the 4GHz cores; I'm still a little old school/funny about that, and not going to 'upgrade' to something with a slower clock speed (I know, Intel/AMD, apples/pears). But with Hitachi finance at £18 a month, who can't afford NOT to have a 4790K? :)

*edit* also, the 2011-socket CPUs and the FX 8000 series would all push the wattage over 750W, so I might end up with a £100+ bill for a new PSU too.
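
A rough, hedged power budget using public TDP figures (board-level draw can exceed TDP under load, and the "rest of system" figure is a guess):

```python
gtx_780_tdp = 250      # W each, NVIDIA reference figure
fx_8350_tdp = 125      # W (an i7-5820K is rated around 140 W)
rest_of_system = 75    # W estimate: board, drives, fans, pump, USB card
total = 2 * gtx_780_tdp + fx_8350_tdp + rest_of_system
print(f"~{total} W peak vs a 750 W unit")  # ~700 W: uncomfortably close
```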
 