
CPU Bottleneck

OP, a good game to test for a CPU bottleneck is Splinter Cell: Blacklist (if you already own it).

My 6300 at 4.3GHz was bottlenecked in this game with a 280X. There were numerous areas and parts of levels where the framerate would drop well below 60fps despite GPU usage being nowhere near 100%.

The framerate itself isn't the issue; it's the slowdown and minor stutter that is more noticeable and annoying, and imo breaks immersion and lowers the overall experience.

Having used the same CPU with a high-end single GPU, I know from experience that the CPU is a bottleneck in many games. I have absolutely no doubt your CPU is holding back your 780s in SLI.

I feel like with your current setup you're not really going to get the best out of your GPUs.


Also try any MMO or Arma 3; you will instantly notice the bottleneck in them.

Edit: Regarding the benchmarks compared to the i5, these sorts of benchmarks are misleading, as they take no account of frametimes and minimums. The bench shows them as equal CPUs in gaming, but if you played the same games on the i5 and then on the FX, you would notice the i5 is far smoother and more consistent.
 
As above, in Arma 3 your CPU will bottleneck your GPUs. In Arma I treat it as a balancing act: increase graphical quality up to the point that GPU usage is a pretty constant 80-90%. If it is consistently lower, then the CPU is bottlenecking (regardless of the CPU usage figure).

In the case of Arma 3 it will load core 1 to around 95% and cores 2-4 to around 30-60%. The CPU technically still has overhead available, but cannot release that overhead to the GPU because of how the game engine is coded; the CPU is still bottlenecking the GPU.
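The rule of thumb described above can be sketched in a few lines of Python. This is purely illustrative: the 90% threshold and the sample readings are made-up numbers, not measurements from the thread.

```python
# Illustrative sketch of the GPU-usage rule of thumb: with FPS uncapped,
# average GPU load well under ~90% suggests the CPU (or the game engine)
# is the limiter. Threshold and samples are invented for illustration.

def likely_cpu_bound(gpu_usage_samples, threshold=90.0):
    """True if average GPU load sits below the threshold."""
    avg = sum(gpu_usage_samples) / len(gpu_usage_samples)
    return avg < threshold

busy_scene = [62, 58, 65, 60, 55, 63]   # per-second GPU % readings
print(likely_cpu_bound(busy_scene))     # True: the cards are waiting
```

In practice you'd feed this from whatever monitoring overlay you log GPU usage with, sampled during a busy scene rather than an empty one.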
 
I'm about to run a batch of four games in 4K and 1080p and post GPU load, FPS and CPU load. I won't spam you with a bunch of screenshots; I will watch each run for a while and take something average/representative. You'll just have to trust me, I guess, but I'm happy to back up anything I post with screenshots.

I'll just say, though, in response to comments: I actually bought this FX6300 because I was getting bottlenecks on my old Q6600 and HD 5870. Dwarf Fortress, a very poorly optimised retro sandbox game, only uses one core and ASCII graphics, so obviously it was CPU bound. Shogun 2: Total War really bugged me because it maxed out one core before spilling onto other cores, which would cause stutters and such. So I went for the FX6300 because at the time it offered the best single core/multi core performance in its price range. My wife has an i7-3610M @ 2.3GHz in her laptop, which I'm confident would actually be worse than my CPU for most tasks, especially those two games!

Bottlenecks in general can be caused by a lot of factors, not just the GPU and CPU. Your I/O can slow you down, though that's not such an issue now we have SSDs. You can have 12GB of VRAM, but if you load your Skyrim mod with more than 3GB of textures it will crash because it's a 32-bit application. Poorly written scripts: your whole PC could be sat there waiting for some weak programming to resolve itself, and no amount of hardware tweaking will fix that. Old Pentiums are said to run Dwarf Fortress better than a modern computer because of the very low latency RAM; your system may look busy, but it's mostly just a lot of pointless wheel spinning. And to be honest, I thought that's what one of you was going to tell me: that my chip architecture on the 990FX couldn't keep up, not enough lanes, something like that.

Anyway, on with the benchmarking...
 
Bottlenecks in general can be caused by a lot of factors

Indeed, but the topic is "CPU bottlenecking my GPU?"


Maybe you don't do anything that will have your GPU bottlenecked by your CPU. That's great; there will be applications that will not bottleneck.

However, what people are saying is that in some instances SLI 780s will be bottlenecked by a mildly overclocked 6300. People (myself included) have seen the 6300 bottleneck a single GPU, let alone two.

Like I said, go load up some Arma or almost any MMO, and head to a busy multiplayer area or somewhere with a lot of action going on.



Another simple way to test whether your CPU is bottlenecking your GPU: turn the settings down, and if there is no frame rate increase, the CPU is bogged down.
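That "turn the settings down" test can be expressed as a tiny Python sketch. The function name, tolerance and FPS figures are all invented for illustration; the logic is just the heuristic described above.

```python
# Sketch of the settings-sweep test: if lowering graphics settings
# barely changes the frame rate, the GPU wasn't the limiter, so the
# CPU is bogged down. All numbers are illustrative.

def bottleneck_guess(fps_high_settings, fps_low_settings, tolerance=0.05):
    """Classify the limiter from a before/after settings sweep."""
    gain = (fps_low_settings - fps_high_settings) / fps_high_settings
    return "CPU-bound" if gain < tolerance else "GPU-bound"

print(bottleneck_guess(58, 60))  # almost no gain -> "CPU-bound"
print(bottleneck_guess(45, 90))  # frame rate doubled -> "GPU-bound"
```

Run it with v-sync off, otherwise the cap masks any gain.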
 
Martini1991 said:
I can't imagine any price point an FX6300 offers best single core performance :p

At the time I thought the higher the GHz the better, and it was only £80.

Threepwood said:
Indeed, but the topic is "CPU bottlenecking my GPU?"

And my point is that something apart from the CPU might be creating the bottleneck, which explains why neither of them seems to be maxing out. Like, if I have v-sync on, then it gets to 60 FPS and thinks 'job done'.

Regarding Arma and MMOs, you are almost certainly right; Battlefield 4 is another notorious CPU hog. I don't actually play those games, so maybe that's why it's never bothered me before.

4k testing done, about to start 1080...
 
No offence, but you kind of sound like you're trying to convince yourself your CPU isn't an odd choice for that GPU setup.
A CPU doesn't have to max out to be a bottleneck; not even individual cores have to max out.

BF3 used to show this with i5s and dual GPUs: you'd be bottlenecked but still have "oomph" left in the i5, yet an i7 (for the extra threads) was required to get the most out of the dual GPUs.
 
I already have the X99 and 5820K in my shopping basket!

I'm trying to understand why it wasn't obvious to me that there is a bottleneck; I was expecting to see 100% CPU if so. At the moment I'm looking at the Metro 2033 Redux data. Basically 4K is 30 FPS and 1080p is 60fps, and the 1080p run may be hitting the v-sync cap. But the GPUs aren't maxed out in 4K; they're only at about 60%. So yeah, it suggests that something is holding them back, or surely they would be at 100% trying to give me 60 FPS?

My system is the result of a series of upgrades. I bought the FX6300 when I had less disposable income, and I got the 780 just before the 780ti came out, as I got into the Skyrim RealVision ENB mod. The second one I picked up recently as a 'like new' B-grade for £230 after I learnt they weren't making the 7 series any more. So that's how I got where I am.

You know, the thing is, Max Payne 3, Borderlands 2 and Deus Ex: HRDC are all running at 60fps in 4K with maxed-out settings (except AA off and v-sync on). So maybe the CPU isn't totally crippling me so much as occasionally holding me back, especially in 4K.

One thing, though: in Borderlands 2, if I put PhysX on medium or high it goes to the CPU and CTDs. If I set one GPU to dedicated PhysX then it works fine. Another nail in the coffin of the FX series?

I'm actually glad, because I told myself I wasn't allowed to upgrade my CPU until it actually failed to do something I wanted it to do. And now maybe it has... ;)
 
OK, I think I get it now. Max Payne 3, 1080p: GPU 10%, CPU 50%, FPS 60. Take v-sync off and the FPS goes to 105 or so, but the GPU only goes up to 30%.

So why doesn't the GPU go up to 100% now the FPS is uncapped? Something must be holding it back. That's why people were telling me to try it at different CPU overclocks: they knew that the CPU utilisation % didn't tell the whole story. That's where I went wrong; I was stuck in the past, where a CPU bottleneck meant 100% on the CPU.

However, in my defence, with the kind of single-player FPS, RPG, RTS and turn-based games I play, especially in 4K with the FPS capped at 60, the bottleneck wasn't particularly apparent and possibly didn't actually hold me back at all. Although there is no point in having GPUs that your CPU can't keep up with, I get that; I could have settled for something better matched and saved money.

Does that sound about right? Are we all in agreement now or is more science required?
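The "CPU 50% can still be a bottleneck" realisation comes down to averaging: the overall utilisation figure is the mean across cores, so it hides one saturated thread. A toy Python sketch with invented per-core load figures for a six-core chip:

```python
# Invented per-core load figures for a six-core FX, to show why an
# overall "CPU 50%" reading can hide one pegged core: the headline
# number is the mean across all cores.
cores = [98, 45, 40, 35, 30, 28]
overall = sum(cores) / len(cores)
print(f"overall: {overall:.0f}%, busiest core: {max(cores)}%")
```

A game whose main thread lives on that busiest core is CPU-limited even though the overall graph looks half idle.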
 
Why are you testing with v-sync on and SLI?

Lag and stuttering are worse with v-sync and SLI, and you are limiting your tests' FPS. You're not allowing the cards to stretch their legs.



Edit: posted before you posted... In answer to your question (in the Max Payne 3 test), it is because you are now game bound. To see higher GPU utilisation you would need to disable one card; you would see similar CPU usage, but the single card's load would go up.


Double edit:

However, in my defence, with the kind of single-player FPS, RPG, RTS and turn-based games I play, especially in 4K with the FPS capped at 60, the bottleneck wasn't particularly apparent and possibly didn't actually hold me back at all. Although there is no point in having GPUs that your CPU can't keep up with, I get that; I could have settled for something better matched and saved money.

Welcome to the world of min/maxing hardware for certain tasks, or all tasks.

You have to do things and experience them to get a feel for how to make future choices; it all depends on what you are going to do with the hardware.

Anyway, you're off to a good start :) Now you just have to decide whether you are happy with what you have, but don't make the mistake of upgrading if you don't need to. You could sell a 780 and carry on as you are if the games you play are maxed out enough for you.
 
Yes, well, thanks for being patient with me, Threepwood (and others). If I had spent more time on the forum BEFORE buying things, it might have been more logical! But I got the 2nd 780 for a good price, and I can sell the AM3 setup for 50% of what I paid for it (which to be fair wasn't much) and get a better CPU/motherboard to unlock the full potential of my MIGHTY 780s! Then I can actually play my Skyrim ENB mod in 4K at 60fps, have high PhysX in Borderlands 2, and get 60fps 4K Metro 2033.

I really do want a DDR4 quad channel motherboard but that's another thread :)
 
Play around with your CPU overclock to see how it impacts performance: try 3.5, 4.0 and 4.5GHz, and run in-game benchmarks as well as normal gameplay.

Then do the same with GPU clocks.


This. I was getting around 2100 in the Valley benchmark with my new R9 290 and an i7 920 at stock. Overclocking it to 3.6GHz jumped the score to over 2700, which convinced me I was CPU bottlenecked, which in turn made me decide to upgrade to an i5 4690K. After that, using the same GPU, the benchmark jumped again to 3100+.
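As a rough sanity check on the scores quoted above (2100 stock, 2700 at 3.6GHz, 3100+ on the 4690K), a quick Python snippet shows the relative gains. The score values are the ones quoted in the post, not re-measured; if the score moves with the CPU, the GPU wasn't the limiter at the lower clocks.

```python
# Valley scores quoted above, same R9 290 throughout.
scores = {
    "i7 920 stock": 2100,
    "i7 920 @ 3.6GHz": 2700,
    "i5 4690K": 3100,
}
base = scores["i7 920 stock"]
for cpu, score in scores.items():
    gain = (score - base) / base
    print(f"{cpu}: {score} ({gain:+.0%} vs stock)")
```

Roughly +29% from the overclock and +48% from the platform swap, on the same GPU.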
 
I can completely understand the urge to upgrade, and frankly, if I could come up with an excuse to get an X99 system I would. However, I would say that even if you are currently bottlenecked, it may not necessarily be by that much, and the gains from upgrading to X99 won't be proportional to the cost. I'm not saying you shouldn't do it; just don't expect to suddenly get a 50% boost in frame rates from it.
 
I can completely understand the urge to upgrade, and frankly, if I could come up with an excuse to get an X99 system I would. However, I would say that even if you are currently bottlenecked, it may not necessarily be by that much, and the gains from upgrading to X99 won't be proportional to the cost. I'm not saying you shouldn't do it; just don't expect to suddenly get a 50% boost in frame rates from it.

Actually, I think a few people may have facepalmed when I mentioned the X99, as I now realise that may just be crazily lurching from one imbalance to another!

I've been researching, budgeting and marketing; I've lined up a buyer for my old motherboard and CPU. He has a 750ti and an old Q6600 quad core, so I think he's going to be happy with his upgrade for £125. Then I think I'm going for a Z97 4790K like your own, Nails. I'm not quite decided on the RAM yet, but I'm going to wait until everything is in my basket and I'm ready to buy before posting to ask for a sanity check.

A lot of what I do runs perfectly fine at 60fps with v-sync on in 4K. I'm not one of those people who likes to see the FPS as high as possible; it really doesn't bother me. But now I've noticed it, it does bug me that my GPUs can't even get above 45% in certain situations. AND what really bugs me is that I can't play Borderlands 2 with PhysX any higher than low without it crashing!

Nah, I think a 4790K will be a better fit, and then I can just let everything be for three years or so and get on and enjoy my gaming.
 
Just keep the RAM you already have, m8; there's no benefit in buying newer RAM unless you require more capacity.
 
[Image: 780TI_zpspsblovox.png]


The short answer is you'd more than double your framerate on a single 780 if you upgraded to a 4770K and gave it a mild overclock. SLI would cheer you up with pretty smooth, nice fluffy graphics too!

You can see the difference the CPU has made to my 780TI GHz edition along my upgrade path. The 6300 was stock, and so was the 8350. The 8350 sits at 4.9GHz now with an H80i cooler. So that might be a cheap upgrade path for you.

Unfortunately Passmark graphical benchmarking is a bit 'simple', but you get the gist of it. Your situation reminds me of my problems with Counter-Strike years ago, when I kept my Pentium 4 2.53GHz for far too long! I had a decent graphics card and a decent framerate in empty Counter-Strike servers, but as soon as I joined a 'new' (at the time) 40-man server, my framerate would turn into a choppy mess! The CPU couldn't handle busy scenes. I'd imagine that's how your 6300 is feeling.

[Image: 780TISLI_zpsgokkm6yn.png]


That's the benchmark running at whatever the standard settings are at 1080p. That is what your graphics cards want to do! :D
 
The CPU couldn't handle busy scenes. I'd imagine that's how your 6300 is feeling...

...That's the benchmark running at whatever the standard settings are at 1080p. That is what your graphics cards want to do! :D

Funnily enough, when the FX 6300 is fine, everything is super smooth. On things like Metro Redux it just feels a little underwater, chugging along at ~30fps; it's OK, but just not so 'light'.

When I ran the Valley benchmark I was getting 40fps and my GPUs were sat at 30C, chilling. Since the last NVIDIA driver update my cards seem to be doing less work than ever. They're achieving the exact same result, so I'm not complaining.

An 8350/8370 might be OK, but I'd hate to spend that dosh and find out it doesn't cut it. I like the 4790K because I like having the 4GHz cores; I'm still a little old school/funny about that, and not going to 'upgrade' to something with a slower clock speed (I know, Intel/AMD, apples/pears). But with Hitachi finance at £18 a month, who can't afford NOT to have a 4790K? :)

*edit* Also, the socket 2011 CPUs and the FX 8000 series all push the wattage over 750W, so I might end up with a £100+ bill for a new PSU too.
 
You seem to have a good PSU, though; your specs are crying out for a new motherboard/CPU combo. It will be like night and day if you grab a 4790K. You won't regret it; your 780TIs have so much more to give.

During the benchmark I think one card hit 82 and the other the low 70s. A 4790K will give them the punt they need, but an 8350 or similar would give a nice punt for £70 if you were to sell your old CPU. At stock it gives a 1/3 increase over the 6300 in CPU benchmarks.

http://www.overclockers.co.uk/showproduct.php?prodid=BU-018-AS&groupid=2833&catid=2836

For the price of a 4790k alone you'd have a nice upgrade.
 
Yeah, the 4790K is a nice CPU, and with the way CPU technology is progressing at the moment (i.e. very little), it should see you good for a few years yet with multi-GPU setups.
 