
How to tell if there's a bottleneck

Soldato | Joined: 6 Jan 2013 | Posts: 22,170 | Location: Rollergirl
I've created this thread as I've seen a lot of questions asked regarding how to work out if there's a bottleneck going on. I'll update this OP with any info or questions/answers that may get put forward during the conversation.

This is my guide on how to test for a bottleneck. It's got a lot of images, so I've placed it in a spoiler.

Software required (click link):

- MSI Afterburner
- Heaven Benchmark
- CatZilla Benchmark

Test system:
  • 6700k (4.4/1.25v)
  • Gene VIII - 1080ti
  • Dark Pro 16GB@3600
  • Acer X34a

At time of writing, version 4.4.0 is the latest for Afterburner. During install, you want to have it install Afterburner & Rivatuner:

iIzxxKE.png

Once installed, click on the "settings" icon:

SWmhvZt.png

Now, set it up like the screenshots here:

MHzNo6h.png

In the Monitoring options, you need to check the box to the left of the attribute, and then check the "Show in On-Screen Display" box as you go through each highlighted attribute.

*note - you can highlight attributes and drag them up or down to re-order.

lNcOdsX.png

7sA3k9M.png

Next, set some hotkeys for your OSD so that you can toggle it. Choose whatever you like, but the key combinations shown are what I use.

lvpc0qR.png

If you want to capture images, then set some parameters for that (highly recommended). Obviously, you want to choose a save location that suits you.

0pvCCFv.png

Now, you want to set up Rivatuner like this:

7rRmFSk.png

That's it, you're all set to gather data. Next time you run a game and toggle the RivaTuner OSD, you will see an overlay like this:

mZBiXAE.png

That overlay tells you a few things:

  • It's a DirectX 11 application
  • You're getting 100 FPS (frames per second)
  • Time between frames is 10.0 milliseconds
  • GPU load is 1%
  • GPU temperature is 28C
  • GPU clock speed is 2050 MHz
  • Video memory used is 1578 MB
  • Video memory clock speed is 11610 MHz (double what the OSD shows)
  • CPU temperature is 31C
  • CPU load is 4%
  • RAM usage is 4198 MB
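Two of those readouts are the same measurement in different units: frametime in milliseconds is just 1000 divided by FPS. A minimal sketch of the conversion (function names are my own, not part of Afterburner):

```python
def fps_to_frametime_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

def frametime_ms_to_fps(frametime_ms: float) -> float:
    """Inverse conversion: the frame rate implied by a frametime."""
    return 1000.0 / frametime_ms

# 100 FPS corresponds to the 10.0 ms frametime shown in the overlay
print(fps_to_frametime_ms(100.0))  # 10.0
```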
Now, I'm going to simulate a GPU bottleneck. This should be easy, as it's what you are expecting to see. Make sure Vsync is switched off and set Heaven up with the following settings (resolution should be the highest you can get - for me, that's 2560*1440):

r79efYr.png

With these settings, you can clearly see that the GPU is the bottleneck: it's sitting at 99% usage while the CPU is barely breaking a sweat at 13%. This is ideal, as it means you're getting every ounce of performance from your GPU. It's important to note that the frame rate is 100+ FPS, which is very good given the higher resolution.

WSSjRp8.png

Now, let's simulate a CPU bottleneck. With a system that includes a 1080ti and a 6700k, it was actually quite difficult to do. For this test, I disabled hyperthreading and dropped the CPU clock to 2GHz, then fired up CatZilla and ran the 720p test.

EdhQ5PO.png

And here's the result. Look how awful this is: the CPU is maxed out at 99%, which chokes the GPU back to 50% usage, which in turn chokes the frame rate. This is why you never want a CPU bottleneck when gaming:

iqmIfP2.png
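The two runs above boil down to a simple rule of thumb: a GPU pinned near 100% with a relaxed CPU is the healthy case, while a saturated CPU starving the GPU is the one to avoid. A rough sketch of that logic; the function and its thresholds are illustrative assumptions, not canonical values:

```python
def classify_bottleneck(gpu_usage: float, cpu_usage: float,
                        saturated: float = 95.0) -> str:
    """Rough classification from average usage percentages (0-100)."""
    if gpu_usage >= saturated:
        return "GPU-bound"          # ideal: the card is fully fed
    if cpu_usage >= saturated or (gpu_usage < 60.0 and cpu_usage > gpu_usage):
        return "likely CPU-bound"   # the GPU is being starved of work
    return "no clear bottleneck"    # check Vsync, frame caps, background load

# The two benchmark runs above:
print(classify_bottleneck(99.0, 13.0))  # GPU-bound
print(classify_bottleneck(50.0, 99.0))  # likely CPU-bound
```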

Please feel free to ask any questions or share any thoughts/suggestions/improvements you may have. :)
 
I've created this thread as I've seen a lot of questions asked regarding how to work out if there's a bottleneck going on. I'll update this OP with any info or questions/answers that may get put forward during the conversation.

This is my guide on how to test for a bottleneck

Nice one :cool:
 
This post shows a quick and easy way to tell if your CPU is capable of driving your GPU. It's really for those who want to ensure that they can get the maximum performance from their card. It doesn't take important factors like resolution or target FPS into account, and I'll cover all that in another post/update.

First thing you want to do is head over to the Heaven benchmark thread and see what similar GPUs score: https://forums.overclockers.co.uk/threads/unigine-heaven-4-benchmark.18487976/page-206#post-28330031

I'm running a 1080ti, so I can expect a score of approximately 4000 @ 1080p. The CPUs, clock speeds, RAM etc. that others have used are pretty much irrelevant: Heaven is GPU bound, and you should be scoring somewhere in the middle of all the other runs.
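That "somewhere in the middle" check can be made mechanical: compare your score against the median of runs from the same GPU model. A sketch with made-up scores and an illustrative tolerance (none of these numbers come from the benchmark thread):

```python
import statistics

def score_as_expected(my_score: float, reference_scores: list[float],
                      tolerance: float = 0.10) -> bool:
    """Heaven is GPU-bound, so a healthy system should land near the
    middle of runs from the same GPU model."""
    return my_score >= statistics.median(reference_scores) * (1.0 - tolerance)

# Hypothetical scores from other 1080ti runs
other_runs = [3900.0, 3950.0, 4000.0, 4050.0, 4100.0]
print(score_as_expected(4020.0, other_runs))  # True: nothing to worry about
print(score_as_expected(2500.0, other_runs))  # False: go looking for a bottleneck
```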

I've disabled Vsync & Gsync to ensure I get the highest demand possible on the GPU, and I've got Heaven set up like this:

VFq7ick.png

So after doing a complete benchmark run, here's my score:

P1MOjbD.png

Clearly, something is wrong. In order to diagnose the issue, we need to look at some stats from Afterburner. The most important stat to view is the GPU usage, because if the GPU isn't being used to its full potential then the score will be poor.

cJ6tNrx.png

In the image above you can see that the FPS is struggling to hit 100. The frametime is all over the place and the core clock is not maintained. Most importantly, though, look at the usage: it never hits its peak and really struggles to get above 50%. Clearly, something is holding the GPU back. A lot of people focus on achieving the highest core clock they can get, but if the GPU isn't being used to its full potential at that clock speed then it's all for nothing.

Here's the CPU usage for the same run:

kErnaOb.png

You can see that one core is getting hammered while all the other threads are under low to moderate load. It's interesting to note that the hammered core is not at a constant 100%, so it would be easy to wonder whether it was really the CPU causing the bottleneck. The telltale sign is the consistently high usage.
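That "one hot core, the rest coasting" pattern can be expressed as a quick check over per-thread usage figures. A sketch with illustrative thresholds and hypothetical numbers:

```python
def single_core_limited(core_usages: list[float],
                        hot: float = 85.0, cool: float = 60.0) -> bool:
    """True when one core carries consistently high load while the rest
    sit low to moderate: the telltale single-threaded bottleneck pattern."""
    ordered = sorted(core_usages, reverse=True)
    hottest, rest = ordered[0], ordered[1:]
    return hottest >= hot and all(u <= cool for u in rest)

# One hammered thread, the rest coasting (hypothetical per-thread numbers):
print(single_core_limited([92.0, 45.0, 38.0, 30.0, 25.0, 18.0, 12.0, 9.0]))  # True
# All threads busy, i.e. a genuinely parallel workload:
print(single_core_limited([90.0, 88.0, 87.0, 85.0]))  # False
```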

The next image will explain why our 6700k is bottlenecking a 1080ti:

JiQjEOa.png

You can see above that the CPU is running at 1GHz. This is obviously an exaggerated example of a CPU gimping a GPU, and it shows what will happen when the CPU just doesn't have the power to drive the card.

Now, I overclock the CPU to 4.4GHz on all cores and repeat the benchmark...

d4RsFn5.png

Now you can see that the score is meeting expectations. Here's why...

ILXNZv7.png

Framerate is massively increased and frametime is much better, but look at the usage; the GPU is now the bottleneck and being pounded at 99% almost constantly. The core clock is also maintained (the fluctuation you see is down to Pascal reacting to temperature and power limits). This GPU is now being worked to its full potential.

And the CPU usage:

j3WWFXL.png

Whilst the CPU is still very much a single thread workload, there's much less load on it. And you can see from the next slide what's made all the difference:

s3htLBA.png

The TLDR is that at low to medium resolutions (1080p and lower), where the GPU will be called into action and driven at 100% usage, you need a CPU capable of driving it. The higher the frame rate, the more the CPU will be called into action.
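Why does a higher frame rate demand more from the CPU? If each frame costs the CPU a roughly fixed slice of time, its load scales linearly with FPS. A simplified sketch (the 5 ms per-frame cost is a hypothetical figure; real games vary frame to frame):

```python
def core_load_at(per_frame_cpu_ms: float, fps: float) -> float:
    """Fraction of one core consumed if every frame costs a fixed amount
    of CPU time. A simplification: real per-frame costs vary."""
    return min(1.0, per_frame_cpu_ms * fps / 1000.0)

# The same hypothetical 5 ms of per-frame game logic, at rising frame rates:
print(core_load_at(5.0, 60.0))   # 0.3  -- comfortable at 60 FPS
print(core_load_at(5.0, 144.0))  # 0.72 -- working hard at 144 FPS
print(core_load_at(5.0, 200.0))  # 1.0  -- saturated: the CPU is now the cap
```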

This isn't the full story, and I'll demonstrate later why resolution is a massive factor and also why the choice of monitor is woefully underestimated when people are designing their systems.
 
I've done all my testing for this and I just need to find the time to put the findings into words. I've been prompted by a conversation earlier today about whether an i5 would bottleneck a 1080ti @ 4k. Going by my testing, I would say it wouldn't at 4k resolution. However, I don't have an i5 system to test for sure, so I'd be interested if anyone with an i5/1080ti/4k system could run some similar Firestrike tests?
 
1080p gaming @ 144Hz
I've seen quite a lot of threads asking what the minimum GPU is for 1080p gaming, and with a 60Hz monitor that's probably a fair question as you wouldn't want to pay top dollar for a GPU to be sitting there barely breaking sweat. However, when you're gaming on a 144Hz monitor or if you want to aim for the highest frame rate possible... is there such a thing as overkill?


Test results:

Hardware Settings:

1080p @ 144Hz; 1080ti (2050/11610)+6700k @ 4.4GHz

Benchmark Settings:

iDMvvna.png

I've chosen the Firestrike benchmark as it represents a modern AAA title on medium to high settings. I don't want to run these tests on ultra because I'm not convinced that everyone feels the need to have every slider at maximum, so medium/high is more representative. This is a custom run looping a single test, and as you can see it's at 1920*1080 (1080p).


Framerate (FPS), GPU Clock Speed & Usage:

XViZJhM.png

I chose not to lock at 144Hz as I wanted to see just what the peaks were. You can see that whilst 144Hz is held most of the time, this system can actually struggle to maintain the high frames even with 99% GPU usage throughout the benchmark, with dips down to 120 FPS in places. Just to reiterate, this is a 1080ti paired with a 6700k @ 4.4GHz.

I don't believe we are at the point where any GPU is overkill at 1080p if the target monitor is 144Hz. Actually, if the eye candy were raised then I'm not convinced the 1080ti could even hit 144 FPS consistently.


CPU Usage:

K1jUCN7.png

Like most games, this is a single core affair, and not surprisingly the 6700k is coping fine (which is reflected in the consistently high GPU usage).


CPU Core Clock Speed:

2Mwksq9.png

As mentioned, CPU clock speed was 4.4GHz.
Summary:
With a relatively high end CPU overclocked to a decent clock speed, even the fastest consumer GPU available (and overclocked) could be the bottleneck @1080p/144Hz. No single GPU could be considered overkill at that resolution and refresh rate.
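One way to quantify "144Hz is held most of the time" is to measure the fraction of logged FPS samples at or above the refresh rate. A sketch over a hypothetical log (the sample values below are illustrative, not from the run above):

```python
def refresh_hold_rate(fps_samples: list[float], refresh_hz: float) -> float:
    """Fraction of logged samples at or above the monitor's refresh rate."""
    return sum(1 for f in fps_samples if f >= refresh_hz) / len(fps_samples)

# Hypothetical FPS log with dips like those seen in the benchmark
samples = [150.0, 148.0, 144.0, 139.0, 120.0, 152.0, 145.0, 131.0]
print(refresh_hold_rate(samples, 144.0))  # 0.625: held 144 Hz for 5 of 8 samples
```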
 
Here's a little secret: there's always a bottleneck. Always. If it's not CPU or GPU performance, it's your monitor's refresh rate. Or any of many other things.

You can't get "does my CPU bottleneck my GPU" results from benchmarks like Heaven or the Firestrike graphics score. They are GPU benchmarks: the CPU doesn't get to do much besides sending instructions to the GPU, and as long as it can keep up with the frames the GPU is spitting out, it won't affect the score much. The settings being used, GPU clocks and GPU memory clocks are what define the frame rate there, more than which CPU is being used.

At the very least you'd need actual real world results where the CPU has to effectively contribute to what's going on on the screen. You need these to get a sense of whether the CPU and GPU performance are on a level playing field for your particular usage (resolution, settings, games being played, game modes, etc). Even after you've compiled tons of that data, it might not actually matter for the particular game you want to play: game A may be far more CPU demanding than game B, with a different "bottleneck" as a result, on the very same system with the very same CPU and GPU clocks. Even network bandwidth or server performance can limit your frame rates well below what your CPU or GPU would otherwise allow.

Given that there's always going to be a bottleneck somewhere, the important thing really is that you match the CPU, GPU, and monitor (everything really) accordingly. Being able to hit, say, 140 fps reliably on a monitor is really difficult, but also entirely irrelevant for 99% of people. Being able to stay reliably within the G-Sync or FreeSync range of their monitor is a much more attainable and far more useful goal, as that is what brings smoothness, and it doesn't require going overboard trying to identify artificial bottlenecks.

Sure, it does really help to be able to "read" performance graphs to get a better sense of what the system is doing and identify possible bottlenecks at a given moment in time, but beyond that almost every use case is different, and it rarely stays the same over a longer period of time.
 
So basically then all this talk of CPU bottlenecks only happens when you create a completely unrealistic situation (halving clocks speeds and cores, 720p) ?

Are people really buying/suggesting CPU upgrades for gaming based on this laughable scenario?
 
So basically then all this talk of CPU bottlenecks only happens when you create a completely unrealistic situation (halving clocks speeds and cores, 720p) ?

Are people really buying/suggesting CPU upgrades for gaming based on this laughable scenario?

I'm not entirely sure who or what your question is aimed at, but I can say that the reason I gimped the 6700k to 1.0GHz was to exaggerate the test to demonstrate the point I was making at the time.

e.g. I don't have an i3 to demonstrate how it will struggle in a particular scenario, so I gimped the i7 to demonstrate what a person with an i3 would be looking for in terms of results on the graph/data.

Hope that makes sense.
 
That's really interesting & kind of confirmed what I had always thought on the subject - especially when driving a 144hz+ monitor. Nice one!
 
I'm not entirely sure who or what your question is aimed at, but I can say that the reason I gimped the 6700k to 1.0GHz was to exaggerate the test to demonstrate the point I was making at the time.

e.g. I don't have an i3 to demonstrate how it will struggle in a particular scenario, so I gimped the i7 to demonstrate what a person with an i3 would be looking for in terms of results on the graph/data.

Hope that makes sense.

Sorry, I wasn't having a dig at you or your tests. I was having a pop at the people saying certain CPUs like last-gen i5s and i7s are useless because of potential CPU bottlenecking, when you've shown it takes serious gimping to make it even the slightest issue.
 
4k gaming
Gaming at 4k is extremely demanding on the latest high end hardware, and it's common knowledge that only the high end GPUs stand any chance of pushing that many pixels at glossy settings. However, how demanding is it on the CPU?


Test results @ Full Speed CPU:

Hardware Settings:

4k; 1080ti (2050/11610)+6700k @ 4.4GHz

Benchmark Settings:

9yMGpOJ.png

Test set-up is similar to previous runs, although this time we're on 3840*2160 which is 4k @ 16:9 ratio (widescreen).

Framerate (FPS), GPU Clock Speed & Usage:

yjeZey6.png

As expected, the GPU is the bottleneck in this test. We're aiming for a solid 60 FPS, and although the 1080ti is hitting the 60 mark, it's also dipping below 40 regularly. The GPU is working its socks off at an almost constant 100% usage. Look at the core clock: it's dipping from peak to trough, which is an indication that the Pascal card is adjusting itself due to power limits. This particular GPU is under water, so temperature will not throttle the core clock.


CPU Usage:

RuaSB4B.png

If CPUs went on holiday, they'd probably book a week in 4k land. The 6700k @ 4.4GHz has probably worked harder opening Chrome tabs.


CPU Core Clock Speed:

IKmIW1v.png

Looking at the clock speed, it's actually dipping due to Windows' balanced power state behaviour. This CPU is in its loungewear during this benchmark!
Summary:
4k is all on the GPU; that's no revelation. What I find interesting is just how little workload is on the CPU at this resolution: the GPU is struggling to cope with the frames it's being fed due to the aesthetic demand, which is driving down the frame count.

In order to demonstrate just how little the CPU is required here, it's time for a further test...

Test results @ Gimped CPU:

Hardware Settings:

4k; 1080ti (2050/11610)+6700k @ 1.0GHz


Framerate (FPS), GPU Clock Speed & Usage:

NNrnZBN.png

Frame rates are practically identical to the 4.4GHz run. The GPU is still being pounded with all the work.


CPU Usage:

wTDwvNI.png

Here's the first indication that things are different: the CPU is actually doing some work here, but it certainly isn't struggling to cope, and as you can see above, the GPU and frame rate are not being restricted at all.


CPU Core Clock Speed:

UknJSvk.png

As you can see, hyperthreading disabled and core speed gimped to 1.0GHz.
Summary:
People throw high end CPUs into 4k systems, but the workload is clearly on the GPU. You can see that 4k@60FPS is still beyond the current generation of GPUs, and the CPU has very little to do when it comes to glossy eye candy... strolling along at 1.0GHz without missing a beat.
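The gimped-CPU run gives a clean test: if downclocking the CPU barely moves the frame rate, the GPU is the limiter at those settings. A sketch of that comparison (the function, tolerance, and FPS figures are illustrative assumptions, not measurements from the runs above):

```python
def cpu_irrelevant_at_settings(fps_fast_cpu: float, fps_slow_cpu: float,
                               tolerance: float = 0.03) -> bool:
    """True if slowing the CPU changes the average FPS by less than the
    tolerance, i.e. the GPU is the limiter at these settings."""
    return abs(fps_fast_cpu - fps_slow_cpu) / fps_fast_cpu <= tolerance

print(cpu_irrelevant_at_settings(52.0, 51.5))   # True: GPU-bound, as at 4k
print(cpu_irrelevant_at_settings(120.0, 70.0))  # False: the CPU matters here
```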
 