CPU vs GPU bottleneck question.

rn2

Am I right in saying that the only way a CPU can bottleneck a much more powerful GPU is if you allow the GPU to push as many frames as it likes?

For example, if you cap the frames on a 240Hz monitor to 165 or 144.

Or are there more factors at play?
 
If you cap the frames on the monitor and your GPU is able to push more frames, you will just not utilise the entirety of your GPU's power. A CPU bottleneck is usually felt when you are running a lower resolution than your GPU is capable of. The GPU will generate more frames, but the CPU will be unable to keep up. An example would be running a game on an RTX 3080 (or any other higher-end GPU) at 1080p with a lower-end CPU (for example my ancient i5 2500k). The GPU will push many, many frames, but will be limited by the CPU's capacity to handle them. As a general rule, I don't tend to worry about it as I can't afford high-end GPUs :D
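The idea in the post above can be sketched as a back-of-envelope model: each frame needs CPU work (game simulation) and GPU work (rendering), and the slower of the two sets the frame rate. All the numbers below are invented purely for illustration, not benchmarks of any real hardware.

```python
# Rough model: the achievable frame rate is limited by whichever
# stage (CPU or GPU) is slower. Numbers are made up for illustration.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The slower of the two stages limits the delivered frame rate."""
    return min(cpu_fps, gpu_fps)

# A high-end GPU paired with an older CPU:
cpu_fps = 120.0           # frames/sec the CPU can prepare, roughly resolution-independent

gpu_fps_by_resolution = { # frames/sec the GPU can render (drops as resolution rises)
    "1080p": 300.0,
    "1440p": 180.0,
    "4K": 80.0,
}

for res, gpu_fps in gpu_fps_by_resolution.items():
    fps = effective_fps(cpu_fps, gpu_fps)
    limiter = "CPU" if cpu_fps < gpu_fps else "GPU"
    print(f"{res}: {fps:.0f} fps ({limiter}-limited)")
```

Note how in this toy model the CPU side barely changes with resolution, so dropping to 1080p just moves the bottleneck from the GPU to the CPU.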
 
It's very noticeable that the real improvement with a 3080 or 3090 is at high resolutions like 4K and 8K. Unless you have a decent monitor you are unlikely to see the benefits that everyone is raving about.
 

So you are saying the same as me? You can ease the CPU load by capping the frames to stop the GPU pushing so many? The worst case is that the GPU is not being used to its potential? Yet the CPU is fine because it's not being asked to keep up with something like 240 frames.
 
I believe that is the idea behind it. But capping frames is really about getting the most out of Variable Refresh Rate monitors.
 
It doesn't work like that. Capping frames does nothing other than match your screen's refresh rate to stop screen tearing, and reduce heat.

It is literally pointless having your game run at several hundred frames per second; you're just creating more heat, using more power and wearing out your hardware for nothing.

There is an argument that running frames above, say, 60fps (if your monitor is capped at 60Hz) can reduce input lag and latency generally, but you're talking niche competitive gaming there, and honestly you'd be better off just getting a higher refresh rate monitor.

Where a slower CPU will affect your gaming is with minimum frame rate, not maximum.

Thinking about it, it's actually really hard to explain.......

Ok, let me give you my example on PUBG.

I'm running an Nvidia 1070 at 1440p. I've recently switched from a 10-year-old i7-2600k to a new 5800X.

So... overall, my frame rate hasn't significantly changed, because the graphics settings are the same and the graphics card is the same.

However, on my old CPU, in game where there was a lot of action, e.g. lots of players around, my frame rate would drop. That wasn't caused by my graphics card; it was my CPU struggling a bit. Now, with the new CPU, although my overall frame rate isn't that much different, I don't get that drop in frame rate anymore when there's lots going on.

You'll see that effect even more at lower resolutions, where the graphics card is doing less work to maintain the frame rate.

So in a way, you've almost got to think of the CPU and GPU as doing two different jobs in your games.

I'm not sure if that helped or confused you further, it's hard to explain but once you get your head around it you'll understand.
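The busy-scene effect described above can be sketched too: CPU frame time grows with the number of entities being simulated, while GPU frame time stays roughly constant at fixed settings. The per-entity costs and timings below are entirely made up for illustration, not measurements of any real CPU.

```python
# Toy model of frame drops in busy scenes.
# CPU frame time grows with the number of entities to simulate;
# GPU frame time is roughly constant at fixed settings/resolution.
# All costs are invented purely for illustration.

def fps(cpu_us_per_entity: float, entities: int, base_cpu_ms: float, gpu_ms: float) -> float:
    cpu_ms = base_cpu_ms + cpu_us_per_entity * entities / 1000  # CPU time per frame
    frame_ms = max(cpu_ms, gpu_ms)                              # slower stage sets the frame time
    return 1000 / frame_ms

gpu_ms = 12.0  # GPU takes ~12 ms/frame at these settings (~83 fps ceiling)

for label, per_entity in [("old CPU", 200.0), ("new CPU", 50.0)]:
    quiet = fps(per_entity, entities=10, base_cpu_ms=5.0, gpu_ms=gpu_ms)
    busy = fps(per_entity, entities=80, base_cpu_ms=5.0, gpu_ms=gpu_ms)
    print(f"{label}: quiet scene {quiet:.0f} fps, busy scene {busy:.0f} fps")
```

In this sketch both CPUs hit the same GPU-set ceiling in quiet scenes, but only the slower one dips when the scene gets busy, which matches the anecdote above: similar average frame rate, much better minimums.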
 
Here's a go at a reasonable explanation:

There will always be a bottleneck.

The CPU creates the game world calculating positions of objects, collisions, physics etc.

The GPU draws the image to the screen calculating texture, colour, lighting etc.

One will always be slower than the other.

If your GPU is too weak you'll get low frame rates with graphically demanding games - fancy lighting effects, volumetric clouds, highly detailed textures etc.
If your CPU is too weak you'll get low frame rates and stutters when the game world is busy - lots of characters, objects etc.

Technically you can remove a GPU bottleneck by limiting framerates, but only if they have the potential to be really high to start with. You're still relying on your CPU to provide enough calculations for the GPU to produce the images fast enough. If it can't then the framerate may not reach the cap you've set and you'll still get poor performance.
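That last point can be sketched in a couple of lines: the delivered frame rate is the minimum of the cap and what each stage can supply, so a cap only helps if the CPU can actually reach it. The numbers are invented for illustration.

```python
# Sketch of what a frame cap does (illustrative numbers only):
# the delivered frame rate is the minimum of the cap and what
# each stage (CPU, GPU) can supply.

def delivered_fps(cap: float, cpu_fps: float, gpu_fps: float) -> float:
    return min(cap, cpu_fps, gpu_fps)

# GPU could do 300 fps but the CPU only 100: capping at 144 doesn't get you 144.
print(delivered_fps(cap=144, cpu_fps=100, gpu_fps=300))  # 100 - still CPU-limited
# With a faster CPU the cap is actually reached, and the GPU idles the rest of the time.
print(delivered_fps(cap=144, cpu_fps=220, gpu_fps=300))  # 144 - cap-limited
```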
 