
Please help regarding CPUs and higher resolutions

Hello,

So I'm a little confused regarding CPUs and higher resolutions. I've always been under the impression that a high-end CPU isn't required when running at high res. I just got my 3080 Eagle today and I've been running some games on it.

My monitor is 32:9 running at 5120x1440. Correct me if I'm wrong, but I'd assume that's a similar render resolution to 4K.
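
As a quick sanity check on the pixel maths (a rough sketch in Python, just comparing raw pixel counts and ignoring DLSS/scaling):

Code:
# Raw pixel counts: 32:9 super-ultrawide vs standard 4K UHD
ultrawide = 5120 * 1440   # 7,372,800 pixels
uhd_4k = 3840 * 2160      # 8,294,400 pixels

print(f"5120x1440: {ultrawide:,} px")
print(f"3840x2160: {uhd_4k:,} px")
print(f"Ultrawide is {ultrawide / uhd_4k:.0%} of 4K's pixel count")  # roughly 89%

So it's close to, but slightly under, 4K's total pixel load.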

Anyway, I've started testing Call of Duty: Black Ops Cold War and just whacked everything to max with DLSS on Performance. The game looks great and can get as high as 80 FPS in some areas. But I noticed I'm only getting about 50% GPU usage, even at this resolution. My monitor is 100 Hz so it can go up to 100 FPS.

Then I got to the Vietnam level and my frame rate totally tanked to 20-30 FPS-ish. GPU usage is low and CPU usage is hovering around 80-90%.

To me it sounds like a blatant CPU bottleneck. I have an i7 7700K running at 4.4 GHz and 16GB of 3200 MHz RAM. Overclocking the CPU any further screws my temps.

I've checked benchmarks on YouTube with similar setups but with a Ryzen 5600X, and they're getting 80 FPS in that Vietnam area. Bear in mind this is at 4K resolution, not 5120x1440.

The reason I'm posting this is that I'd been led to believe by others that my i7 7700K would easily handle this at my resolution, because the 3080 should be the bottleneck. Instead it seems to me it's the other way around.

I just wanted to make 100% sure that it is indeed CPU bound before I go ahead and buy something.

Thanks for all your help.
 
Sounds like your CPU is a bottleneck... and the 5600X is faster, with 2 extra cores and more threads than your CPU. If you have V-Sync on, turn it off and see if it helps, as it can hold you back a bit too.
 

Thank you for your reply.
I think I need to do some testing with other games, because in other games I'm getting 100% GPU usage and the frame rates are very high. Doom Eternal is a good example of a game that runs like butter; it's locked at 100 FPS.

I never have V-Sync on if I can avoid it, and just have an FPS limiter set to 100 FPS.

I did some more research last night and it would appear that it's Call of Duty itself that may be the issue. I hadn't downloaded the High Res Assets because I thought, what's the point if it's gonna run at 30 FPS? Turns out someone online downloaded them and their frame rate shot up, so I'm in the process of downloading them now. The benchmark video I checked also had the high-res assets installed.

I haven't got a problem upgrading my CPU at all, but I want to be 100% certain it's gonna be worth it for the resolution I'm running. If I spend £600 on a new motherboard and processor and it's a 5 FPS increase, I'm really not that bothered.
 
I've always been under the impression that a high-end CPU isn't required when running at high res.
This is true insofar as, at high resolutions, your GPU is usually what limits your frame rate: higher res is much more taxing on the GPU but has little effect on CPU demand. Also, higher resolution screens usually have lower refresh rates, so people aren't bothered that their CPU might cap them at 80 FPS on a 60 Hz display.

In your case the 3080 is an obscenely powerful GPU and, with Performance DLSS, is rendering at a lower resolution; combined with Cold War being an unoptimised, CPU-demanding game, that causes low FPS, high CPU use and low GPU use. Remember you're running a £700 GPU using 2020 tech with a £350 CPU based on 2015 tech. Doom Eternal is very well optimised, hence the great performance.
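
To put rough numbers on it (a toy sketch in Python; the millisecond figures are invented for illustration, not measured from Cold War): a frame isn't finished until both chips have done their part, so FPS is set by whichever per-frame time is longer, and DLSS Performance only shrinks the GPU's share.

Code:
# Toy bottleneck model: effective FPS is limited by the slower of the two per-frame times.
# All numbers below are assumptions purely to illustrate the idea.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 20.0                    # assumed CPU cost per frame (~50 FPS worth of game logic/draw calls)
gpu_native_ms = 25.0             # assumed GPU cost at native 5120x1440
gpu_dlss_ms = gpu_native_ms / 4  # DLSS Performance renders roughly a quarter of the pixels

print(f"Native:           {fps(cpu_ms, gpu_native_ms):.0f} FPS (GPU is the limit)")
print(f"DLSS Performance: {fps(cpu_ms, gpu_dlss_ms):.0f} FPS (CPU is the limit, GPU sits partly idle)")

In the CPU-limited case the GPU only needs a few milliseconds of a 20 ms frame, which is roughly the "low GPU usage, high CPU usage" pattern you're seeing.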
 
I’ve seen a few people say at high resolutions the CPU is irrelevant but they seem to be misunderstanding the context.
 
This whole 'CPU is not important at high resolutions' myth really annoys me.

100 FPS at high resolution needs the same (maybe even faster) CPU as 100 FPS at 720p.

The confusion is due to people buying an underpowered GPU which causes a bottleneck at high resolutions. I've even seen people on these forums recommending somebody increase their resolution to solve a low FPS issue.
 

Thank you for your replies and I appreciate the time you have taken. I'm usually pretty techy and can identify problems on my own, which is why I believed I had a CPU bottleneck in the first place.
But this was one scenario I didn't truly know the answer to, because of the information I'd heard that 4K makes CPUs irrelevant; when you hear it enough you start to doubt your own knowledge, which is what I've done here.

As for what you said @CuriousTomCat, I've also seen similar advice saying "increase resolution = fixed", and that really isn't helpful information. I really didn't think that my RTX 3080 would vastly "overpower" my 7700K, but it would appear I'm very much mistaken.

For an upgrade I was thinking about a Ryzen 5600X. I'm in no rush to get one, so I'll wait till the stock situation resolves itself and then get one. But does everyone believe this is a worthwhile upgrade? From the reviews I've seen, anything above a 5600X is overkill for gaming, but I'm not sure; I've seen that the rest of the 5000 line-up is more for workstations. Please correct me if I'm wrong.
 

A common mistake people make is to buy a powerful GPU and then use it with an old CPU.
The CPU matters, as even at high resolution games can be affected in minimum FPS and the like.
So yeah, a 5600X is what I would buy, or even a 5800X/5900X.

I get my 5600X tomorrow, and while I play at 1440p it still makes a difference vs my current 3600.
 

Thanks for the reply.
So you suggested a 5800X or 5900X as well there; do you think the difference would be a lot more noticeable over a 5600X?
 

There's no difference for gamers, so if you're planning to keep the 3080 for the full life of your computer, then the 5600X is a great match.

However, most people upgrade their GPU after a few years. Some people even upgrade the GPU a second time after a couple more years.

For that scenario to work, the CPU needs to be overkill for the first GPU, then a good match for the next GPU upgrade.

Some people end up building a balanced system at the start to avoid a bottleneck, but then they can't upgrade the GPU in the future because the CPU has no headroom.

You could get a 5600X now and then upgrade to a used 5800X/5900X when you get an RTX 4080 in a few years. But used AMD CPUs hold on to their prices these days, so you might not save much.
 

The 5800X/5900X give you a bit more grunt for streaming, background tasks etc. I've gone for the 5900X, primarily as I want to stream/record. A 5800X would've done the job just fine, but I believe the 5800X runs hotter than the 5900X, which will cap the core clock.

The 5600X/5900X run cooler than the 5800X/5950X, and temperature on Ryzen is key for boost clocks from what I can gather.
 

+1 it's a really common misunderstanding.

The best analogy IMO:

Your PC is like a doughnut conveyor belt when it's running a game. The CPU drives the belt and the GPU puts all the frosting and sprinkles on. If the doughnut is plain (1080p) then the GPU has little to do (50% usage) and the CPU will run the belt at full speed (80 FPS / 100%). If the doughnut has lots of frosting and sprinkles (4K) then the GPU has lots of work to do (100%) and the CPU has to slow the belt down to make sure all the doughnuts get coated (30 FPS / 40%).

The most important factor in all of this is how many doughnuts you need (FPS). For that reason the most important choice you make isn't the GPU or the CPU, it's the monitor, because that dictates how many sprinkles you need (resolution) and, crucially, how many doughnuts you need (FPS). Sometimes you don't want the machine to work too hard because it'll make too much noise and heat, so in that case you can limit the number of doughnuts even though your machine can do more; this is called "V-Sync".
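
If it helps, the doughnut belt translates into a tiny model (again just a sketch; the per-frame costs below are invented): the belt speed you actually get is set by the slower worker, and only the GPU's job grows with resolution.

Code:
# Doughnut-belt model: the CPU preps each frame (same cost at any resolution),
# the GPU decorates it (cost grows roughly with pixel count). Numbers are illustrative only.

CPU_MS = 10.0            # per-frame CPU cost -> CPU alone could run ~100 FPS
GPU_MS_PER_MPIX = 4.0    # assumed GPU cost per million pixels

resolutions = {
    "1080p (1920x1080)": 1920 * 1080,
    "1440p (2560x1440)": 2560 * 1440,
    "4K    (3840x2160)": 3840 * 2160,
}

for name, pixels in resolutions.items():
    gpu_ms = GPU_MS_PER_MPIX * pixels / 1e6
    frame_rate = 1000.0 / max(CPU_MS, gpu_ms)
    limiter = "CPU" if CPU_MS >= gpu_ms else "GPU"
    print(f"{name}: GPU {gpu_ms:4.1f} ms vs CPU {CPU_MS:.1f} ms -> {frame_rate:3.0f} FPS ({limiter} limited)")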

Hope that makes sense. :)
 
I'd use an example I have from Microsoft Flight Simulator: when I switch on the FPS monitoring it shows I'm not being held back by my GPU (MSI Trio X 3080) but by a single-thread CPU limit. I'm running an i7 6700K until my 5900X arrives. A lot will depend on each game and whether it can take advantage of multiple cores or is reliant on single-core performance.
 

Fantastic post, that doughnut analogy sums it up nicely.
 
It's game dependent, but generally games are scaling past 6 cores/12 threads.

Thank you all for your replies!
And to @Dirk Diggler for the fantastic analogy! I've learnt one or two new things from what's been posted. For example, I realised that although my CPU was made in 2017, it's still based on a 2015 architecture, which I wasn't really thinking about. Also the fact that games are now taking advantage of more than 4 cores, which I didn't really take into account. I've always been in the "more than 4 cores is useless because games don't use them" camp, but I've stuck with that mindset for so long that it's now outdated. With the new consoles having 8 cores, if I'm not mistaken, I'm assuming more and more games will take full advantage of them.

And even at 4K, 4 cores isn't gonna keep the doughnut factory running at maximum efficiency.
 