Resolution and minimum framerates

So I accidentally broke my monitor and went and bought a huge Acer Z35P monster. I've gone from 2560x1440 to 3440x1440.

I've noticed that my minimum framerates in games have now dropped below 60 fps, which I don't like. I have G-Sync enabled but it still annoys me. There are even times when I can see and feel it in game.

My question is: will a CPU upgrade to a 5800X help with this, or is it purely GPU bound?
 
Check your usage and see for yourself whether you're CPU or GPU bound. Have you used MSI Afterburner before? It can show you in great detail what your CPU, RAM, and GPU are doing during gameplay.

You'll learn far more than if somebody just told you the answer.
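(Nothing to do with Afterburner itself, just a rough DIY sketch: if you'd rather script the logging, something like this Python snippet, assuming an NVIDIA card and the psutil and pynvml packages, records overall CPU and GPU utilisation once per second to a CSV you can chart afterwards.)

Code:
# Rough sketch: log CPU and GPU utilisation once per second while playing.
# Assumes an NVIDIA GPU plus `pip install psutil pynvml`. Stop with Ctrl+C.
import time
import psutil
from pynvml import nvmlInit, nvmlDeviceGetHandleByIndex, nvmlDeviceGetUtilizationRates

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

with open("usage_log.csv", "w") as log:
    log.write("time_s,cpu_pct,gpu_pct\n")
    start = time.time()
    while True:
        cpu = psutil.cpu_percent(interval=1)       # averaged over the 1 s wait
        util = nvmlDeviceGetUtilizationRates(gpu)  # .gpu is 0-100 utilisation
        log.write(f"{time.time() - start:.0f},{cpu:.0f},{util.gpu}\n")
        log.flush()  # keep the file usable even if you kill the script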
 
Both CPU and GPU are pretty much at 100%. The GPU drops every now and then though.
 
Back the settings off a bit. Dropping a lot of settings from ultra to high or medium usually makes little visual difference for a fairly big fps gain. A 1080 Ti should do 3440x1440 above 60 fps, from experience. I think mine had little issue running 3440x1440 at 100 Hz in the likes of BFV.

Obviously more recent games like RDR2 and Cyberpunk are a little more demanding.
 
Yes, I'll play around. GeForce Experience said to have ambient occlusion at Ultra in COD Cold War!? I set it to High and instantly got an average of 80 fps. What other settings affect fps the most?
 
Shadows are normally a culprit. Any form of anti-aliasing (e.g. MSAA) as well. Then it's texture quality.
 
OK, here are my CPU usage and GPU usage graphs after one round of COD Cold War:

[screenshot: CPU usage graph]

[screenshot: GPU usage graph]
They seem all over the place. Is that normal?
 
If your framerate has dropped since going to a higher resolution monitor then it's almost certainly down to the GPU.
 
Yes, that's normal. Load on the GPU/CPU depends on what is happening in game at the time. The usage on your graphs actually looks very similar, which suggests to me that in Cold War the CPU and GPU are well balanced against each other, with no significant bottleneck. So upgrading one of them in isolation probably wouldn't result in big performance gains in that game. You'd need to do both.

But you also have to remember that the ratio of how much demand a game places on the CPU vs the GPU depends entirely on the game. Other games you play may have a different balance and therefore might benefit from either a CPU or a GPU upgrade, as opposed to having to do both. It's a good idea to do the same kind of graphing comparison across more of the games you play; you may find there are benefits to be had from either a CPU or GPU upgrade in other games.

With GPU vs CPU % usage measurement, a good rule of thumb is that the GPU usage tends to be the more accurate and reliable indicator. If the GPU usage drops below about 95% on average, you can bet it's a CPU limitation, whereas the reverse isn't necessarily true.
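(Purely illustrative: the 95% figure is just the rule of thumb above, not a hard spec. Applied to logged GPU utilisation samples, e.g. from the logging sketch earlier in the thread, it could look like this.)

Code:
# Toy check of the "below ~95% average GPU usage = CPU limited" rule of thumb.
# gpu_samples are 0-100 utilisation readings collected during gameplay.
def likely_bottleneck(gpu_samples: list[float]) -> str:
    avg = sum(gpu_samples) / len(gpu_samples)
    return "CPU limited" if avg < 95 else "GPU limited (or well balanced)"

print(likely_bottleneck([99, 98, 97, 99]))  # GPU limited (or well balanced)
print(likely_bottleneck([70, 85, 60, 75]))  # CPU limited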
 
At 3440x1440 my 1080 Ti used to struggle. The 2080 Ti was fine except for a couple of more demanding titles, but to be honest the 4790K is getting on a bit, so if you did upgrade you'd probably have to do both the CPU and GPU to prevent bottlenecks.
 
As said, it really depends on the game, but you are now primarily GPU bound compared to what you were, due to switching to a higher resolution.

I ran a 1080 Ti on a 3440x1440 panel for nearly 4 years. With a few minor in-game settings tweaks, 90% of the games I played would run above 60 fps nearly all the time.

The 1080 Ti is starting to show its age a bit though with more recent titles, which is why I went for a 3090 (and I got into VR, which runs at insanely high resolutions).
 
You can still play at 2560x1440 with no image quality loss, as the panel won't need to scale it. I play any FPS-intensive games at 2560x1440 and then anything single player at 5120x1440 when I don't mind lower fps.
 
What, because of the black bars on the sides? I'll take them over fps drops when it matters.
 
Probably all settings on high. There are a fair few guides on what to set; some settings you can drop to medium without much visual loss.
 
Turn all the game settings down to low and see if fps increases; if it does, then you are GPU and not CPU limited.
 