
CPU doesn't matter for 4k gaming, is that completely true?

Soldato | Joined: 22 Oct 2004 | Posts: 13,557
As the title says, I see this thrown around a lot. Is it completely true, or only to some degree?
Is it always wise to spend the majority on the graphics card over the CPU?
Can it ever be the case that you need a high-end CPU and a high-end GPU?
I'm not at 4K, I'm on 1440p. But if I were at 4K, did I make a mistake getting a 7800X3D and a 7900 XTX when I could have got a 7700 and a 4090? Please don't turn this into an Nvidia vs AMD thing; I'm just noting that I could have got a better GPU if I were a 4K user.
 
As the title says, I see this thrown around a lot. Is it completely true, or only to some degree?

It was true (to an extent, the CPU had to be decent, e.g. an i5 K), but with high refresh monitors and faster cards (like the 4090), it isn't an accurate generalisation anymore.

If you're happy with 4K/60, then it's still mostly true that a recent-ish i5 K is sufficient.

Is it always wise to spend the majority on the graphics card over the CPU?

Yes, except if you're already likely to be CPU bottlenecked, such as if you're a competitive esports/FPS gamer, or you play games that really love the cache (e.g. racing or flight sims).
 
It entirely depends on what you play, and always has. It'd probably be fair to say that "AAA" games were generally always GPU-bound at 4K until the past couple of years, but even that doesn't always hold up any more. And of course it's always been the case that something like Counter-Strike or World of Warcraft is going to be CPU-bound at 4K on a decent, balanced setup.

That said, you're better off in most cases directing your funds to the GPU if you already have a passable CPU, assuming you want to play a broad spectrum of games and don't have a specific, CPU-heavy use case (within the realm of gaming) in mind. People worry way too much about CPU bottlenecks, as if their shiny new GPU is going to explode in a cloud of deadly gas if it drops below 100% usage once in a while. You're still going to get a great gaming experience with any chip on par with or faster than, say, a 5600X.
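The "depends what you play" point above can be put into a toy model. This is a sketch with entirely made-up frame-rate caps, not benchmark data: each frame the CPU prepares work while the GPU renders, so the effective frame rate is capped by whichever side is slower.

```python
# Toy bottleneck model: the frame rate is limited by the slower of the
# CPU-side cap and the GPU-side cap. All numbers are invented examples.

def fps(cpu_fps_cap, gpu_fps_cap):
    """Effective frame rate is the minimum of the two caps."""
    return min(cpu_fps_cap, gpu_fps_cap)

# A heavy AAA title at 4K: the GPU is the limit, the CPU barely matters.
print(fps(cpu_fps_cap=160, gpu_fps_cap=90))   # -> 90 (GPU-bound)

# The same CPU in a CPU-hungry title at 4K: now the CPU is the limit.
print(fps(cpu_fps_cap=160, gpu_fps_cap=400))  # -> 160 (CPU-bound)
```

The model is crude, since real games shift between the two caps scene by scene, but it shows why "CPU doesn't matter at 4K" is only true while the GPU cap stays below the CPU cap.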
 
Definitely not true anymore. With powerful cards that can pump out high frames at 4k, the cpu can absolutely be a massive bottleneck, especially in some games (check out Starfield benches for one).

But to be honest, it has always been true to a certain extent/in certain scenarios or parts of a game.

I had a 2500K, and in Crysis 3 in the long grass my FPS tanked. I swapped to an i7 3770K, and the added hyperthreading literally almost doubled my FPS at that point of the game.
 
It's never been a flat yes or no, as the answer changes from game to game. Even now there are seven-year-old games that are limited by the CPU at 4K rather than the GPU, while other games will be limited by the GPU.
 
It's never been a flat yes or no, as the answer changes from game to game. Even now there are seven-year-old games that are limited by the CPU at 4K rather than the GPU, while other games will be limited by the GPU.
Agree it's nuanced, but the seven-year-old game being CPU-limited at 4K is probably down to the graphics settings available. More modern = bigger textures = more lighting and other post-processing. 4K without modern AA/AF etc. and with less impressive textures probably isn't that demanding.

A good test might be basic versus totally overhauled Skyrim, as there are majorly improved texture packs for it. Slap one of the GPU tweak filters on too and it might start to tax a modern card enough that the CPU gets breathing room.

It's essentially "how complicated are the DirectX GPU calls?" versus raw GPU power. If enough is being asked, the overall resolution is still a final multiplying factor on the workload, but the GPU might already be doing quite a lot.
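The "resolution is a final multiplier" point can be roughed out numerically. This is a sketch with an invented per-pixel cost, purely for illustration: if GPU frame time scales with pixel count while the CPU-side cap does not, the same rig flips from CPU-limited at 1440p to GPU-limited at 4K.

```python
# Sketch: resolution multiplies the GPU workload but not the CPU's.
# The per-pixel cost and CPU cap below are invented for illustration.
PIXELS = {"1440p": 2560 * 1440, "4K": 3840 * 2160}

def gpu_fps(resolution, ns_per_pixel=1.5):
    """Hypothetical GPU cap if every pixel costs a fixed time to shade."""
    frame_time_s = PIXELS[resolution] * ns_per_pixel * 1e-9
    return 1 / frame_time_s

cpu_fps_cap = 150  # hypothetical CPU-side cap, independent of resolution

for res in PIXELS:
    print(res, round(min(cpu_fps_cap, gpu_fps(res))), "fps")
# -> 1440p is CPU-limited (150 fps), 4K is GPU-limited (~80 fps) here
```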


To answer the OP though: it's complicated (see above), but generally, yeah. If you're throwing a top-end card at 4K, you can probably go for a mid-to-high-end CPU and still see everything getting good usage.
 
As the title says, I see this thrown around a lot. Is it completely true, or only to some degree?
Is it always wise to spend the majority on the graphics card over the CPU?
Can it ever be the case that you need a high-end CPU and a high-end GPU?
I'm not at 4K, I'm on 1440p. But if I were at 4K, did I make a mistake getting a 7800X3D and a 7900 XTX when I could have got a 7700 and a 4090? Please don't turn this into an Nvidia vs AMD thing; I'm just noting that I could have got a better GPU if I were a 4K user.
I agree, don't waste your money upgrading like I did. I moved from a 5900X to a 7800X3D. The cache helped in some games, where I got an extra 20 fps, but as the 4090 struggles with top-end AAA games anyway at native 4K, what's the point? I altered the settings to get better performance with the bullsh!t voodoo frame generation. I'm using a 4090 Strix OC as well. The games I play see the 4090 maxed out and the CPU chugging away at 20%... pointless :/

The other issue is game optimisation. Many games are ported from console to PC and, as you know, they suck. The Callisto Protocol ran great, with no stutters, on my 6900 XT; it runs like garbage on the 4090, where yesterday, with FSR running, it dropped to 17 fps for a second. The whole industry is about who sponsors the game, whether it's optimised well, and what resolution you're using. I expected to run all games at native 4K with the 4090; I was wildly wrong.
 
Definitely not true anymore. With powerful cards that can pump out high frames at 4k, the cpu can absolutely be a massive bottleneck, especially in some games (check out Starfield benches for one).

But to be honest, it has always been true to a certain extent/in certain scenarios or parts of a game.

I had a 2500K, and in Crysis 3 in the long grass my FPS tanked. I swapped to an i7 3770K, and the added hyperthreading literally almost doubled my FPS at that point of the game.
Crysis. Now I have to go install Warhead for some nice nostalgia. I'm sure my GT 1030 will power through it :p
 
Average FPS is, as usual, a poor metric to use.

The main effect of a weak or weaker CPU is on minimum FPS rather than the average, provided the game actually has CPU-intensive parts.
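The averages-versus-minimums point is easy to show numerically. This is a sketch with invented frame times: a run that looks fine on average can still contain CPU-bound spikes that dominate the experience.

```python
# Why average FPS hides stutter: invented frame times in milliseconds,
# mostly smooth with a few CPU-bound spikes.
frame_times_ms = [10.0] * 95 + [50.0] * 5  # 95 smooth frames, 5 stutters

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# "1% low": frame rate over the slowest 1% of frames (here, the worst one).
worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
low_fps = 1000 * len(worst) / sum(worst)

print(f"average: {avg_fps:.0f} fps, 1% low: {low_fps:.0f} fps")
# -> average: 83 fps, 1% low: 20 fps
```

Tools like PresentMon and CapFrameX report these percentile lows directly, which is why reviewers quote them alongside averages.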

The games I play see the 4090 maxed out and the CPU chugging away at 20%... pointless :/ The other issue is game optimisation. Many games are ported from console to PC and, as you know, they suck. The Callisto Protocol ran great, with no stutters, on my 6900 XT; it runs like garbage on the 4090, where yesterday, with FSR running, it dropped to 17 fps for a second. The whole industry is about who sponsors the game, whether it's optimised well, and what resolution you're using. I expected to run all games at native 4K with the 4090; I was wildly wrong.
Rather than blaming optimisation, could the Nvidia driver's higher dependence on the CPU be the reason?
In DX12, the difference in CPU usage between Nvidia's and AMD's drivers is meant to be quite noticeable.
 
I agree, don't waste your money upgrading like I did. I moved from a 5900X to a 7800X3D. The cache helped in some games, where I got an extra 20 fps, but as the 4090 struggles with top-end AAA games anyway at native 4K, what's the point? I altered the settings to get better performance with the bullsh!t voodoo frame generation. I'm using a 4090 Strix OC as well. The games I play see the 4090 maxed out and the CPU chugging away at 20%... pointless :/ The other issue is game optimisation. Many games are ported from console to PC and, as you know, they suck. The Callisto Protocol ran great, with no stutters, on my 6900 XT; it runs like garbage on the 4090, where yesterday, with FSR running, it dropped to 17 fps for a second. The whole industry is about who sponsors the game, whether it's optimised well, and what resolution you're using. I expected to run all games at native 4K with the 4090; I was wildly wrong.

I get that, to a point. I run at 4K, but have the option on a second monitor to run ultrawide. I use a 4080 along with a 12700K overclocked to 5.1 GHz. Running some indie games that I enjoy, one in particular, I notice similar overall CPU loading to what you describe, but individual cores can be almost maxed out at times. The one indie game I play a lot does seem to be coded, according to the devs, to utilise three cores, so it benefits from a high-clocked CPU with pretty decent IPC.
I often see games favouring a few cores while the overall loading stays pretty low. There are some exceptions that seem pretty multi-core aware, even using the E-cores, Civ VI being one of them.
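The low "overall" CPU figure is worth unpacking with a quick hypothetical calculation: overall usage averages across every thread, so a game hammering three threads of a 16-thread chip still reads as light load.

```python
# Why ~20% "CPU usage" can still mean a CPU bottleneck: overall usage
# averages across all logical cores. Figures below are hypothetical.

def overall_usage(per_core_usage):
    """Mean utilisation across all logical cores, as task managers report."""
    return sum(per_core_usage) / len(per_core_usage)

# A 16-thread CPU where a game pegs 3 threads and barely touches the rest.
cores = [100, 100, 100] + [5] * 13
print(f"{overall_usage(cores):.0f}% overall")  # -> 23%, yet 3 cores are maxed
```

So a per-core view (as in RivaTuner or Task Manager's logical-processor graphs) is a more honest indicator of a CPU limit than the single overall percentage.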

The GPU is not maxed out at all in that particular game either, maybe suggesting internal game timers, optimisations, or limitations of the game engine. Some games seem to be like that, meaning diminishing returns when throwing new hardware at them to try and improve their performance.
 
I agree, don't waste your money upgrading like I did. I moved from a 5900X to a 7800X3D. The cache helped in some games, where I got an extra 20 fps, but as the 4090 struggles with top-end AAA games anyway at native 4K, what's the point? I altered the settings to get better performance with the bullsh!t voodoo frame generation. I'm using a 4090 Strix OC as well. The games I play see the 4090 maxed out and the CPU chugging away at 20%... pointless :/ The other issue is game optimisation. Many games are ported from console to PC and, as you know, they suck. The Callisto Protocol ran great, with no stutters, on my 6900 XT; it runs like garbage on the 4090, where yesterday, with FSR running, it dropped to 17 fps for a second. The whole industry is about who sponsors the game, whether it's optimised well, and what resolution you're using. I expected to run all games at native 4K with the 4090; I was wildly wrong.
I upgraded from a 6700K with 16 GB DDR4 and an Nvidia 1080 a couple of weeks ago; massive jump in performance. I'm not used to leaving upgrades that long, but it's made me feel like I shouldn't upgrade until I actually need to now, rather than because I want a new shiny.
Just in awe at the jump in performance; most games have gained literally an extra 100 fps.
 
I came on here to ask pretty much the same question! I play Warzone 2 a fair bit on PC, running 16 GB RAM, a Ryzen 3600, and a 3080 Ti at 4K on a 120 Hz monitor. According to RivaTuner statistics I'm seeing CPU usage at 100%, and my FPS will drop to 1 or 2 for a few seconds. I've changed a few in-game settings so it's less heavy on the CPU, and set the AMD power plan to max power rather than balanced. I've looked and can't find any comparison between the 3600 and 5800X3D at 4K. Or is there a better AM4 CPU for me?
 
I upgraded from a 6700K with 16 GB DDR4 and an Nvidia 1080 a couple of weeks ago; massive jump in performance. I'm not used to leaving upgrades that long, but it's made me feel like I shouldn't upgrade until I actually need to now, rather than because I want a new shiny.
Just in awe at the jump in performance; most games have gained literally an extra 100 fps.
Wow, I bet that was a great upgrade. I went from my 6700K 2015 build to a 13900K, a 4090 and 32 GB DDR5 RAM a few months ago. I still haven't installed an OS to see the difference yet; I guess I'll be in awe like you :)
 
I came on here to ask pretty much the same question! I play Warzone 2 a fair bit on PC, running 16 GB RAM, a Ryzen 3600, and a 3080 Ti at 4K on a 120 Hz monitor. According to RivaTuner statistics I'm seeing CPU usage at 100%, and my FPS will drop to 1 or 2 for a few seconds. I've changed a few in-game settings so it's less heavy on the CPU, and set the AMD power plan to max power rather than balanced. I've looked and can't find any comparison between the 3600 and 5800X3D at 4K. Or is there a better AM4 CPU for me?
You're in a good position to drop in a cheap 5800X or X3D and be good for a fair while.
 
The 5800x seems pretty cheap tbf. Much less than the 5800x3d. I'll have a look now. Thank you
The 5700X is decent, although I'd go for something with a little more performance, especially if pairing it with an Nvidia card, as they offload more work to the CPU.
 
As someone in a unique position, I can confirm the 7950X3D sometimes holds back the 4090 at 8K in CS2 on low settings (certain maps). So yes, "the CPU never bottlenecks at high resolution" isn't always true, even if it's only by 5-10%.
 