CPU for 4K Gaming Misunderstanding

Hey there - sorry for the ‘noob PSA’ but I’ve held an incorrect assumption for a couple of years and wanted to share so people avoid the same ‘mistake’.

If you look at benchmarking videos at 4K, there often tends to be very little difference in FPS from ‘upgrading the processor’ - on the basis that the system is GPU-bound. This is true, but it led me to believe (and others… I’m not alone!) that you don’t need to worry about the CPU that much for gaming at 4K.

But in truth, if you’re aiming for a particular FPS (say 120 fps at 4K) then it does matter, because with a faster processor you will often ‘unlock’ more FPS when you reduce the graphics settings.
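
To put some rough numbers on it: the frame rate you actually see is capped by whichever of the CPU or GPU is slower for each frame, so a simple way to think about it is delivered FPS ≈ min(CPU-limited FPS, GPU-limited FPS). Here’s a tiny Python sketch of that idea - all the figures are made up purely for illustration, not taken from any real benchmark:

# Toy model: delivered FPS is capped by whichever of CPU or GPU is slower.
# All numbers below are made up for illustration only - not real benchmark results.
cpu_limited_fps = {"slower CPU": 95, "faster CPU": 140}        # hypothetical CPU-bound caps
gpu_limited_fps = {"Ultra": 60, "High": 100, "Medium": 150}    # hypothetical GPU-bound caps at 4K

for preset, gpu_fps in gpu_limited_fps.items():
    for cpu, cpu_fps in cpu_limited_fps.items():
        delivered = min(cpu_fps, gpu_fps)   # the slower component sets the frame rate
        print(f"{preset:6} + {cpu}: ~{delivered} fps")

At Ultra both rows come out at ~60 fps (GPU-bound), so the two CPUs look interchangeable; at Medium the gap opens to 95 vs 140 fps because the CPU cap takes over - which is exactly the ‘unlocking’ effect when you drop settings or turn on DLSS.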

This vid explains it well with graphs etc. Just thought I’d share because it’s so obvious and I’ve never thought of it that way before - derp!

 
for 4K gaming the CPU is not as important as the GPU...
but yes, a better CPU does a better job... that's a no-brainer

That a better CPU does a better job was never in doubt. The misconception is that the extent of the better job is ‘nominal’ and ‘not worth going for the top model etc.’ at 4K.

This is a ‘personal preference for value’ thing obviously, but the main point is: “don’t make your decision based on Ultra-setting benchmarks if you actually want to hit FPS higher than those benchmarks - because when you turn down graphics settings / turn on DLSS, the better CPUs will show more significant gains”.

 
There comes a point where most graphically intensive games won’t benefit from higher-end CPUs.

Yup, this especially applies when the settings/preset cause a GPU bottleneck (i.e. maxed out).

I think this screengrab demonstrates the point I was trying to explain in the OP quite well:

[Screengrab: wMqWPYs.jpeg]


^^^ it's obvious that the 'better CPU is better', but the bit that's interesting (to me :o) is that you start getting much more bang for buck when you turn the settings down. If you just focussed on the Ultra settings - which many benchmarks often do - you wouldn't realise this. So this is relevant if high FPS is more important than 'all bells and whistles' being turned on in the in-game graphics settings.

The actual gains will vary from game to game of course.
 
Am I missing something here? Have you only just found out that lower settings give more FPS?

Yes, you are missing something - please read the posts.

Is anyone buying a 4090 and a high-end CPU to run low settings at 4K?

No. But...

If game X benchmarks out, with maxed-out settings and a 4090, natively at 60 fps with a better CPU and 55 fps with a lower-rated CPU, someone may take that info to assume that the better CPU doesn't give bang for buck (it's just 5 fps, big whoop). However, the gap may increase pretty dramatically as you reduce settings.

If you were aiming to hit at least 90 fps anyway, because that's what you value, then you're probably going to hit that with a nicer-looking game on the better CPU.

Edit: I'm not claiming this to be some sort of 'mythical elite info', but it's something that I myself have overlooked when considering CPU options, because I've just focussed on looking at benchmarks with 'maxed-out settings', which don't show a complete set of information - information that might lead to a different purchasing decision.
 
I'd say the new high-refresh-rate monitors are the main reason our old assumptions about 4K gaming have had to change.

So long as 60 fps was hit, that used to be fine, but gamers won't tolerate that any more - and not only because the lows of a 60 fps average make the overall experience unpleasant, though that's an important reason the X3D CPUs are preferred with high-end GPUs.

Yes. I'm gaming at 4K on an OLED TV with a 120 Hz refresh rate. Anything from 90-120 fps is within the 'good' range for me personally, but I can see/feel it more once it drops below that. It does depend on the type of game though. Some games look a bit more 'cinematic' at a lower FPS.

I don’t think I am.

:o you literally are missing my point if you think I've just figured out that "lower settings give more FPS" - that is, as painful as it is to clarify, not something that I have "just figured out".

See the remainder of my above post for further explanation, but there is probably nothing to add beyond this (setting it out again here):

If game X benchmarks out, with maxed-out settings and a 4090, natively at 60 fps with a better CPU and 55 fps with a lower-rated CPU, someone may take that info to assume that the better CPU doesn't give bang for buck (it's just 5 fps, big whoop). However, the gap may increase pretty dramatically as you reduce settings.

If you were aiming to hit at least 90 fps anyway, because that's what you value, then you're probably going to hit that with a nicer-looking game on the better CPU.

Edit: I'm not claiming this to be some sort of 'mythical elite info', but it's something that I myself have overlooked when considering CPU options, because I've just focussed on looking at benchmarks with 'maxed-out settings', which don't show a complete set of information - information that might lead to a different purchasing decision.

If it's not helpful info for you, then that's OK.
 
@Craig_d1 yup, that makes sense and is a good point: it's most likely to be relevant when someone is building from scratch, or making an ad-hoc upgrade, in a way that leads to an 'unbalanced system'. If you're going to 'go big' with a new build then you're just going to 'go big' = no issue.

(I didn't expressly acknowledge it, but this is what @ICDP also mentioned in an earlier post - again, it's a good point)
 