CPU for 4K Gaming Misunderstanding

Hey there - sorry for the 'noob PSA', but I've held an incorrect assumption for a couple of years and wanted to share it so people can avoid the same 'mistake'.

If you look at benchmarking videos for 4K, there often tends to be very little difference in FPS from 'upgrading the processor', because the system is GPU bound. This is true, but it led me to believe (and others… I'm not alone!) that you don't need to worry about the CPU that much for gaming at 4K.

But in truth, if you're aiming for a particular FPS (say 120 FPS at 4K) then it does matter, because with a faster processor you will often 'unlock' more FPS when you reduce the graphics settings.

This vid explains it well with graphs etc. Just thought I’d share because it’s so obvious and I’ve never thought of it that way before - derp!
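To put rough numbers on the idea, here's a minimal sketch of the 'bottleneck' model - delivered FPS is capped by whichever of the CPU or GPU is slower. All the figures are hypothetical, purely to illustrate the shape of the effect, not taken from any benchmark:

```python
# Toy bottleneck model: delivered FPS = min(CPU limit, GPU limit).
# All numbers are hypothetical, purely for illustration.

# CPU-side FPS limit: roughly constant regardless of graphics settings.
cpu_limits = {"slower CPU": 70, "faster CPU": 130}

# GPU-side FPS limit: rises as settings are turned down / DLSS is enabled.
gpu_limits = {"ultra": 60, "high": 90, "medium": 120, "low + DLSS": 160}

for preset, gpu_fps in gpu_limits.items():
    results = ", ".join(
        f"{cpu}: {min(cpu_fps, gpu_fps)} FPS"
        for cpu, cpu_fps in cpu_limits.items()
    )
    print(f"{preset:>10} -> {results}")
```

At ultra both CPUs deliver the same 60 FPS (fully GPU bound), but at medium the gap opens up to 70 vs 120 - which is the 'unlocking' effect described above.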

 
For 4K gaming the CPU is not as important as the GPU...
but yes, a better CPU does a better job... that's a no-brainer.

That a better CPU does a better job was never in doubt. The misconception is that the extent of the better job is 'nominal' and 'not worth going for the top model etc' at 4K.

This is obviously a 'personal preference for value' thing, but the main point is: "don't make your decision based on ultra-setting benchmarks if you actually want to hit FPS higher than those benchmarks - because when you turn down graphics settings / turn on DLSS, the better CPUs will show more significant gains".

 
Depending on where the bottleneck is (sometimes it isn't purely GPU or CPU raw horsepower - it can be things like API throughput limits due to minimum batching requirements, etc.), a faster GPU will require more CPU horsepower even at 4K - it's all relative - and the same applies if you decrease settings. But at nominally high or ultra settings at 4K, the bottleneck is normally hugely skewed towards the GPU.
 
I gave up after a while; so many contradictions, and the entire premise is a straw man. Nobody would ever claim a 3600 will give the same 4K performance as a 7800X3D when paired with any decent GPU.

It should be well known by most half-savvy PC builders that balance is key. Don't pair a top-end GPU with a low-end CPU, and of course the same applies the other way around.

The real argument has always been that any decent mid-to-top-range GPU will provide a decent 4K experience. There comes a point where most graphically intensive games won't benefit from higher-end CPUs.
 
There comes a point where most graphically intensive games won't benefit from higher-end CPUs.

Yup, and this especially applies when the settings / preset cause a GPU bottleneck (i.e. it's maxed out).

I think this screengrab demonstrates the point I was trying to explain in the OP quite well:

[screengrab: CPU benchmark comparison across graphics presets]


^^^ It's obvious that the 'better CPU is better', but the bit that's interesting (to me :o) is that you start getting much more bang for buck when you turn the settings down. If you just focussed on the Ultra settings - which many benchmarks do - you wouldn't realise this. So this is relevant if high FPS is more important to you than having 'all bells and whistles' turned on in the in-game graphics settings.

The actual gains will vary from game to game of course.
 
Am I missing something here? Have you only just found out that lower settings give more FPS?

Yes, you are missing something - please read the posts.

Is anyone buying a 4090 and a high-end CPU to run low settings at 4K?

No. But...

If game X benchmarks, with maxed-out settings and a 4090, at a native 60 FPS with a better CPU and 55 FPS with a lower-rated CPU, someone may take that info and assume that the better CPU doesn't give bang for buck (it's just 5 FPS, big whoop). However, the gap may widen pretty dramatically as you reduce settings.

If you were aiming to hit at least 90 FPS anyway, because that's what you value, then you're probably going to hit it with a nicer-looking game on the better CPU.

Edit: I'm not claiming this to be some sort of 'mythical elite info', but it's something that I myself have overlooked when considering CPU options, because I've just focussed on benchmarks with 'maxed out settings', which don't show a complete set of information - information that may lead to a different purchasing decision.
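One way to sanity-check this is with a frame-time budget: to sustain a target FPS, both the CPU's and the GPU's share of each frame must fit inside the budget. A minimal sketch - all the frame times are hypothetical, just loosely matched to the 55 FPS / 90 FPS figures above:

```python
# Frame-time budget check. The delivered FPS is set by the slower of the
# CPU and GPU per-frame costs. All frame times below are hypothetical.

target_fps = 90
budget_ms = 1000 / target_fps  # ~11.1 ms per frame

gpu_frame_ms = 9.5  # hypothetical GPU cost after reducing settings at 4K
cpu_frame_ms = {"lower-rated CPU": 18.2, "better CPU": 8.3}

for cpu, cpu_ms in cpu_frame_ms.items():
    delivered_fps = 1000 / max(cpu_ms, gpu_frame_ms)
    verdict = "hits" if delivered_fps >= target_fps else "misses"
    print(f"{cpu}: ~{delivered_fps:.0f} FPS -> {verdict} the {target_fps} FPS target")
```

Here the lower-rated CPU stays stuck around 55 FPS however far you drop the settings, while the better CPU sails past 90 - exactly the gap that ultra-only benchmarks hide.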
 
If you look at benchmarking videos for 4K, there often tends to be very little difference in FPS from 'upgrading the processor', because the system is GPU bound. This is true, but it led me to believe (and others… I'm not alone!) that you don't need to worry about the CPU that much for gaming at 4K.
I'd say the new high-refresh-rate monitors are the main reason that our old assumptions about 4K gaming have had to change.

So long as 60 FPS was hit, that was fine, but gamers won't tolerate that any more - not least because the lows of a 60 FPS average make the overall experience unpleasant, which is an important reason the X3D CPUs are preferred with high-end GPUs.
 
I'd say the new high-refresh-rate monitors are the main reason that our old assumptions about 4K gaming have had to change.

So long as 60 FPS was hit, that was fine, but gamers won't tolerate that any more - not least because the lows of a 60 FPS average make the overall experience unpleasant, which is an important reason the X3D CPUs are preferred with high-end GPUs.

I would disagree personally. I am happy with anything over my minimum VRR and never play competitive games. I can feel a difference between 50 FPS and 144 FPS, of course; I just don't care.

I have a 7900X3D with a 4080, and many games get well over 100 FPS at 4K. I actually cap them at 60 FPS to help reduce heat. It's a personal preference.
 
I would disagree personally. I am happy with anything over my minimum VRR and never play competitive games. I can feel a difference between 50 FPS and 144 FPS, of course; I just don't care.
Sure, I don't mean that every gamer wants/needs 60+.

But,
I have a 7900X3D with a 4080, and many games get well over 100 FPS at 4K. I actually cap them at 60 FPS to help reduce heat. It's a personal preference.
A 60 FPS cap on a PC that is capable of 100+ FPS is going to give a much smoother experience than a PC that is only capable of a 60 FPS average.
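To see why, here's a toy simulation (the ±25% frame-time jitter is a completely made-up assumption) comparing how often each rig blows the 16.7 ms budget of a 60 FPS cap:

```python
import random

# Toy simulation: a rig capable of ~105 FPS but capped at 60 rarely
# misses the 16.7 ms frame budget; a rig that only averages 60 FPS
# misses it constantly. The ~25% jitter is a made-up assumption.
random.seed(0)
budget_ms = 1000 / 60  # 16.7 ms budget at a 60 FPS cap

def frame_times(mean_fps, n=10_000, jitter=0.25):
    mean_ms = 1000 / mean_fps
    return [random.gauss(mean_ms, mean_ms * jitter) for _ in range(n)]

for label, capability_fps in [("60 FPS average rig", 60),
                              ("100+ FPS rig capped at 60", 105)]:
    misses = sum(t > budget_ms for t in frame_times(capability_fps))
    print(f"{label}: {misses / 10_000:.1%} of frames miss the budget")
```

The capped machine has headroom on nearly every frame, so the cap is actually held; the 60 FPS average machine spends roughly half its frames over budget, which is what you feel as stutter.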
 
I would disagree personally. I am happy with anything over my minimum VRR and never play competitive games. I can feel a difference between 50 FPS and 144 FPS, of course; I just don't care.

I have a 7900X3D with a 4080, and many games get well over 100 FPS at 4K. I actually cap them at 60 FPS to help reduce heat. It's a personal preference.
That's the crux though... competitive games. For those that do play them, FPS is everything: they'll turn down the game settings so it looks ***p just to get the extra FPS, and they go for 1080p monitors with 360Hz or higher refresh rates.
For you, and me for that matter, the game should look as good as possible. I'm not shelling out a bucketful of cash to look at a game that rivals the original Half-Life in 2024, no matter how good the game is... I want stunning visuals for my cash... but then I want higher FPS too... I'm not picky, I just want it all... all, I say :cry: :cry: (insane laughter follows)
 
I'd say the new high-refresh-rate monitors are the main reason that our old assumptions about 4K gaming have had to change.

So long as 60 FPS was hit, that was fine, but gamers won't tolerate that any more - not least because the lows of a 60 FPS average make the overall experience unpleasant, which is an important reason the X3D CPUs are preferred with high-end GPUs.

Yes. I'm gaming at 4K on an OLED TV with a 120 Hz refresh rate. Anything from 90-120 FPS is within the 'good' range for me personally, but I can see / feel it more once it drops below that. It does depend on the type of game though. Some games look a bit more 'cinematic' at a lower FPS.

I don’t think I am.

:o You literally are missing my point if you think I've just figured out that "lower settings give more FPS" - that is, as painful as it is to clarify, not something that I have "just figured out".

See the remainder of my post above for further explanation, but there is probably nothing to add beyond this (setting it out again here):

If game X benchmarks, with maxed-out settings and a 4090, at a native 60 FPS with a better CPU and 55 FPS with a lower-rated CPU, someone may take that info and assume that the better CPU doesn't give bang for buck (it's just 5 FPS, big whoop). However, the gap may widen pretty dramatically as you reduce settings.

If you were aiming to hit at least 90 FPS anyway, because that's what you value, then you're probably going to hit it with a nicer-looking game on the better CPU.

Edit: I'm not claiming this to be some sort of 'mythical elite info', but it's something that I myself have overlooked when considering CPU options, because I've just focussed on benchmarks with 'maxed out settings', which don't show a complete set of information - information that may lead to a different purchasing decision.

If it's not helpful info for you, then that's OK.
 
I was always asking the question of how low a CPU you can go with a higher-end GPU at 4K, as that used to be the common question. Now, as someone mentioned, it's the higher-res screens and people demanding 100+ FPS as the norm rather than 60.
 
@Nitefly
I get what you're saying with this thread, but I see the other point of view too. My take is that if you have a high-end CPU, turning the settings down a bit when upgrading the GPU down the line will massively increase your FPS and also help prolong the life of your CPU - handy if there's a new release coming and you want to eke out a few more months, etc.
The downside to that argument though is... does it really matter? When you buy your PC in the first place, if you buy a 4090 you're unlikely to buy a 5600X or 12100K for it; you'll be looking at a 14700K/7800X3D or above, so it becomes a little irrelevant. And the argument goes the other way too: if you're going in buying a budget CPU, you're not going to pair it with a 4090 (though I did see one person do this - even when I pointed out he'd be massively CPU bottlenecked, leave about £800 worth of GPU potential unutilised, and that a better CPU with a lesser GPU would give him better FPS, he went and did it anyway).
That's the whole point of getting a balanced system, which generally works out that way anyway.
I think the HU vid just helps visualise the CPU potential, same as them testing games at 1080p when you're looking to play at 4K - it just shows you how much better the top-end CPU is.
 