
CPU doesn't matter for 4k gaming, is that completely true?

Soldato
Joined
30 Jun 2019
Posts
7,875
If there's a CPU bottleneck in a game at lower resolutions like 720p and 1080p, then it definitely matters.

A recent 8 core CPU is a good idea for modern games, particularly as many games are console ports (and current gen consoles have 8 CPU cores).

V-cache CPUs mostly matter if you want to play well above 60 FPS (in most titles).

If you play at 4K, it probably does make more sense to spend money on the GPU, but the RTX 4090 FE is priced over £1,500, so it's really not a good deal.

In my opinion, neither the RTX 4090 nor v-cache CPUs offer good value. V-cache is probably a good option as a used product, once prices come down a lot.

I'm not sure about the 5800X3D; in general, I think making the switch to DDR5 is a good idea these days (now that prices have come down a lot).

I wouldn't worry about having an RX 7900 XTX; it's still a far better GPU than what's in roughly 99% of desktop PCs.

These cards are a bit overpriced, but it's nowhere near as bad as what we've seen with the RTX 4080 and 4090.
 
Last edited:
Soldato
Joined
1 Apr 2014
Posts
18,657
Location
Aberdeen
As per the title: I see this thrown around a lot. Is it completely true, or only to some degree?

True to a significant degree. It does depend upon the games you are playing and your GPU.

Games tend to be GPU-limited at 4K. I upgraded my A770 system from an 8700 to a Ryzen 7700 and saw little improvement at 4K medium settings in HZD and Tomb Raider. However, where you will see improvements - as I did - is in the 1% lows. My main PC went from a 3090 to a 4090, and the improvements there were from the GPU.
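
For anyone unfamiliar with the 1% lows metric, here's a minimal sketch of how it's commonly computed - capture tools differ in the exact method, and the frame-time numbers below are made up for illustration:

```python
# Minimal sketch of how "1% lows" are commonly computed (capture tools
# differ in the details): the average framerate of the slowest 1% of frames.

def one_percent_low(frame_times_ms: list[float]) -> float:
    slowest = sorted(frame_times_ms, reverse=True)   # longest frames first
    worst = slowest[: max(1, len(slowest) // 100)]   # slowest 1% of frames
    avg_ms = sum(worst) / len(worst)
    return 1000.0 / avg_ms                           # mean frame time -> fps

# Made-up capture: mostly 10 ms frames (~100 fps) with occasional 25 ms stutters.
frames = [10.0] * 990 + [25.0] * 10
print(f"average fps: {1000.0 * len(frames) / sum(frames):.0f}")  # ~99 fps
print(f"1% low:      {one_percent_low(frames):.0f} fps")         # 40 fps
```

The average barely moves with a handful of stutters, which is why the 1% low number shows a CPU upgrade where the average fps doesn't.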

OTOH if you are playing very old games then you will see framerate improvements to the extent your monitor allows.

There's something else: playing at 4K on a 27" monitor is a qualitatively better experience than 1440p, though it's not something that's immediately obvious.
 
Soldato
OP
Joined
22 Oct 2004
Posts
13,385
True to a significant degree. It does depend upon the games you are playing and your GPU.

Games tend to be GPU-limited at 4K. I upgraded my A770 system from an 8700 to a Ryzen 7700 and saw little improvement at 4K medium settings in HZD and Tomb Raider. However, where you will see improvements - as I did - is in the 1% lows. My main PC went from a 3090 to a 4090, and the improvements there were from the GPU.

OTOH if you are playing very old games then you will see framerate improvements to the extent your monitor allows.

There's something else: playing at 4K on a 27" monitor is a qualitatively better experience than 1440p, though it's not something that's immediately obvious.
Sorry squirtz, what did you mean by your last paragraph, about 27" monitors?
 
Soldato
Joined
14 Aug 2009
Posts
2,811
If you want 60 fps at 4K but your CPU can only manage 50 fps, it doesn't matter whether it's 720p or 4K - it will still do 50 fps.
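
A toy way to picture it (all numbers invented for illustration): the framerate you actually get is the lower of what the CPU can prepare and what the GPU can render at that resolution.

```python
# Toy bottleneck model: a frame is only finished once both the CPU work and
# the GPU work for it are done, so the slower side sets the framerate.
# All numbers below are invented for illustration.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_fps = 50.0  # this CPU can only prepare 50 frames per second
gpu_fps_by_res = {"720p": 240.0, "1080p": 160.0, "1440p": 110.0, "4K": 60.0}

for res, gpu_fps in gpu_fps_by_res.items():
    print(f"{res}: {effective_fps(cpu_fps, gpu_fps):.0f} fps")
# Prints 50 fps at every resolution: the CPU caps the framerate regardless.
```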

Also worth noting that with Surround or Eyefinity, CPU usage will increase as the FoV increases, so if you're locked to 60 fps on a single screen and the processor is quite close to that 60 fps limit, on multiple screens you can easily drop below it. The same goes for ultrawide, but to a lesser extent, I suppose.

Also, just because a game is DX12/Vulkan doesn't mean it is properly multithreaded on the render side.
 
Last edited:
Soldato
Joined
15 Oct 2019
Posts
11,709
Location
Uk
With technologies like DLSS and FSR becoming much more prevalent and rendering games at lower internal resolutions, plus GPU performance advancing at a greater pace than CPU performance over the past 10 years, I'd say a high-end CPU is more important these days if you're using a high-end GPU.
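
To make that concrete, here's a rough sketch of what upscalers do to the internal render resolution. The per-axis scale factors below are the commonly quoted defaults for DLSS/FSR quality modes, so treat them as approximate:

```python
# Rough sketch: upscalers render internally at a fraction of the output
# resolution, so GPU load drops while per-frame CPU work stays the same -
# which pushes the bottleneck back toward the CPU. Scale factors are the
# commonly quoted per-axis defaults; treat them as approximate.

SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(width * s), round(height * s)

for mode in SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"4K output, {mode}: GPU renders {w}x{h}")
# Performance mode renders 1920x1080 internally, so a CPU that bottlenecked
# a game at native 1080p will bottleneck it at "4K" with DLSS Performance too.
```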
 
Associate
Joined
31 Oct 2007
Posts
251
I've just dropped in a 4070 Ti and I aim for 4K with DLSS. I'm trying to work out whether it's worth upgrading my 5600X @ 4.8GHz to a 5800X3D with lower clocks.
 
Associate
Joined
1 Jun 2012
Posts
193
Location
Hereford
The 5700X is decent, although I'd maybe go for something with a little more performance, especially if pairing it with an Nvidia card, as they offload more work to the CPU.
I'm worried that I'd still hit 100% CPU usage. It is, however, a common problem if you Google it, but I'm sure I never had that issue before. Perhaps because I had a lesser GPU? I went from a 1080 Ti to a 2080 Ti (that's when the issues started - both at 1440p, mind) to now a 3080 Ti @ 4K, but I'm still hitting 100% usage.
 
Soldato
Joined
18 Feb 2015
Posts
6,485
It was never true, because resolution doesn't determine the framerate target - the player does. As for whether you should've gone 4090 over 7900 XTX but with a 7700 vs a 7800X3D - yeah, that was a major misstep on your part. The difference between a 4090 and a 7900 XTX can be astronomical in some scenarios, plus GPUs tend to lose value quicker, so you'd want to get the most out of the latest gen while they're current. CPUs, on the other hand, hold their value well overall, there's a lower cost involved, and upgrading is easy on the AM5 platform. In your shoes I would probably try to sell and do a swap.

I've just dropped in a 4070 Ti and I aim for 4K with DLSS. I'm trying to work out whether it's worth upgrading my 5600X @ 4.8GHz to a 5800X3D with lower clocks.
A 5600X will have issues holding even a stable 60 fps in certain games, so an upgrade to a 5800X3D would be massive, especially considering how cheap it is nowadays.
 
Soldato
Joined
28 May 2007
Posts
18,293
I'm worried that I'd still hit 100% CPU usage. It is, however, a common problem if you Google it, but I'm sure I never had that issue before. Perhaps because I had a lesser GPU? I went from a 1080 Ti to a 2080 Ti (that's when the issues started - both at 1440p, mind) to now a 3080 Ti @ 4K, but I'm still hitting 100% usage.

Yeah, the 3600 will be holding back performance, particularly with an Nvidia card. I used a very well-tuned and overclocked 3800X and even that was getting tapped out by a 3080 10GB.

When I upgraded to a 5950X I saw a decent performance jump with the 3080. I'd imagine that in many games a 5800X3D would offer more performance again, especially at 1440p. I'm unsure whether a 5800X3D would fully leverage and tap out your 3080 Ti, but it would definitely be a huge jump in gaming performance over your 3000-series Ryzen.
 
Associate
Joined
1 Jun 2012
Posts
193
Location
Hereford
Yeah, the 3600 will be holding back performance, particularly with an Nvidia card. I used a very well-tuned and overclocked 3800X and even that was getting tapped out by a 3080 10GB.

When I upgraded to a 5950X I saw a decent performance jump with the 3080. I'd imagine that in many games a 5800X3D would offer more performance again, especially at 1440p. I'm unsure whether a 5800X3D would fully leverage and tap out your 3080 Ti, but it would definitely be a huge jump in gaming performance over your 3000-series Ryzen.
I play at 4K now. I'm close to getting a 5800X. Just need that bit of reassurance that the CPU limit of my 3600 will go!
 
Soldato
Joined
28 Jun 2013
Posts
3,696
You have to compare benchmark results, since it varies game to game; it matters less the higher the resolution.

That is why many get mid-range CPUs for gaming, like an i5 instead of an i7/i9.
 
Man of Honour
Joined
18 Oct 2002
Posts
100,415
Location
South Coast
There's always going to be SOME games that have weirdness. Starfield's on a very old engine with lots of bolt-ons. I'm not at all surprised the CPU gets hit a little harder.
Not just that - under no circumstances should a dialogue-only cutscene push a 20-thread CPU to 75% utilisation. The game is categorically bugged for CPU utilisation and should never be used as a metric for benchmarking CPUs, or anything for that matter. The same goes for a bunch of other games that have out-of-whack CPU usage due to poor optimisation. As far as I am concerned, Cyberpunk as of today remains the only game engine that maximises utilisation of both CPU and GPU in a positive way, and as a result the scaling between different hardware is excellent. The thrashing CDPR got for 2077's launch forced them to essentially fix the engine to what it is now, only to abandon it in favour of UE5, which continues to have CPU and optimisation issues :o

MS Flight Sim is another one - all of those CPU tasks rely on single threading rather than multithreading. Apart from the plane details, the rest of the game looks rather average and runs at sub-100 fps on a 4090 at 3440x1440. A poorly optimised engine once again, one that relies on CPU brute force of the non-multi-threaded variety.

I've been lucky enough to have had the 12700KF since launch and used it through 3 GPUs: a 2070 Super, a 3080 Ti FE and the 4090. The performance uplift from the GPU upgrades alone has been enormous, with the 4090 now showing performance in games such that I won't need a CPU upgrade for the foreseeable future if I stick to 100-139 fps boundaries (144Hz QD-OLED) - which by all accounts seems likely to remain the case.
 
Last edited:
Soldato
Joined
11 Jun 2003
Posts
5,081
Location
Sheffield, UK
Not just that - under no circumstances should a dialogue-only cutscene push a 20-thread CPU to 75% utilisation. The game is categorically bugged for CPU utilisation and should never be used as a metric for benchmarking CPUs, or anything for that matter. The same goes for a bunch of other games that have out-of-whack CPU usage due to poor optimisation. As far as I am concerned, Cyberpunk as of today remains the only game engine that maximises utilisation of both CPU and GPU in a positive way, and as a result the scaling between different hardware is excellent. The thrashing CDPR got for 2077's launch forced them to essentially fix the engine to what it is now, only to abandon it in favour of UE5, which continues to have CPU and optimisation issues :o

MS Flight Sim is another one - all of those CPU tasks rely on single threading rather than multithreading. Apart from the plane details, the rest of the game looks rather average and runs at sub-100 fps on a 4090 at 3440x1440. A poorly optimised engine once again, one that relies on CPU brute force of the non-multi-threaded variety.

I've been lucky enough to have had the 12700KF since launch and used it through 3 GPUs: a 2070 Super, a 3080 Ti FE and the 4090. The performance uplift from the GPU upgrades alone has been enormous, with the 4090 now showing performance in games such that I won't need a CPU upgrade for the foreseeable future if I stick to 100-139 fps boundaries (144Hz QD-OLED) - which by all accounts seems likely to remain the case.

Was kinda saying the same. Starfield's built on a crappy old engine, so it will be weird as hell, and it's definitely an outlier on the CPU benchies front :D
 