Upgrade from 9900k to Ryzen 5900x right now for gaming or wait for Zen 3d V-cache refresh?

Associate
Joined
25 Apr 2017
Posts
847
I currently have a 9900K (overclocked to 5.0 GHz) with a 3080 Ti and play at 4K. Lately I have been seeing instances of CPU bottlenecks in games like Cyberpunk 2077, Watch Dogs: Legion and Far Cry 6, mainly with DLSS/FSR enabled: GPU usage in certain areas drops to 80-90%, and Far Cry 6 is consistently in that range.

I was planning to wait for Alder Lake until now, but given possible issues with older titles, sky-high DDR5 RAM prices and likely stability issues at launch, I'd rather stick to a more mature platform. Since I am playing at 4K and the bottleneck isn't that large, should I wait for the Zen 3 refresh in 2022, which is supposed to bring a 15% increase in gaming performance, or will a 5900X still get the job done?

I will only be using the PC for gaming, nothing more.
 
Soldato
Joined
9 Jun 2011
Posts
3,446

Waaait lol dude - you've waited until end of life for a 5900X, you're fine! Wait for next-gen Ryzen IMO, but if you can't, then get the 3D V-cache.


Just saw you are gaming at 4K - 0 FPS to gain here...
 
Soldato
Joined
15 Oct 2019
Posts
8,382
Location
Uk
It's a case of poorly optimised games, as others have said, although I'm surprised you had issues in WD: Legion: I found no difference between the 3600 and 5800X at 1440p with a 3080, and the 9900K should be faster than an R5 3600.
[Screenshots attached: Screenshot-16.png, Screenshot-20.png]
 
Soldato
Joined
22 May 2010
Posts
7,860
Agreed, it's the poor optimisation of games; my 9900K in Far Cry 6 sits around 50% usage and barely heats up. Same as you, I've done an all-core overclock to 5.0 GHz as well, paired with a 3090.

Hoping patches get released to fix it, like they did with Watch Dogs: Legion.
 
Soldato
Joined
18 Feb 2015
Posts
5,889
I'd say wait to see Alder Lake results first, but most likely Zen 3D will end up better for gaming. We'll see if the wait is worth the difference.

As for the games, the people crying "unoptimised" simply don't understand game dev; these games truly are that demanding. We're talking open world with either heavy RT (CP2077), which itself punishes the CPU more, or extreme levels of detail and draw distances (Far Cry 6), or a mix of the two (WD:L). Having an Nvidia GPU then compounds the problem, as they are more CPU-bound than AMD GPUs - a difference born of the different choices the vendors made. It worked great for NV with DX11, but now with DX12/VLK it's getting a spanking. So if you buy a high-end NV GPU you are that much more required to also have a state-of-the-art CPU to keep up with it, and if you want RT on it's not like you have a choice - RDNA 2 clearly can't keep up there - so in the end the CPU upgrade is mandatory.
 
Associate
Joined
8 Sep 2020
Posts
475
Don't judge your CPU performance on a clearly broken game, mate :cry: As has been said already, your 9900K is perfectly fine, and you will notice pretty much 0% performance increase by switching unless you're gaming at 1080p, which you are not. What you will notice is a nice dent in your wallet for changing CPU, motherboard and possibly RAM. If you just want to upgrade for the sake of it, at least wait until the 12th Gen reviews and game benchmarks have been released and make a decision from there, but I bet we will still see pretty much no uplift at 4K between what you have now and 12th Gen :)
 
Soldato
Joined
18 Feb 2015
Posts
5,889
You are not understanding, and once again you steam into a thread and try to put others down.

In Far Cry 6 my CPU usage is in the 20% range and my GPU is mostly at 99%; however, the GPU will drop at times to 80% for a second, then go straight back up. Yes, it is a demanding game, but even dropping settings right down this still occurs. If it were my 10900K holding things back, CPU usage would be high and GPU usage lower. Buying a better CPU won't fix a poorly optimised game. Cyberpunk was, and in some ways still is, broken; please don't use that as a benchmark.
No, I do understand - fully. That's why I don't claim unoptimised when talking demanding, and why I have data to back it up. ;)

Far Cry 6 - easy peasy. You can see a single deviation on the frametime graph, which is normal for open-world games; it's most likely just streaming stutter, hardly a big deal. The only other option is very aggressive streaming like Cyberpunk does, but compare that to FC6 and you clearly see the nuked LOD transitions. So there's no free lunch, just a choice between compromises.
[Frametime screenshots attached: VUGeqdq.jpg, HSWzUnD.jpg]

CP2077 - harder, but doable. More consistent frametimes, but as I said before, that required a big sacrifice on LODs.
[Frametime screenshots attached: nYIEVEH.jpg, qplTCeq.jpg]

It really is as simple as people needing better CPUs. If you're telling me you have a 10900K and are not seeing results like these, then you might have to examine your setup; perhaps there's an issue with it you're not aware of. It could work fine in most games, but open worlds like these tend to expose problems that are otherwise dormant.
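The "single deviation on the frametime graph" mentioned above is the kind of spike you can pick out programmatically from a frametime capture (tools like CapFrameX or PresentMon export per-frame times in milliseconds). A rough sketch - the 2.5x-median cutoff is an arbitrary choice of mine, not a standard metric:

```python
from statistics import median

def find_stutters(frametimes_ms, multiplier=2.5):
    """Flag frames whose frame time exceeds multiplier x the median frame time
    (isolated spikes like streaming hitches stand out against the baseline)."""
    baseline = median(frametimes_ms)
    return [i for i, ft in enumerate(frametimes_ms) if ft > multiplier * baseline]

# ~16.7 ms frames (60 FPS) with one 70 ms streaming hitch
trace = [16.7, 16.5, 16.9, 70.2, 16.6, 16.8]
print(find_stutters(trace))  # [3]
```

A handful of isolated hits in a long capture matches the "single deviation, hardly a big deal" reading; regular clusters of them would point at a real stutter problem.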
 
Soldato
Joined
30 Jun 2019
Posts
3,919
All this talk of having nothing to gain from CPU upgrades if you happen to play at 4K is nonsense.

It depends entirely on how CPU-intensive the game is (Total War games with high unit counts, for example). Some games will also load faster, especially turn-based games like Civ 6, but the main factors are always cache and IPC.

It looks like some new games, such as Far Cry 6, will need a very powerful CPU in terms of IPC, and probably L3 cache too.

For most people, saving the money for a GPU upgrade first makes a lot more sense in terms of framerate, especially at 4K. You can get by with a recent 6-core CPU in many, but not all, games.

If you use DLSS or other upscaling to play at 4K, you may only barely be reaching 60-70 FPS in some titles; in those cases I think your CPU can matter quite a bit, but you need to check CPU utilisation in game.

@Shaz12 - It depends how far FPS drops; if it's down to 58-59 I doubt it matters much. The problems you notice in WD: Legion and Cyberpunk may indicate the direction of games released in 2022/2023, e.g. higher CPU utilisation across multiple cores (I have seen the same thing in these games; with Cyberpunk, though, I think the game needs a massive patch or rework, as some settings affect performance intermittently).

If you want to keep using DDR4 RAM, I'd wait for Zen 3 + 3D cache. But the advantage of Intel's new LGA 1700 platform is that you may be able to upgrade to 13th gen after Alder Lake. I would wait to see how good DDR4 support is on Alder Lake - in particular, how high the RAM frequency can go before the memory controller has to run at half speed.
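The "half the speed" point refers to Intel's Gear 1 / Gear 2 memory-controller ratios: DDR4 transfers twice per memory clock, and Gear 2 halves the controller clock relative to that memory clock. A quick sanity-check of the numbers involved (function name is illustrative):

```python
def memory_controller_clock_mhz(ddr_rating, gear=1):
    """DDR4-XXXX names the transfer rate (MT/s); the real memory clock is half
    that, and Gear 2 runs the memory controller at half the memory clock."""
    mclk = ddr_rating / 2   # real memory clock in MHz (double data rate)
    return mclk / gear      # controller clock for Gear 1 or Gear 2

print(memory_controller_clock_mhz(3600, gear=1))  # 1800.0 MHz
print(memory_controller_clock_mhz(3600, gear=2))  # 900.0 MHz
```

The practical question for Alder Lake DDR4 is how far the rating can climb before the board forces Gear 2 and the controller drops to that halved clock.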

DDR5 should be avoided for a couple of years in my opinion; memory-controller support appears to be poor. If you can wait for Zen 4, that should be a pretty substantial upgrade, and maybe any DDR5 problems will be mitigated by the planned 25-30% increase in IPC.
 
Soldato
Joined
28 May 2007
Posts
16,792
From the little AMD have said about Zen 3D, it seems gaming is one of the main aspects AMD are targeting with this part.
 
Soldato
Joined
29 Sep 2010
Posts
4,961
Might be completely missing your point here, but what do these 720p results with RT on have to do with the OP running games at 4K? Everyone knows that the lower the resolution, the bigger the hit the CPU takes.

I've been running Metro Exodus on Ultra at 4K with DLSS and RTX Ultra, with no issues at all, on a 10400F.
 
Soldato
Joined
28 May 2007
Posts
16,792
I’d sit tight until Zen 3D is out and then consider what performance is on offer. A 5900X would be a no-brainer if it were a drop-in upgrade.
 
Soldato
Joined
30 Jun 2019
Posts
3,919
A standard Zen 3 CPU is not a great idea right now; it's definitely worth waiting for the 3D-cache models.
 
Soldato
Joined
30 Jun 2019
Posts
3,919
Nevermind, I see what you mean now.

But either way, he's gonna need a new motherboard if he wants to upgrade. An LGA 1700 + DDR4 motherboard would potentially offer more upgradability (he could use an Alder Lake CPU with 6/8 large cores for now), but there's no guarantee how many large cores 13th gen might have; I'm not sure I believe the current rumours that it will only be 8, just like Alder Lake. I'd have thought at least half of them would be large cores to actually get decent performance (perhaps this will depend on whether further optimisations to Intel's 10nm tech are possible).

Hopefully, Intel will be helpful enough to confirm in the coming months whether 13th gen will work on LGA 1700 Alder Lake boards...

The 13th gen might mean another 10-20% boost in overall performance.

I like the idea of Zen 3 + 3D cache, but the AM4 platform is definitely a dead end. I think it will mostly be aimed at people who already have a later AM4 board, like B550/X570. It's a good option, though, if you really need 12/16 large Zen 3 cores.

I think the early investors in DDR5 are gonna get burned.
 
Soldato
Joined
28 May 2007
Posts
16,792

I'd sit tight until Alder Lake and Zen 3D are out. All Intel will have on LGA 1700 is more Alder Lake with incremental performance increases, and possibly some performance regression to get power consumption under control.

LGA 1700 isn’t going to see years of support like AM4 had. Between now and Nova Lake, Intel will move to a new platform. From Nova Lake onward, Intel should come back into contention on the desktop, but LGA 1700 will be long gone.
 