
Upgrade from 9900K to Ryzen 5900X right now for gaming, or wait for the Zen 3D V-cache refresh?

Discussion in 'CPUs' started by Shaz12, 18 Oct 2021.

  1. Shaz12

    Hitman

    Joined: 25 Apr 2017

    Posts: 828

    I currently have a 9900K (overclocked to 5.0 GHz) with a 3080 Ti and play at 4K, and lately I have been seeing instances of CPU bottlenecks in games like Cyberpunk 2077, Watch Dogs Legion and Far Cry 6, mainly with DLSS/FSR enabled, where GPU usage in certain areas drops to 80-90% (a simple logging sketch is included below). Far Cry 6 is consistently in the 80-90% range.

    I was thinking of waiting for Alder Lake until now, but given the possible issues with older game titles, sky-high prices of DDR5 RAM and likely stability issues at the beginning, I'd rather stick to a more mature platform. Since I am playing at 4K and the bottleneck isn't that huge, should I wait for the Zen 3 refresh in 2022, which is supposed to bring a 15% increase in gaming performance, or will a 5900X still get the job done?

    I will only be using the PC for gaming, nothing more.
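    For anyone wanting to confirm that kind of dip rather than eyeballing an overlay, here's a minimal logging sketch. It assumes an NVIDIA card with nvidia-smi on the PATH; the 1-second interval and 95% threshold are arbitrary choices, not anything from this thread:

```python
# Poll nvidia-smi once a second and print GPU utilisation, flagging readings
# below 95% so sustained dips (a possible CPU or streaming limit) stand out.
import subprocess
import time

def gpu_utilisation() -> int:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

if __name__ == "__main__":
    while True:
        util = gpu_utilisation()
        flag = "  <-- possible CPU limit" if util < 95 else ""
        print(f"{time.strftime('%H:%M:%S')}  GPU {util:3d}%{flag}")
        time.sleep(1)
```

    Run it in the background while playing; repeated flags in the same areas of a game point at a real bottleneck rather than a one-off hitch.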
     
  2. faceman123

    Mobster

    Joined: 9 Jun 2011

    Posts: 3,351


    Waaait lol dude - you waited till the platform's end of life for a 5900X, you're fine! Wait for the next Ryzen generation IMO, but if you can't, then get the 3D V-cache.

    Just saw you are gaming at 4K - 0 fps to gain here...
     
  3. Drollic

    Mobster

    Joined: 24 Feb 2013

    Posts: 3,794

    Location: East Midlands

    Wait for mid-gen or later DDR5 systems - there will be barely anything to gain. It's shoddy games that are the problem.
     
  4. Joxeon

    Sgarrista

    Joined: 15 Oct 2019

    Posts: 7,962

    Location: Uk

    It's a case of poorly optimised games, as others have said, although I'm surprised you had issues in WD Legion, as I found no difference between the 3600 and 5800X @ 1440p with a 3080, and the 9900K should be faster than an R5 3600.
    [attached benchmark screenshots]
     
  5. Jay85

    Sgarrista

    Joined: 22 May 2010

    Posts: 7,715

    Agree it's the poor optimisation of games - my 9900K in Far Cry 6 sits around 50% usage and barely heats up. Same as you, I've done an all-core overclock of 5.0 GHz as well, paired with a 3090.

    Hoping patches get released to fix it like they did with watch dogs legion.
     
  6. Sean473

    Gangster

    Joined: 23 Dec 2020

    Posts: 250

    Far Cry 6 is pretty broken right now - give it 2-3 months and it will be better!
     
  7. Poneros

    Soldato

    Joined: 18 Feb 2015

    Posts: 5,760

    I'd say wait to see Alder Lake results first, but most likely Zen 3D will end up better for gaming. We'll see if the wait is worth the difference.

    As for the games, the people crying "unoptimised" simply don't understand game dev, and these games really are that demanding. We're talking open world with either heavy RT (CP2077), which itself punishes the CPU more, extreme levels of detail and draw distances (Far Cry 6), or a mix of the two (WD:L). Having an Nvidia GPU then compounds the problem, because their drivers are more CPU-bound than AMD's - a difference borne out of the different choices the vendors made. It worked great for NV with DX11, but with DX12/VLK it's getting a spanking. So if you buy a high-end NV GPU you are that much more dependent on a state-of-the-art CPU to keep up with it, and if you want RT on you don't really have a choice, since RDNA 2 clearly can't keep up there - in the end the CPU upgrade is mandatory.
     
  8. Jay-G25

    Gangster

    Joined: 8 Sep 2020

    Posts: 449

    Don't judge your CPU performance on a clearly broken game, mate :cry: As has been said already, your 9900K is perfectly fine and you will notice pretty much a 0% performance increase by switching unless you're gaming at 1080p, which you are not. What you will notice is a nice dent in your wallet from changing CPU, motherboard and possibly RAM. If you just want to upgrade for the sake of it, at least wait until the 12th Gen reviews and game benchmarks have been released and make a decision from there - but at 4K I bet we will still see pretty much no uplift between what you have now and 12th Gen :)
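    As a rough sense of what is actually on the table at 4K - under the simplifying assumption that frame rate scales with GPU busy time once a CPU limit is removed (real scaling also depends on frame pacing and how much RT/driver work lands on the CPU) - the OP's 80-90% GPU usage readings put a ceiling on the possible uplift:

```python
# Back-of-envelope bound: if the GPU is only X% busy because the CPU can't
# keep up, an (imaginary) infinitely fast CPU would at best push it to ~100%
# busy at the same settings, i.e. roughly a 1/X uplift. Crude model only.
for gpu_busy in (0.80, 0.85, 0.90):
    max_uplift = 1 / gpu_busy - 1
    print(f"GPU busy {gpu_busy:.0%} -> at most ~{max_uplift:.0%} more fps from a faster CPU")
```

    So "pretty much nothing" is about right where the GPU sits at 90%, while the worst 80%-busy areas could in theory gain up to ~25% - which is why the per-game behaviour matters more than the resolution alone.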
     
  9. Poneros

    Soldato

    Joined: 18 Feb 2015

    Posts: 5,760

    No, I do understand - fully. That's why I don't claim "unoptimised" when I say these games are demanding, and why I have data to back it up. ;)

    Far Cry 6 - easy peasy. You can see a single deviation on the frametime graph, which is normal for open-world games; it's most likely just streaming stutter - hardly a big deal. The only other option is very aggressive streaming like Cyberpunk does, but then you compare that to FC6 and you can clearly see the nuked LOD transitions. So there's no free lunch, just a choice between compromises.
    [attached Far Cry 6 frametime screenshots]

    CP2077 - harder, but doable. More consistent frametimes, but as I said before, that required a big sacrifice on LODs.
    [attached Cyberpunk 2077 frametime screenshots]

    It really is as simple as people needing better CPUs. If you're telling me you have a 10900K and are not seeing results like these, then you might have to examine your setup - perhaps there's an issue with it that you're not aware of. It could be working fine in most games, but open worlds like these tend to expose problems that are otherwise dormant.
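    For reference, here's a minimal sketch of pulling the same kind of numbers out of a frametime log instead of reading them off a graph. It assumes a PresentMon/CapFrameX-style CSV export; the MsBetweenPresents column name and the '3x the average' hitch rule are assumptions, so adjust both to whatever your capture tool writes:

```python
# Summarise a frametime CSV: average fps, an approximate 1% low, and a crude
# count of hitch frames, to separate one-off streaming stutter from a CPU limit.
import csv
import statistics

def summarise(path: str, column: str = "MsBetweenPresents") -> None:
    with open(path, newline="") as f:
        frametimes = [float(row[column]) for row in csv.DictReader(f)]
    frametimes.sort()
    avg = statistics.fmean(frametimes)
    p99 = frametimes[int(len(frametimes) * 0.99)]     # 99th percentile frametime
    hitches = sum(ft > 3 * avg for ft in frametimes)  # crude spike count
    print(f"avg ~{1000 / avg:.0f} fps, 1% low ~{1000 / p99:.0f} fps, "
          f"{hitches} frames worse than 3x the average")

if __name__ == "__main__":
    summarise("run.csv")  # hypothetical capture file name
```

    A run with a handful of isolated hitches but healthy 1% lows looks like streaming stutter; consistently poor 1% lows with the GPU under-utilised looks like the CPU.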
     
  10. humbug

    Caporegime

    Joined: 17 Mar 2012

    Posts: 39,318

    K for core ^^^^^^ :D
     
  11. g67575

    Mobster

    Joined: 30 Jun 2019

    Posts: 3,161

    All this talk of having nothing to gain with CPU upgrades if you happen to play at 4K is nonsense.

    It depends entirely on how CPU-intensive the game is (Total War games with high unit counts, for example). There are also games that will load faster, especially turn-based games like Civ 6, but the main factors are always cache and IPC.

    It looks like some new games, such as Far Cry 6, will need a very powerful CPU, in terms of IPC and probably L3 cache too.

    For most people, saving the money for a GPU upgrade first makes a lot more sense in terms of framerate, especially at 4K. You can get by with a recent 6-core CPU in many, but not all, games.

    If you use DLSS or upscaling to play at 4K, you may only barely be reaching 60-70 FPS in some titles. In these cases I think your CPU can matter quite a bit, but you need to check CPU utilization in game (a simple per-core logging sketch is included at the end of this post).

    @Shaz12 - It depends how much the FPS drops; if it's only down to 58-59 I doubt it matters much. The problems you notice in WD: Legion and Cyberpunk may indicate the direction of games released in 2022/2023, e.g. higher CPU utilization across multiple cores (I have seen the same thing in these games; with Cyberpunk though, I think the game needs a massive patch or rework, as some settings affect performance intermittently).

    If you want to keep using DDR4 RAM, I'd wait for Zen 3 + 3D cache. But the advantage of Intel's new LGA 1700 platform is that you may be able to upgrade to the 13th generation after Alder Lake. I would wait to see how good DDR4 support is on Alder Lake - in particular, how high the RAM frequency can go before the memory controller has to run at half speed.

    DDR5 should be avoided for a couple of years in my opinion; memory controller support appears to be poor. If you can wait for Zen 4, that should be a pretty substantial upgrade - maybe any DDR5 problems will be mitigated by the planned 25-30% increase in IPC?
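    A minimal sketch of that per-core check - it assumes the third-party psutil package (pip install psutil) and is not tied to any particular game or tool from the thread:

```python
# Log overall and busiest-core CPU load once a second: a single core pinned
# near 100% can bottleneck a game even while the aggregate figure looks low.
import time
import psutil

while True:
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    overall = sum(per_core) / len(per_core)
    busiest = max(per_core)
    print(f"{time.strftime('%H:%M:%S')}  overall {overall:5.1f}%  "
          f"busiest core {busiest:5.1f}%")
```

    If the busiest core is pegged while the overall figure sits at 30-40%, that's the classic single-thread limit these open-world games run into.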
     
    Last edited: 20 Oct 2021
  12. jigger

    Capodecina

    Joined: 28 May 2007

    Posts: 16,729

    From the little AMD have said about Zen 3D, it seems gaming is one of the main aspects AMD are targeting with this part.
     
  13. Defy Belief

    Mobster

    Joined: 29 Sep 2010

    Posts: 4,774

    I might be completely missing your point here, but what do these 720p results with RT on have to do with the OP running games @ 4K? Everyone knows that the lower the res, the bigger the hit the CPU takes.

    I've been running Metro Exodus on Ultra @ 4K with DLSS and RTX Ultra, with no issues at all on a 10400F.
     
  14. jigger

    Capodecina

    Joined: 28 May 2007

    Posts: 16,729

    I'd sit tight until Zen 3D is out and then consider what performance is on offer. A 5900X would be a no-brainer if it was a drop-in upgrade.
     
  15. g67575

    Mobster

    Joined: 30 Jun 2019

    Posts: 3,161

    A standard Zen 3 CPU is not a great idea right now; it's definitely worth waiting for the 3D cache models.
     
  16. jigger

    Capodecina

    Joined: 28 May 2007

    Posts: 16,729

    It is for a drop-in upgrade. Not so much if you also need a new motherboard and OS.
     
  17. CAT-THE-FIFTH

    Capodecina

    Joined: 9 Nov 2009

    Posts: 21,897

    Location: Planet Earth

    If I were in your situation OP, I would wait and see how Alder Lake and the Zen 3 3D V-Cache models pan out, and IMHO a Core i9 9900K is probably enough to hold off until Zen 4.
     
  18. g67575

    Mobster

    Joined: 30 Jun 2019

    Posts: 3,161

    .
     
  19. jigger

    Capodecina

    Joined: 28 May 2007

    Posts: 16,729

    Not sure what you mean.
     
  20. g67575

    Mobster

    Joined: 30 Jun 2019

    Posts: 3,161

    Nevermind, I see what you mean now.

    But either way, he's gonna need a new motherboard if he wants to upgrade. An LGA 1700 + DDR4 motherboard would potentially offer more upgradability (he could use an Alder Lake CPU with 6/8 large cores for now), but there's no guarantee how many large cores the 13th gen might have; I'm not sure I believe the current rumours that it would only be 8, just like Alder Lake. I'd have thought at least half of them would be large cores, to actually get decent performance (perhaps this will depend on whether further optimizations to Intel's 10nm tech are possible).

    Hopefully Intel will be helpful enough to confirm, in the coming months, whether the 13th gen will work on LGA 1700 Alder Lake boards...

    The 13th gen might mean another 10-20% boost in overall performance.

    I like the idea of Zen 3 + 3D cache, but the AM4 platform is definitely a dead end. I think this will mostly be aimed at people who already have a later AM4 board, like the B550/X570. It's a good option though, if you really need 12/16 large Zen 3 cores.

    I think the early investors in DDR5 are gonna get burned.
     
    Last edited: 20 Oct 2021
  21. jigger

    Capodecina

    Joined: 28 May 2007

    Posts: 16,729

    I'd sit tight until Alder Lake and Zen 3D are out. All Intel will have on LGA 1700 will be more of Alder Lake with incremental performance increases, possibly with some performance regression to get power consumption under control.

    LGA 1700 isn't going to see years of support like AM4 had. Between now and Nova Lake, Intel will move to a new platform. From Nova Lake onward Intel should come back into contention on the desktop, but LGA 1700 will be long gone.