See, the thing about reviews is, you have to be knowledgeable enough to review the reviewers. The problem is, people just see a chart on the web, copy-paste it, and don't understand why it's wrong in reality.

I'm a bit confused here. You shouldn't need to upgrade from a 5600 if you're playing at 1440p. If your GPU utilisation is at 90% or higher, a CPU change will make very little difference other than emptying your wallet.
All these charts that people are sharing are at 720p and 1080p, which is a completely different ball game to 1440p.
If the problem is during streaming, then use your NVENC encoder on the GPU.
EDIT: See the chart below from Jarrod's Tech. Granted, this is with an RTX 3090 Ti, but it shows that at 1440p and above there is next to nothing in it between even a 5600X and a 7600X, for example:
There are some games where even at 1440p a CPU upgrade will show a difference, but CP2077 isn't one of them.
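If you want to sanity-check this on your own rig instead of arguing over charts, the rule of thumb above is easy to script. Here's a minimal sketch, assuming a per-frame CSV log with hypothetical gpu_util and frame_time_ms columns; adjust the names to whatever your capture tool (e.g. CapFrameX or PresentMon) actually exports:

```python
# Minimal sketch of the bottleneck rule of thumb above: if the GPU is
# pegged (~90%+ utilisation), a faster CPU won't buy you much.
# Assumes a per-frame CSV log with "gpu_util" (percent) and
# "frame_time_ms" columns -- hypothetical names, adjust to your tool.
import csv
import statistics

def diagnose(log_path: str, gpu_bound_threshold: float = 90.0) -> str:
    gpu_utils, frame_times = [], []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            gpu_utils.append(float(row["gpu_util"]))
            frame_times.append(float(row["frame_time_ms"]))

    avg_util = statistics.mean(gpu_utils)
    avg_fps = 1000.0 / statistics.mean(frame_times)
    verdict = ("GPU-bound: a CPU upgrade will make very little difference"
               if avg_util >= gpu_bound_threshold
               else "CPU is the limit: lowering resolution/DLSS won't help")
    return f"avg GPU util {avg_util:.0f}%, avg {avg_fps:.0f} fps -> {verdict}"

print(diagnose("cyberpunk_1440p.csv"))
```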
Well, one way to test what your issue is: crank DLSS to Ultra Performance and see if you still have the dips. If you do, it's your CPU. Here is a 12900K with no GPU bottleneck.

It's an old pic now, so I would have to guess it's the RT Ultra preset with DLSS on. It was also while driving around Japantown, which causes FPS to fluctuate a bit; it's not always below 60. Go for a run around the cherry blossom market and see if your 5800X holds up there, as it's pretty demanding in motion. There's a good chance the 3080 is GPU-limited anyway at 1440p DLSS Quality, though.
These runs are definitely with RT off. RT makes it way harder for the CPU ^^
OK, I did as you suggested, and regardless of DLSS setting my FPS while stationary is identical at 69-70. So it seems it's bottlenecking.
Cyberpunk 12900K 4090 1440p RT Ultra (youtu.be)
Yeah, it's your CPU then. You get around the same FPS I used to get with my 11600K; it usually got around 90 to 100, but in heavy areas it dropped to the 60s. A 5700X would help, but not by a lot. Honestly though, 60-70 in heavy scenes is fine. No need to spend any money, just enjoy what you have.
My Gigabyte board limits the PBO boost clock override to only +200MHz. I like MSI, which can do up to +350MHz. So it isn't like a 5600X, as the extra boost clock of the 5600X won't be caught. Unless you're talking about a stock 5600X.

You removed the power limit of the 5600? It will basically turn into a 5600X if you do. That should be similar to a 5700X.
PBO2 for AMD Ryzen - Free Performance! (www.youtube.com)
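For anyone puzzling over those offset numbers, here's the arithmetic in a quick sketch; the stock boost clocks are AMD's published specs, and the +200/+350MHz figures are the board caps being discussed above:

```python
# Boost-clock arithmetic behind the PBO discussion above.
# Stock single-core boost clocks per AMD's specs (GHz).
stock_boost = {"5600": 4.4, "5600X": 4.6}

gigabyte_cap_mhz = 200  # PBO max-boost override cap mentioned for the Gigabyte board
msi_cap_mhz = 350       # and the one mentioned for MSI boards

for cpu, ghz in stock_boost.items():
    print(f"{cpu}: stock {ghz:.1f} GHz, "
          f"+{gigabyte_cap_mhz} MHz -> {ghz + gigabyte_cap_mhz / 1000:.1f} GHz, "
          f"+{msi_cap_mhz} MHz -> {ghz + msi_cap_mhz / 1000:.2f} GHz")

# A 5600 capped at +200 MHz tops out at 4.6 GHz, which is where a stock
# 5600X already starts -- hence the 5600X's extra headroom "won't be caught".
```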
Mine's the non-X version, so overclocking is very limited. I might just make do until I move on to AM5 further down the line. Unless, of course, I see a great deal on a 5800X3D.
I'll stick with it unless I see a great deal on the X3D; otherwise I'll wait to move to AM5.
So I've also lowered the resolution and the FPS remained the same going from 1440p to 1080p, with GPU usage well below 98% in both cases. But once I switch to 4K, the usage is at 98%.
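That's the textbook resolution-scaling test. A quick sketch of the same logic, with hypothetical placeholder numbers rather than OP's exact captures:

```python
# Sketch of the resolution-scaling test: drop the render resolution and
# see whether FPS moves. Flat FPS => CPU-bound; scaling FPS => GPU-bound.
# The numbers below are hypothetical placeholders, not OP's captures.
results = {  # resolution -> (avg_fps, avg_gpu_util_percent)
    "3840x2160": (48, 98),
    "2560x1440": (70, 85),
    "1920x1080": (70, 62),
}

fps_1080 = results["1920x1080"][0]
fps_1440 = results["2560x1440"][0]

# 1080p has ~44% fewer pixels than 1440p; if FPS stays within ~5%
# anyway, the GPU isn't the limiting factor at these settings.
if abs(fps_1440 - fps_1080) / fps_1440 < 0.05:
    print("1440p -> 1080p barely changes FPS: CPU-bound at these settings")
else:
    print("FPS scales with resolution: GPU-bound")
```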
Yeah, maybe it's me treating Cyberpunk like the OG Crysis, the be-all and end-all of having a well-performing PC. I have plenty of other games that are pretty much maxed out and run great. For example, Spider-Man Remastered is between 62-90 at native res, with crowd/traffic on low and object range at 6.

I don't want to run the risk of turning this thread into an argument about graphs and specs, but it looks as though OP has made the right choice to stick with what he has. Try using PBO to your advantage, and maybe consider upgrading later on down the line if the performance really bothers you. It would also be a little mental to spend £300 on a CPU to get slightly more performance in one game.
When you say overclocking is limited, what do you mean?
Yeah, whenever I wanna know if a CPU can cut it for gaming, Cyberpunk RT is my go-to. But still, if you actually get above 60 in the heaviest scenes in the game, it's fine. You probably hit 90+ in the rest of the game, right?
Factor in widespread support for frame generation and it'll be super.
Thx everyone for your valued insight and respected opinions
Yeah, generally 70-80 with DLSS Quality and GPU usage around 80-90% most of the time. So if the GPU has headroom, then a faster CPU would allow more FPS, especially for the lows!