
5700X enough for the Nvidia 4070 Ti?

As per the title, it seems my 5600 is holding it back, especially with RT enabled. I've tested with Spider-Man and Cyberpunk using RivaTuner, and my GPU usage rarely hits 98% or more at 1440p.
All current drivers installed, no disabled cores, on a B550 motherboard with 16GB of 3200MHz RAM.
TIA
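For anyone who wants to do the same check outside RivaTuner, here's a minimal sketch. It assumes an NVIDIA card with `nvidia-smi` on the PATH; the 95% threshold and the use of a median are my own rough choices, not anything official:

```python
import statistics
import subprocess

def read_gpu_util():
    """Query current GPU utilisation (%) via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    return int(out.stdout.strip().splitlines()[0])

def looks_cpu_bound(utilisation_samples, threshold=95):
    """If the GPU rarely reaches ~95-98% while FPS dips, the CPU is the
    likely limit. Median rather than mean so one-off spikes don't mislead."""
    return statistics.median(utilisation_samples) < threshold
```

Poll `read_gpu_util()` once a second while playing, then feed the list into `looks_cpu_bound()`: utilisation sitting in the 70s-80s while frame rate drops is the classic CPU-limited signature.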
 
I think maybe if you get a 5800X3D you'll see some worthwhile improvements. Just driving around the city I still see occasional drops to the low 70s due to CPU, and there are likely even worse spots in the game. Where are you hitting the low 50s?
I'm seeing it around the Japan market in particular. It gets even worse if I stream, though that's to be expected, so I thought if I add more cores there's less chance of a bottleneck?
Maybe the 5800X3D is the option.
 
Drops to the 60s running around the cherry blossom market with the 5800X3D for me on RT ultra (DLSS on). Worth mentioning that the upcoming patch will be adding frame generation to the game so should be a big boost in FPS for you there since it works around CPU limitations. Might have some nasty input lag if Portal RTX is anything to go by though :p
Blimey, if you're dropping to the 60s around the cherry blossom market with the X3D, there's little point in spending over £300 on one.
Maybe I'll go Intel at some point.
 
I'm a bit confused here. You should not need to upgrade from a 5600 if you're playing at 1440p. If your GPU utilisation is at 90% or higher then a CPU change will make very little difference, other than emptying your wallet.

All these charts that people are sharing are at 720p and 1080p, which is a completely different ball game to 1440p.

If the problem is during streaming, then use your NVENC encoder on the GPU.

EDIT: See the chart below from Jarrod's Tech. Granted this is with an RTX 3090 Ti, but it shows that at 1440p and above there is next to nothing in it between even a 5600X and a 7600X, for example:

[Chart: Jarrod's Tech, CPU comparison at 1440p with an RTX 3090 Ti]

There are some games where even at 1440p a CPU upgrade will show a difference, but CP2077 isn't one of them.
That seems to be without RT, because I definitely don't bottleneck with just high settings at 1440p DLSS.
 
Well, one way to test what your issue is: crank DLSS to Ultra Performance and see if you still have the dips. If you do, it's your CPU. Here is a 12900K with no GPU bottleneck

Ok, I did as you suggested and regardless of DLSS setting my FPS while stationary is identical at 69-70. So it seems the CPU is bottlenecking.
 
Gigabyte boards limit the PBO boost clock override to only +200MHz; I like MSI, which can do up to +350MHz. So it isn't like a 5600X, as the extra boost clock of a 5600X won't be caught. Unless you're talking about a stock 5600X.
Mine's the non-X version, so overclocking is very limited. I might just make do until I move on to AM5 further down the line. Unless of course I see a great deal on a 5800X3D.
 
Yeah, it's your CPU then. You get around the same FPS I used to get with my 11600K: usually around 90-100, but in heavy areas it dropped to the 60s. A 5700X would help, but not by a lot. Honestly though, 60-70 in heavy scenes is fine; no need to spend any money, just enjoy what you have.
I'll stick with it, unless I see a great deal on the X3D otherwise I'll wait to move to AM5.
 
I've also lowered the resolution from 1440p to 1080p and the FPS remained the same, with GPU usage well below 98% in both; but once I switch to 4K the usage hits 98%.
Definitely need a faster CPU for the lower resolutions.
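That resolution test boils down to a simple rule of thumb. Here's a sketch of the reasoning (the 5% tolerance is an arbitrary choice of mine, and the numbers below are just the rough figures from this thread):

```python
def resolution_test(fps_high_res, fps_low_res, tolerance=0.05):
    """Drop render resolution (or push DLSS to Ultra Performance) and
    compare average FPS. If FPS barely moves, the GPU wasn't the limit
    at the higher resolution, so the CPU is the bottleneck. If FPS
    jumps, the GPU was the limit."""
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return "CPU-bound" if gain < tolerance else "GPU-bound"

# ~70 fps at both 1440p and 1080p, as reported above
print(resolution_test(70, 70))   # CPU-bound
```

It's the same logic in reverse at 4K: with the GPU pegged at 98%, dropping resolution would raise FPS, so there the GPU is the limit.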
 
I don't want to run the risk of turning this thread into an argument about graphs and specs, but it looks as though OP has made the right choice to stick with what he has. Try using PBO to your advantage and maybe consider upgrading later on down the line if the performance really bothers you. Would also be a little mental to spend £300 on a CPU to get slightly more performance in one game :)
Yeah, maybe it's me treating Cyberpunk like the OG Crysis, the be-all and end-all of having a well-performing PC. I have plenty of other games that are pretty much maxed out and run great. For example, Spider-Man Remastered is between 62-90 at native res, with crowd/traffic on low and object range at 6.
Factor in widespread support for frame generation and it'll be super.
Thx everyone for your valued insight and respected opinions ✊
 
Yeah, whenever I wanna know if a CPU can cut it for gaming, Cyberpunk RT is my go-to. But still, if you actually get above 60 in the heaviest scenes in the game, it's fine. You probably hit 90+ in the rest of the game, right?
Yeah, generally 70-80 with DLSS Quality and GPU usage around 80-90% most of the time. So if the GPU has headroom, then a faster CPU would allow more FPS, especially the lows!
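As a rough back-of-envelope (my own assumption of linear scaling, not a measured result; real games never scale perfectly): if the GPU is sitting at ~85% while delivering 75 fps, a CPU fast enough to push it to ~98% would give roughly proportionally more:

```python
def fps_headroom(current_fps, gpu_util, target_util=0.98):
    """Crude linear estimate of FPS if the CPU stopped holding the GPU
    back. Real scaling is never perfectly linear, so treat the result
    as an optimistic upper bound, not a promise."""
    return current_fps * target_util / gpu_util

# e.g. 75 fps at 85% GPU usage -> roughly 86 fps ceiling at 98%
print(round(fps_headroom(75, 0.85)))
```

So on the figures in this thread, a CPU upgrade buys maybe 10-15% at best in GPU-heavy scenes, which squares with the advice above to wait for AM5 or a good X3D deal.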
 