5700X enough for the Nvidia 4070 Ti?

Associate · Joined 30 Jan 2004 · 800 posts · Milton Keynes, Bucks
As per the title, it seems my 5600 is holding it back, especially with RT enabled. I have tested with Spider-Man and Cyberpunk using RivaTuner, and my GPU usage rarely hits 98% or more at 1440p.
All current drivers installed, no disabled cores, on a B550 motherboard with 16GB of 3200MHz RAM.
TIA
 

No, it's not going to make any difference.

The best you can do on the B550 is the 5800X3D.

HLCUlVQ.jpg

FqQFG90.jpg
 
Have you removed the power limit of the 5600? If you do, it basically turns into a 5600X, which should be similar to a 5700X.

 
Definitely. I use DLSS Quality but the drops are into the low 50s.
I thought if I upgraded the CPU this would be higher?
I think maybe if you get a 5800X3D you'll see some worthwhile improvements. Just driving around the city I still see occasional drops to the low 70s due to CPU, but there are likely even worse spots in the game. Where are you hitting the low 50s?
 
You would see an improvement but I don't think it'll be worth the extra money.

98% usage for your GPU is essentially 100%, so don't stress (you'd probably still see 98% usage even with a much faster CPU).

As you move to a higher resolution, 1440p or 4K, you put more pressure on the GPU rather than the CPU, but the 4070 Ti is really quick, so I would agree with the others that the 5800X3D would be a better option. Intel's options are better, but that means spending money on a new CPU and a new motherboard, and the 5800X3D competes with the 12900K and 13900K.

I do prefer 8 cores to 6 cores for modern use cases (6 cores works, but those two extra cores can be useful for some games/apps), and the extra cache on the X3D does an amazing job of catching up to the high-end CPUs.
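The rule of thumb above ("98% is essentially 100%") can be checked with a quick script instead of eyeballing an overlay. A rough sketch in Python, assuming an NVIDIA card with `nvidia-smi` on the PATH; the 95% cut-off is an illustrative guess, not an official threshold:

```python
import statistics
import subprocess
import time

def gpu_busy_samples(count=30, interval_s=1.0):
    """Poll nvidia-smi for GPU utilisation percentages while the game runs."""
    samples = []
    for _ in range(count):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        samples.append(int(out.strip().splitlines()[0]))
        time.sleep(interval_s)
    return samples

def likely_cpu_bound(samples, threshold=95):
    """If average GPU utilisation sits well below ~95-98%, something else
    (usually the CPU) is the limit; 98%+ is effectively pegged."""
    return statistics.mean(samples) < threshold
```

Run it during a CPU-heavy section (e.g. dense NPC areas) rather than a quiet one, since utilisation swings scene by scene.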
 
The 5800X3D is the only AM4 upgrade you should really consider. Otherwise it's Intel (new platform costs for mobo and RAM) or new AM5 (again, same platform costs). Some deals websites have decent prices on the 5800X3D, and you can recoup some cash by selling the current CPU to cover a chunk of the outlay.
 
I'm seeing it around the Japan market in particular. Additionally, if I stream it gets even worse. Although that's to be expected, so I thought if I add more cores there's less chance of a bottleneck?
Maybe the 5800X3D is the option.
 
Well, if you stream you'd need a much stronger CPU, because it's encoding the video in real time.

You may want to look into the Intel Arc A380, which can do AV1 encoding. It's not expensive for what it delivers.
 
No, the upgrade won't do anything for you. And in Cyberpunk in particular AMD does poorly compared to Intel, with only the 5800X3D putting up a decent showing. If you like open-world games with good use of ray tracing then I'm afraid Intel is simply better. I'd wait and see how the 7800X3D performs before deciding on an upgrade.

d3L7Msv.jpg
3k2GhMb.jpg
 
Drops to the 60s running around the cherry blossom market with the 5800X3D for me on RT ultra (DLSS on). Worth mentioning that the upcoming patch will be adding frame generation to the game so should be a big boost in FPS for you there since it works around CPU limitations. Might have some nasty input lag if Portal RTX is anything to go by though :p
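To put rough numbers on the frame-generation trade-off mentioned above: the presented frame rate roughly doubles, but input is still sampled at the rendered rate, and holding a frame back for interpolation costs about one rendered frame of latency. This is a toy model with assumed behaviour, not measured DLSS 3 figures:

```python
def frame_gen_estimate(rendered_fps):
    """Toy model: presented FPS doubles, and one rendered frame is held back
    for interpolation, adding roughly that frame's duration as latency.
    Real pipelines (Reflex, render queue depth) will shift these numbers."""
    rendered_frame_ms = 1000.0 / rendered_fps
    presented_fps = 2 * rendered_fps
    added_latency_ms = rendered_frame_ms
    return presented_fps, added_latency_ms

# A CPU-limited 50 FPS scene would present at ~100 FPS,
# at the cost of roughly 20 ms of extra input latency.
```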
 
Blimey, if you're dropping too around the cherry blossom market with the X3D, there's little point in spending over £300 on one.
Maybe I'll go Intel at some point.
 
I'm a bit confused here. You should not need to upgrade from a 5600 if you're playing at 1440p. If your GPU utilisation is at 90% or higher then a CPU change will make very little difference other than emptying your wallet.

All these charts that people are sharing are at 720p and 1080p, which is a completely different ball game to 1440p.

If the problem is during streaming, then use your NVENC encoder on the GPU.

EDIT. See below chart from Jarrod's Tech. Granted this is with an RTX 3090 Ti, but it shows that at 1440p and above there is next to nothing in it between even a 5600x and a 7600x, for example:

U0WuQDJ.jpg

There are some games where even at 1440p a CPU upgrade will show a difference, but CP2077 isn't one of them.
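The point about resolution can be sketched with a toy frame-time model: each frame takes roughly the longer of the CPU's time and the GPU's time, and only the GPU term grows with pixel count. The millisecond figures below are hypothetical, not benchmarks:

```python
def estimated_fps(cpu_ms, gpu_ms_at_1080p, pixel_scale):
    """Toy bottleneck model: frame time ~= max(CPU time, GPU time),
    with only the GPU term scaling with resolution."""
    gpu_ms = gpu_ms_at_1080p * pixel_scale
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical card needing 8 ms/frame at 1080p, with a 10 ms vs 6 ms CPU:
# at 1080p the CPU is the limit (100 vs 125 FPS between the two CPUs),
# but at 4K the GPU needs 32 ms, so both CPUs land at ~31 FPS.
for cpu_ms in (10.0, 6.0):          # slower vs faster CPU
    for scale in (1.0, 1.78, 4.0):  # 1080p, 1440p, 4K pixel counts
        print(cpu_ms, scale, round(estimated_fps(cpu_ms, 8.0, scale), 1))
```

This is why the 720p/1080p charts show big CPU gaps that mostly vanish at 1440p and 4K.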
 
Depends where they tested it, and as mentioned it just seems to run better on Intel.

Here's a pic of my old 5800X struggling on the CPU front:
gVwOxj1.jpg
 
That seems to be without RT, because I definitely don't bottleneck with just high settings at 1440p DLSS.
 
What settings/resolution are you running? I have the exact same CPU & GPU and do not have this problem. I play at 1440p with RT on and DLSS set to Quality.

EDIT:

90-100% GPU utilisation. ~40% CPU utilisation on the 5800X.

My screenshot:

SmQFibx.jpg
 
It's an old pic now, so I'd have to guess it's the RT Ultra preset with DLSS on. It was also while driving around Japantown, which causes FPS to fluctuate a bit - it's not always below 60. Go for a run around the cherry blossom market and see if your 5800X holds up there, as it's pretty demanding in motion. There's a good chance the 3080 is GPU-limited anyway at 1440p DLSS Quality, though.
 