AMD Zen 5 rumours

Soldato
Joined
28 May 2007
Posts
18,581
I think one of the main issues is having reduced power available to the CPU. The extra 40 watts would make more of a difference, more of the time. At least for Nvidia.
 
Soldato
Joined
28 May 2007
Posts
18,581
Lots of hints that Nvidia offload quite a few tasks to the CPU that AMD do in GPU hardware. Physics calculations used to be a big one. It makes sense in some respects, as Nvidia can save money on silicon and development. The downside is that the cost is either passed on to the user or GPU performance tanks a bit.

I’d love a peek under the Nvidia driver hood to see what’s what.
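For what it's worth, here's a rough, hypothetical way to at least watch how much CPU time the threads inside a game's process are burning (the driver's worker threads live in there too). psutil can't tell you which threads belong to the driver, and the process name is just an example, so treat this as a blunt instrument rather than a proper profile:

```python
# Rough, hypothetical sketch: sample per-thread CPU time inside a running
# game process with psutil. Nvidia's driver worker threads run inside the
# game's process, so unexpectedly busy threads are a hint, nothing more.
# The process name is an example, not a given.
import time
import psutil

def find_process(name="Cyberpunk2077.exe"):
    for p in psutil.process_iter(["name"]):
        if p.info["name"] == name:
            return p
    raise RuntimeError(f"{name} not running")

def sample_thread_cpu(proc, interval=5.0):
    """Per-thread CPU seconds accumulated over `interval`."""
    before = {t.id: t.user_time + t.system_time for t in proc.threads()}
    time.sleep(interval)
    after = {t.id: t.user_time + t.system_time for t in proc.threads()}
    return {tid: cpu - before.get(tid, 0.0) for tid, cpu in after.items()}

if __name__ == "__main__":
    game = find_process()
    deltas = sample_thread_cpu(game)
    for tid, secs in sorted(deltas.items(), key=lambda kv: -kv[1])[:10]:
        print(f"thread {tid}: {secs:.2f} CPU-seconds in a 5 s window")
```

A real answer would need something like Windows ETW capture with GPUView/WPA, which can attribute CPU samples to driver modules.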
 
Last edited:
Soldato
Joined
12 Sep 2003
Posts
10,198
Location
Newcastle, UK
Maybe you're playing a different game? Here's proof of my experience with a 7600 and 4090 @ 4K, max details with RT enabled, from a few months ago:

[screenshot: MF7FqwI.jpeg]


Horrible frame drops and lag spikes in the majority of built-up areas. Meanwhile, smooth as silk on my 7700, 7950X3D and 13900K chips.
I'll try and get a comparative screenshot later on. Will be interesting to see if there is a difference. Interesting reading the latest replies about Nvidia CPU overhead.
 
Soldato
Joined
1 Feb 2006
Posts
3,437
No, the Nvidia driver uses a lot more CPU cycles just to exist than, for example, AMD's and possibly Intel's. Wherever there is a CPU bottleneck, the Nvidia GPU reaches it much sooner than an AMD GPU, to such an extent that a much slower AMD GPU can end up much faster than a normally much faster Nvidia GPU. We are talking 30-40%, just because the Nvidia driver chews up that many CPU cycles.
It's not about driver overhead; everyone knows Nvidia has been offloading work onto the CPU for 10+ years. I was simply saying that enabling RT can have a large CPU overhead, and on a 6-core it seems to be too much for it. It's probably the same on AMD GPUs, even with a bit less general driver overhead. Currently, RT has a big performance hit on both the GPU and the CPU, and that needs to be factored in.
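As a back-of-envelope illustration of the quoted claim (taking the 30-40% figure at face value, not as a verified number): in a fully CPU-bound game, the frame rate scales roughly with whatever CPU cycles are left after the driver takes its cut.

```python
# Back-of-envelope only: assumes a fully CPU-bound game whose frame rate
# scales linearly with available CPU cycles. The 30-40% driver-overhead
# figures are the unverified numbers quoted above; 120 FPS is made up.
cpu_bound_fps = 120  # hypothetical frame rate with zero driver overhead
for f in (0.30, 0.40):
    print(f"driver eats {f:.0%} of CPU -> ~{cpu_bound_fps * (1 - f):.0f} FPS")
```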
 
Last edited:
Suspended
Joined
17 Mar 2012
Posts
48,325
Location
ARC-L1, Stanton System
It's not about driver overhead; everyone knows Nvidia has been offloading work onto the CPU for 10+ years. I was simply saying that enabling RT can have a large CPU overhead, and on a 6-core it seems to be too much for it. It's probably the same on AMD GPUs, even with a bit less general driver overhead. Currently, RT has a big performance hit on both the GPU and the CPU, and that needs to be factored in.
The GPU is running at 86%; I bet without that driver overhead it would be fine.

I'm not a fan of advocating that people spend more on supporting hardware because the primary hardware vendor wants higher margins.

Nvidia driver overhead is a thing, and it's unnecessary: if they had spent more money on developing a better solution and integrating it into better hardware, like AMD did, it wouldn't exist. But we don't like to criticise Nvidia for anything. You see this all over the Internet, including from tech journos; Nvidia's overpriced GPUs are AMD's fault, to give you one example.
And we wonder why they keep peeing on us from a great height. They think we are stupid; we are.

This is not a problem of the CPU not being expensive enough; this is a problem of Nvidia stealing a lot of its cycles to run an architecture designed on the cheap.
 
Last edited:
Soldato
Joined
1 Feb 2006
Posts
3,437
The GPU is running at 86%; I bet without that driver overhead it would be fine.

I'm not a fan of advocating that people spend more on supporting hardware because the primary hardware vendor wants higher margins.

Nvidia driver overhead is a thing, and it's unnecessary: if they had spent more money on developing a better solution and integrating it into better hardware, like AMD did, it wouldn't exist. But we don't like to criticise Nvidia for anything. You see this all over the Internet, including from tech journos; Nvidia's overpriced GPUs are AMD's fault, to give you one example.
And we wonder why they keep peeing on us from a great height. They think we are stupid; we are.

This is not a problem of the CPU not being expensive enough; this is a problem of Nvidia stealing a lot of its cycles to run an architecture designed on the cheap.
This is not about Nvidia or driver overhead; we all know Nvidia are a horrifying company with **** driver overhead. It's simply that, currently, RT hammers both the GPU and the CPU. I don't know why you cannot understand that. I think most AMD users don't bother with RT, as it's not worth the performance hit, so they don't notice or go on about the CPU hit.
 
Suspended
Joined
17 Mar 2012
Posts
48,325
Location
ARC-L1, Stanton System
This is not about Nvidia or driver overhead; we all know Nvidia are a horrifying company with **** driver overhead. It's simply that, currently, RT hammers both the GPU and the CPU. I don't know why you cannot understand that. I think most AMD users don't bother with RT, as it's not worth the performance hit, so they don't notice or go on about the CPU hit.

I think if most AMD users don't bother with RT because the performance hit is too great, then exactly the same is true for Nvidia users. Most people are not spending £800 on GPUs, and with that said, show me an Nvidia GPU that's better at RT than this for £450.
 
Last edited:
Associate
Joined
31 Dec 2011
Posts
837
I'll let you in on a little secret I discovered when researching my own GPU.

Worst-case scenario:

[screenshot: ApL14w5.png]


Reality outside of Cyberpunk:

[screenshot: 1Zbp6z4.png]



Yeah, this factored heavily into my decision to buy a 7900 GRE. Outside of a few PT/RT games (Alan Wake 2 being one that I refuse to buy anyway, as it's a single playthrough for me and they didn't create physical copies for PS5 that I could sell on), RT is really not that bad on RDNA 3 and will happily last me a few years.
 
Suspended
Joined
17 Mar 2012
Posts
48,325
Location
ARC-L1, Stanton System
Yeah, this factored heavily into my decision to buy a 7900 GRE. Outside of a few PT/RT games (Alan Wake 2 being one that I refuse to buy anyway, as it's a single playthrough for me and they didn't create physical copies for PS5 that I could sell on), RT is really not that bad on RDNA 3 and will happily last me a few years.


You could make a straw-man argument for as low down as the 4070 and point at Cyberpunk: "Look, 17% faster than the 7900 GRE". That's 32 vs 37 FPS at 1440p. Overall, in a selection of RT games which includes Cyberpunk, the 4070 is only 8% better in RT.

You have to step up to the 4070 Ti to make an Nvidia vs AMD RT argument; the starting price for that is £770, and even then the 7900 XT, which is only 17% behind it in RT overall, is £100 cheaper!

The RT argument is mostly nonsense. It's only true if you're paying really big money. AMD have a perception of being bad at RT; it's simply not true, and it comes from clickbait tech journos labouring over ridiculous Cyberpunk slides with frame rates that are barely into double digits for AMD and Nvidia alike. This is also why people by and large don't give a #### about ray tracing: they see that #### everywhere and think it's unattainable anyway, something only cash-rich whales can enjoy.
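For anyone wanting to sanity-check the percentages, here's the arithmetic on the figures as quoted in this post (prices and FPS taken at face value, not independently verified):

```python
# Sanity-checking the figures quoted above (prices/FPS as quoted, not verified).
fps_4070, fps_gre = 37, 32          # Cyberpunk RT @ 1440p, as quoted
print(f"4070 over 7900 GRE: {fps_4070 / fps_gre - 1:.1%}")   # ~15.6%, the 'circa 17%'

price_4070ti = 770                  # GBP, as quoted
price_7900xt = price_4070ti - 100   # '£100 cheaper', as quoted
rt_deficit = 1 / 1.17               # '17% behind' -> ~85% of the Ti's RT perf
print(f"7900 XT at £{price_7900xt} gives ~{rt_deficit:.0%} of the 4070 Ti's RT performance")
```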
 
Last edited:
Associate
Joined
31 Dec 2011
Posts
837
You could make a straw-man argument for as low down as the 4070 and point at Cyberpunk: "Look, 17% faster than the 7900 GRE". That's 32 vs 37 FPS at 1440p. Overall, in a selection of RT games which includes Cyberpunk, the 4070 is only 8% better in RT.

You have to step up to the 4070 Ti to make an Nvidia vs AMD RT argument; the starting price for that is £770, and even then the 7900 XT, which is only 17% behind it in RT overall, is £100 cheaper!

The RT argument is mostly nonsense. It's only true if you're paying really big money. AMD have a perception of being bad at RT; it's simply not true, and it comes from clickbait tech journos labouring over ridiculous Cyberpunk slides with frame rates that are barely into double digits for AMD and Nvidia alike. This is also why people by and large don't give a #### about ray tracing: they see that #### everywhere and think it's unattainable anyway, something only cash-rich whales can enjoy.
To be honest, I picked it over the 4070 Ti Super as, like you say, even that is pretty **** for RT; only the 4080 and 4090 make it worth it.
 
Soldato
Joined
12 Sep 2003
Posts
10,198
Location
Newcastle, UK
OK, so my CPU was at 100% regardless of whether I used RT on Ultra or no RT (which is how I usually play: all settings on high/ultra etc.). I didn't notice any problems when playing; all appeared smooth. Micro stutter occasionally flashed up, as shown, at 1-2%, but it went back to 0% just as quickly. I didn't notice any problems in game when this occurred. What should I see if micro stutter were causing problems?

[screenshot: Metrics.jpg]

CPU util 100%, CPU temp 48°C. Task Manager doesn't quite match yours; the CPU graphs aren't all maxed even though it says 100%.

[screenshot: Metrics2.jpg]

Like I say, gaming on the 7600 in Cyberpunk seems fine for me. But I don't use RT, or 4K, or an Nvidia GPU, so perhaps this combo is the straw that broke the camel's back.

Anyway, back on topic: looking forward to swapping out the 7600 for the new Zen 5 when they arrive. :)
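On the micro-stutter question: what you'd look for is occasional frame times far above the median, rather than average FPS. A minimal sketch, assuming you export a frame-time log (one milliseconds-per-frame value per line, as tools like CapFrameX or FrameView can produce; the 2.5x threshold is an arbitrary choice):

```python
# Minimal sketch: flag micro stutter in a frame-time log (ms per frame).
# The 2.5x-median spike threshold is arbitrary; tune to taste.
import statistics

def stutter_report(frame_times_ms):
    med = statistics.median(frame_times_ms)
    spikes = [t for t in frame_times_ms if t > 2.5 * med]
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(frame_times_ms) // 100)             # worst 1% of frames
    one_pct_low_fps = 1000 / statistics.mean(worst[:n])
    return med, len(spikes), one_pct_low_fps

# Made-up log: mostly a steady 60 FPS (16.7 ms) with two big spikes.
times = [16.7] * 500 + [70.0, 16.7, 65.0]
med, spikes, low = stutter_report(times)
print(f"median {med:.1f} ms, {spikes} spike frames, 1% low ~= {low:.0f} FPS")
```

A healthy run has near-zero spike frames and a 1% low close to the median FPS; stutter shows up as the two drifting apart even when the average looks fine.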
 
Soldato
Joined
19 Sep 2009
Posts
2,770
Location
Riedquat system
Dave could have had other stuff going on in the background eating CPU time, like whatever is using 15 Mbit of network. Plus, they may have improved the CPU usage in later builds; it took them ages to add proper AMD SMT support! It'll be interesting to see how Zen 5 six-core CPUs handle such titles :)
 
Caporegime
Joined
12 Jul 2007
Posts
40,879
Location
United Kingdom
OK, so my CPU was at 100% regardless of whether I used RT on Ultra or no RT (which is how I usually play: all settings on high/ultra etc.). I didn't notice any problems when playing; all appeared smooth. Micro stutter occasionally flashed up, as shown, at 1-2%, but it went back to 0% just as quickly. I didn't notice any problems in game when this occurred. What should I see if micro stutter were causing problems?

[screenshot: Metrics.jpg]

CPU util 100%, CPU temp 48°C. Task Manager doesn't quite match yours; the CPU graphs aren't all maxed even though it says 100%.

[screenshot: Metrics2.jpg]

Like I say, gaming on the 7600 in Cyberpunk seems fine for me. But I don't use RT, or 4K, or an Nvidia GPU, so perhaps this combo is the straw that broke the camel's back.

Anyway, back on topic: looking forward to swapping out the 7600 for the new Zen 5 when they arrive. :)
Is that at Tom's Diner? It's where all the cool kids hang out to maximise CPU usage. Stooeh can provide a save game file if needed.
 