I'll try and get a comparative screenshot later on. Will be interesting to see if there is a difference. Interesting reading the latest replies about Nvidia CPU overhead.

> Maybe you're playing a different game? Here's proof of my experience with a 7600 and a 4090 at 4K, max details with RT enabled, from a few months ago: horrible frame drops and lag spikes in the majority of built-up areas. Meanwhile, smooth as silk on my 7700, 7950X3D and 13900K chips.
I'll try and get a comparative screenshot later on. Will be interesting to see if there is a difference. Interesting reading the latest replies about NVIDIA cpu overhead.
Ah, I can't do that then.

> Needs to be 4K with RT enabled, as this drives CPU usage in this game.
It's not about driver overhead; everyone knows Nvidia has been offloading work onto the CPU for 10+ years. I was simply saying that enabling RT can have a large CPU overhead, and on a 6-core it seems to be too much. It's probably the same on AMD GPUs, even with a bit less general driver overhead. Currently, RT has a big performance hit on both GPU and CPU, and that needs to be factored in.

> No, the Nvidia driver uses a lot more CPU cycles than, for example, AMD's (and possibly Intel's) just to exist. So wherever there is a CPU bottleneck, the Nvidia GPU reaches it much sooner than an AMD GPU, to such an extent that a much slower AMD GPU can end up much faster than a normally much faster Nvidia GPU. We are talking 30-40%, just because the Nvidia driver chews up that many CPU cycles.
The GPU is running at 86%; I bet without that driver overhead it would be fine.

> It's not about driver overhead; everyone knows Nvidia has been offloading work onto the CPU for 10+ years. I was simply saying that enabling RT can have a large CPU overhead, and on a 6-core it seems to be too much. It's probably the same on AMD GPUs, even with a bit less general driver overhead. Currently, RT has a big performance hit on both GPU and CPU, and that needs to be factored in.
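A rough way to read that 86% number: with an uncapped frame rate, a GPU that isn't pegged near 100% is sitting idle waiting on the CPU (game thread plus driver). A minimal sketch of that heuristic, with made-up utilization samples rather than real captures:

```python
# Rough heuristic, not a profiler: if average GPU utilization stays well
# below ~95% while the frame rate is uncapped, the GPU is starved by the
# CPU side (game logic + driver). Sample values below are illustrative.

def bottleneck(gpu_util_samples, threshold=0.95):
    """Classify a capture as GPU-bound or CPU-bound from GPU utilization (0..1)."""
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return "GPU-bound" if avg >= threshold else "CPU-bound (game + driver)"

print(bottleneck([0.99, 0.98, 1.00]))  # fully loaded GPU -> GPU-bound
print(bottleneck([0.86, 0.84, 0.88]))  # the 86% case discussed above -> CPU-bound
```

Caveat: frame caps, V-sync, and per-core bottlenecks can all produce low GPU utilization too, so treat this as a first hint, not proof.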
This is not about Nvidia or driver overhead; we all know Nvidia are a horrifying company with **** driver overhead. It's simply that, currently, RT hammers both the GPU and the CPU. I don't know why you cannot understand that. I think most AMD users don't bother with RT, as it's not worth the performance hit, so they don't notice or go on about the CPU hit.

> The GPU is running at 86%; I bet without that driver overhead it would be fine.
I'm not a fan of advocating people spend more on supporting hardware because the primary hardware vendor wants higher margins.
Nvidia driver overhead is a thing, and it's unnecessary: it would go away if they spent more money on developing a better solution and integrating it into better hardware, like AMD did. But we don't like to criticize Nvidia for anything. You see this all over the Internet, including from tech journos: Nvidia's overpriced GPUs are AMD's fault, to give you one example.
And we wonder why they keep peeing on us from a great height. They think we are stupid, and we are.
This is not a problem with it not being an expensive enough CPU, this is a problem with Nvidia stealing a lot of its cycles to run an architecture designed on the cheap.
This is not about Nvidia or driver overhead; we all know Nvidia are a horrifying company with **** driver overhead. It's simply that, currently, RT hammers both the GPU and the CPU. I don't know why you cannot understand that. I think most AMD users don't bother with RT, as it's not worth the performance hit, so they don't notice or go on about the CPU hit.
You can use FSR if needed; just make sure RT is enabled, at say Ultra. It doesn't matter if you run FSR Ultra Performance either, it'll still tax the CPU. Actually, the more FPS rendered, the harder the CPU has to work.

> Ah, I can't do that then.
OK, thanks, I'll give it a whirl. Just downloading a 60GB update for it. 70% done....

> You can use FSR if needed; just make sure RT is enabled, at say Ultra. It doesn't matter if you run FSR Ultra Performance either, it'll still tax the CPU. Actually, the more FPS rendered, the harder the CPU has to work.
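The point about FSR still taxing the CPU can be sketched with a toy frame-time model: each frame costs some CPU time (game logic plus driver) and some GPU time, and the frame rate is capped by whichever is larger. The millisecond figures below are illustrative, not measurements from any real system:

```python
# Toy model: frame rate is limited by the slower of the CPU and GPU
# per-frame costs. Upscaling (FSR) only shrinks the GPU cost, so once
# the CPU cost dominates, extra resolution drops buy nothing -- and
# every extra frame rendered still costs the full CPU budget.

def fps(cpu_ms, gpu_ms):
    """Frames per second given per-frame CPU and GPU costs in milliseconds."""
    return 1000.0 / max(cpu_ms, gpu_ms)

native = fps(cpu_ms=8.0, gpu_ms=16.0)   # GPU-bound at native res: 62.5 FPS
upscaled = fps(cpu_ms=8.0, gpu_ms=6.0)  # FSR cuts GPU time; now CPU-bound: 125 FPS
print(native, upscaled)
# Past this point, lowering render resolution further changes nothing:
# the CPU's 8 ms per frame is the hard ceiling (125 FPS here).
```

This is also why enabling RT can hurt a 6-core disproportionately: it raises the per-frame CPU cost (BVH updates etc.) as well as the GPU cost, lowering that ceiling.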
I'll let you in on a little secret I discovered when researching my own GPU.
Worst case scenario.
Reality outside of Cyberpunk.
Gainward GeForce RTX 4060Ti Panther 16GB GDDR6 Graphics Card
www.overclockers.co.uk
Yeah, this factored heavily into my decision to buy a 7900 GRE. Outside of a few path-traced RT games (Alan Wake 2 being one that I refuse to buy anyway, as it's a single playthrough for me and they didn't create physical copies for PS5 that I could sell on), RT is really not that bad on RDNA 3, and it will happily last me a few years.
To be honest I picked it over the 4070 Ti Super as, like you say, even that is pretty **** for RT; only the 4080 and 4090 make it worth it.

> You could make a straw man argument for as low down as the 4070 and point at Cyberpunk: "Look, 17% faster than the 7900 GRE" when it's 32 vs 37 FPS at 1440p. Overall, in a selection of RT games which includes Cyberpunk, the 4070 is only 8% better in RT.
You have to step it up to the 4070 Ti to make an Nvidia vs AMD RT argument, starting price £770, and even then the 7900 XT, which is only 17% behind it in RT overall, is £100 cheaper!
The RT argument is mostly nonsense. It's only true if you're paying really big money. AMD have a perception of being bad at RT; it's simply not true, and it comes from clickbait tech journos laboring over ridiculous Cyberpunk slides with frame rates that are barely into double digits for both AMD and Nvidia. This is also why people by and large don't give a #### about ray tracing: they see that #### everywhere and think it's unattainable anyway, something only cash-rich whales can enjoy.
To be honest I picked it over the 4070 Ti Super as, like you say, even that is pretty **** for RT; only the 4080 and 4090 make it worth it.
Is that at Tom's Diner? It's where all the cool kids hang out to maximise CPU usage. Stooeh can provide a save game file if needed.

> OK, so my CPU was at 100% regardless of whether I used RT on Ultra or no RT (which is how I usually play, with all settings on high/ultra etc.). I didn't notice any problems when playing; all appeared smooth. Micro stutter occasionally flashed up as shown, at 1-2%, but it went back to 0% just as quickly. I didn't notice any problems in game when this occurred. What should I see if micro stutter was causing problems?
CPU Util 100%. CPU Temp 48°C. Task Manager doesn't quite match yours; the CPU graphs aren't all maxed out even though it says 100%.
Like I say, gaming on the 7600 in Cyberpunk seems fine for me. But I don't use RT, 4K, or an Nvidia GPU, so perhaps this combo is the straw that broke the camel's back.
Anyway, back on topic: looking forward to swapping out the 7600 for the new Zen 5 chips when they arrive.