
NVIDIA 4000 Series

Looks like AMD CPUs are holding back the 4090:

[attached benchmark screenshot]

Kind of wish I'd gone with a 13600K setup now instead of the 5800X3D :(


I'm not surprised. In footage I saw yesterday, I noticed the RTX 4090 was sitting at just 50% usage at 4K Ultra with RT on. Turning on frame generation doesn't help, even though frame generation is supposed to remove CPU bottlenecks, so something else may be going on.
 
This is just whataboutism.

Stop turning this into that "whataboutism" rubbish. He clearly said he didn't want to get an Nvidia GPU because of the VRAM "issue", i.e. "dead on arrival", so why not buy the GPU with more VRAM, since performance will be better with a GPU that has more VRAM, right..... no.

You think it might get fixed in a future update?

Just enjoy the 5800x3d for two more years and then when the next gen cards come out you can upgrade the CPU around that time.

Lucky for me, I do not think it would have much of an impact, as I plan to go back to 4K at some point, so that will give my 5900X much more breathing room. Plus, I would not be too bothered if I could not hit higher fps. As long as it can hit at least 60fps I would be content :)

Hopefully! Fingers crossed it is on the game side and not the AMD side, as we'll be waiting forever if it is on AMD.....

And yeah, exactly. I'll enjoy it for now; even the current top-end CPUs are still holding back the 4090, so maybe the next CPUs will allow it to stretch its legs more.
 
I'm not surprised. In footage I saw yesterday, I noticed the RTX 4090 was sitting at just 50% usage at 4K Ultra with RT on. Turning on frame generation doesn't help, even though frame generation is supposed to remove CPU bottlenecks, so something else may be going on.

Now that you mention it, I think similar behaviour has happened in other games when it comes to AMD CPUs and frame generation. Was it Spider-Man: Miles Morales, and maybe The Witcher 3?
 
And by doing that, it sacrifices graphical settings and resolution massively :D Might as well get a Steam Deck then :p
Massively? Cloud cuckoo thinking. GoW and Horizon Forbidden West (plus many, many others) are simply stunning on a 4K TV with HDR on console. The PS5 hits 60 fps with ease (yes, using the same kind of upsampling techniques PC users get).

PC gaming is an overpriced joke at the moment. No amount of confirmation-bias posting on OCUK will change that. :D
 
Massively? Cloud cuckoo thinking. GoW and Horizon Forbidden West (plus many, many others) are simply stunning on a 4K TV with HDR on console. The PS5 hits 60 fps with ease (yes, using the same kind of upsampling techniques PC users get).

PC gaming is an overpriced joke at the moment. No amount of confirmation-bias posting on OCUK will change that. :D

And just like GoW and HZD, which arrived on PC (along with Sony's other titles), when those games arrive on PC they'll likely run at a higher resolution, with more intensive graphics and maybe even RT, all while being at a higher fps too.
 
Massively? Cloud cuckoo thinking. GoW and Horizon Forbidden West (plus many, many others) are simply stunning on a 4K TV with HDR on console. The PS5 hits 60 fps with ease (yes, using the same kind of upsampling techniques PC users get).

PC gaming is an overpriced joke at the moment. No amount of confirmation-bias posting on OCUK will change that. :D
Unoptimised games are unoptimised on consoles as well. Case in point: Forspoken drops to 720p on a PS5. So please, these comparisons are silly. A mid-range PC from 5 years ago can get similar performance to a console.
 
I'm sure the game developer or Nvidia will fix whatever issue is going on with this game on Ryzen CPUs. I can confirm that, although I was initially impressed with frame generation (FG) in Cyberpunk, it has some issues, and it looks like the same issues are occurring in The Witcher 3 too. The issues are specifically stuttering, either randomly in game or when navigating menu screens. This issue is discussed here and here. Same developer, same game engine, so maybe related. I'm unsure whether the Hogwarts issue has anything to do with FG, as I don't have the game yet. Either way, the fixes for all of them will only come via the developer (most likely) or Nvidia, as it'll either be a coding issue or perhaps a driver issue. As soon as I turn FG off, everything works as it should.
 
Stop turning this into that "whataboutism" rubbish. He clearly said he didn't want to get an Nvidia GPU because of the VRAM "issue", i.e. "dead on arrival", so why not buy the GPU with more VRAM, since performance will be better with a GPU that has more VRAM, right..... no.
We're on an nVidia thread. Rather than discussing the issues with the game on nVidia (the game runs poorly on both AMD and nVidia), you turned it into a "but AMD is no better" post. Hence my "whataboutism" remark.
 
I'm sure the game developer or Nvidia will fix whatever issue is going on with this game on Ryzen CPUs. I can confirm that, although I was initially impressed with frame generation (FG) in Cyberpunk, it has some issues, and it looks like the same issues are occurring in The Witcher 3 too. The issues are specifically stuttering, either randomly in game or when navigating menu screens. This issue is discussed here and here. Same developer, same game engine, so maybe related. I'm unsure whether the Hogwarts issue has anything to do with FG, as I don't have the game yet. Either way, the fixes for all of them will only come via the developer (most likely) or Nvidia, as it'll either be a coding issue or perhaps a driver issue. As soon as I turn FG off, everything works as it should.
I'm sure it'll be fixed by the next gen optical flow analyser in the 5000 series :p
 
We're on an nVidia thread. Rather than discussing the issues with the game on nVidia (the game runs poorly on both AMD and nVidia), you turned it into a "but AMD is no better" post. Hence my "whataboutism" remark.

Did you read the post I responded to?

Here I'll quote it for you and highlight the relevant bits.

Sigh, so now Hogwarts Legacy is setting the stage for even 12GB of VRAM not being enough, and the 4070 Ti is dead on arrival.

What GPU am I supposed to buy now then? I have a GTX 1660 and play at 1440p.

Edit: is it my imagination, or are hardware review channels trying really hard to push 4K gaming?

Hence why I linked "evidence" showing that it isn't VRAM causing the issue, which is why a 4070 Ti is beating a 7900 XTX despite only having 12GB of VRAM.....

The only one making it a "whataboutism" is you.
 
We're on an nVidia thread. Rather than discussing the issues with the game on nVidia (the game runs poorly on both AMD and nVidia), you turned it into a "but AMD is no better" post. Hence my "whataboutism" remark.
Why do you think the game runs badly on Nvidia? RT-off numbers seem perfectly fine; RT-on numbers are low, but according to HWUnboxed the game looks great, so... what am I missing? It's running better than, for example, Cyberpunk.
 
Yup. Can feel a 4090 purchase coming sooner rather than later to play all these new, rubbishly optimised titles though :cry:
Same, it will happen at some point, but it's just whether I'm willing to wait for an offer nearer Black Friday or just cave and buy an FE. In all honesty, I'm in no rush right now lol.

Or skip this whole gen and wait for the 5000 series. Can I hold out though? It would mean owning a 3090 for closer to 4 years!
 
Same, it will happen at some point, but it's just whether I'm willing to wait for an offer nearer Black Friday or just cave and buy an FE. In all honesty, I'm in no rush right now lol.

Or skip this whole gen and wait for the 5000 series. Can I hold out though? It would mean owning a 3090 for closer to 4 years!

Yeah, same here. Until now there hasn't really been anything to justify the cost; with DLSS and some settings sacrificed, the 3080 is still doing pretty well at 4K and 3440x1440. If more and more games end up performing like Deliver Us Mars, Hogwarts and Forspoken, then there isn't much choice but to buy something that can power its way through the ****** optimisation. That, and having FG is a nice pro when so many games appear to have rubbish CPU utilisation too :(




 