> Faster because the devs can't code properly it seems.

Bad coding affects both. When one is faster with bad coding...
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
> Which other games are Intel much faster?

All of them? Especially when heavy RT is involved: Cyberpunk, Spider-Man...
This is just whataboutism.
You think it might get fixed in a future update?
Just enjoy the 5800X3D for two more years, then upgrade the CPU around the time the next-gen cards come out.
Lucky for me, I don't think it would have much of an impact, as I plan to go back to 4K at some point, so that will give my 5900X much more breathing room. Plus, I wouldn't be too bothered about not hitting higher fps; as long as it can manage at least 60 fps, I'd be content.
I'm not surprised; in footage I saw yesterday, I noticed the RTX 4090 sitting at just 50% usage at 4K Ultra with RT on. Turning on frame generation doesn't help, even though frame generation is supposed to remove CPU bottlenecks, so something else may be going on.
> And by doing that, sacrifices graphical settings and res. massively. Might as well get a Steam Deck then.

Massively? Cloud cuckoo thinking. GoW and Horizon Forbidden West (plus many, many others) are simply stunning on a 4K TV with HDR on console. The PS5 hits 60 fps with ease (yes, using the same kind of upsampling techniques PC users get).

PC gaming is an overpriced joke at the moment. No amount of confirmation-bias posting on OCUK will change that.
> And just like GoW and HZD, which arrived on PC (along with Sony's other titles), when those games arrive on PC they'll likely run at a higher res, with more intensive graphics and maybe even RT, all at a higher fps too.

Are you still on a 3080 FE, Nexus?
> Faster because the devs can't code properly it seems.

They never conceived of a situation where someone would want to game with an AMD CPU... Intel inside.
> Massively? Cloud cuckoo thinking. GoW and Horizon Forbidden West (plus many, many others) are simply stunning on a 4K TV with HDR on console. The PS5 hits 60 fps with ease (yes, using the same kind of upsampling techniques PC users get).
>
> PC gaming is an overpriced joke at the moment. No amount of confirmation-bias posting on OCUK will change that.

Unoptimised games are unoptimised on consoles as well. Case in point: Forspoken drops to 720p on a PS5. So please, these comparisons are silly. A mid-range PC from five years ago can get similar performance to a console.
> Stop turning this into that rubbish of "whataboutism". He clearly said he didn't want to get an Nvidia GPU because of the VRAM "issue", i.e. "dead on arrival", so why not buy the GPU with more VRAM, since performance will be better with a GPU that has more VRAM, right..... no.

We're on an nVidia thread. Rather than discussing the issues with the game on nVidia (the game runs poorly on both AMD and nVidia), you turned it into a "but AMD is no better" post. Hence my "whataboutism" remark.
> I'm sure the game developer or Nvidia will fix whatever issue is going on with this game on Ryzen CPUs. I can confirm that, although I was initially impressed with frame generation (FG) in Cyberpunk, it has some issues, and it looks like the same issues are occurring in The Witcher 3 too. The issues, specifically, are stuttering, either randomly in game or when navigating menu screens. This issue is discussed here and here. Same developer, same game engine, so maybe related. Unsure if the Hogwarts issue is anything to do with FG or not, as I don't have the game yet. Either way, the fixes for all will only come via the developer (most likely) or Nvidia, as it'll be either a coding issue or perhaps a driver issue. As soon as I turn FG off, everything works as it should.

I'm sure it'll be fixed by the next-gen optical flow analyser in the 5000 series.
Sigh, so now Hogwarts Legacy is setting the stage for even 12GB of VRAM not being enough, and the 4070 Ti is dead on arrival.
What GPU am I supposed to buy now then? I have a GTX 1660 and play at 1440p.
Edit: is it my imagination, or are hardware review channels trying really hard to push 4K gaming?
/strokes [email protected] allcore
> We're on an nVidia thread. Rather than discussing the issues with the game on nVidia (the game runs poorly on both AMD and nVidia), you turned it into a "but AMD is no better" post. Hence my "whataboutism" remark.

Why do you think the game runs badly on Nvidia? The RT-off numbers seem perfectly fine; the RT-on numbers are low, but according to Hardware Unboxed the game looks great, so... what am I missing? It's running better than, for example, Cyberpunk.
> Yup. Can feel a 4090 purchase coming sooner rather than later to play all these new rubbish-optimised titles though.

Same, it will happen at some point, but it's just whether I'm willing to wait for an offer nearer Black Friday or just cave and buy an FE. In all honesty, I'm in no rush right now lol.
Or skip this whole gen and wait for the 5000 series... can I hold out, though? It would mean owning a 3090 for closer to four years!