Good numbers, although DLSS Performance is overkill - they should've shown 'Balanced'. Still crazy that DLSS is *required* to get over 60 fps in CP2077 now:
While there's definitely a performance gain on older cards with DLSS 3 frame generation enabled, it should be pointed out that it wasn't without its fair share of issues. The user experienced instability and frame drops, so running DLSS 3 frame generation won't get you the most optimised gaming experience at the moment, since it is designed with GeForce RTX 40 graphics cards in mind.
Guess us Ampere owners are good now
But in all seriousness, it will be interesting to see how much of a hit there is to latency. He said he is also going to test on a 3080, so it'll be interesting to see if the fps drops and crashes go away compared to the 2070 experience.
He was able to bypass a software lock by adding a config file to remove the VRAM overhead in Cyberpunk.
Think speed, latency and counts. Matching the requirements grants access. This is all I can share.
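The modder's hint ("speed, latency and counts") suggests the game gates frame generation on GPU capability figures rather than the card's name. A purely hypothetical sketch of that kind of gate is below - the field names, thresholds, and spec values are all invented for illustration; CDPR's actual check is not public.

```python
# Hypothetical capability gate - field names and thresholds are invented,
# loosely following the modder's "speed, latency and counts" hint.
REQUIRED = {"clock_mhz": 2200, "mem_latency_ns": 100, "sm_count": 76}

def meets_requirements(gpu: dict) -> bool:
    """Pass only if the card is fast enough, low-latency enough, and big enough."""
    return (gpu["clock_mhz"] >= REQUIRED["clock_mhz"]
            and gpu["mem_latency_ns"] <= REQUIRED["mem_latency_ns"]
            and gpu["sm_count"] >= REQUIRED["sm_count"])

# Illustrative (not measured) spec values for two cards:
rtx_4090 = {"clock_mhz": 2520, "mem_latency_ns": 80, "sm_count": 128}
rtx_2070 = {"clock_mhz": 1620, "mem_latency_ns": 120, "sm_count": 36}
print(meets_requirements(rtx_4090))  # True
print(meets_requirements(rtx_2070))  # False
```

Which would also explain why a config-file override works: spoof or skip the reported figures and the gate passes regardless of which GPU is installed.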
34 ms → 49 ms latency though? Eek
Yup not ideal but as Alex said, you are gaining so much more in return for that hit:
- far better visuals
- higher fps which will look/play smoother
That is using the DLSS Quality preset too, so you could quite easily bring the latency down by using Balanced or Performance mode, or even just reduce some settings.
Visuals might be better, but I can't see how more latency would play smoother. A better comparison would have been the same latency with increased visuals/frame rate imo.
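The latency hit itself is easy to model. Frame generation interpolates between two rendered frames, so it has to hold one finished frame back before presenting it, which costs roughly one render-frame of extra delay. A back-of-the-envelope sketch below - the one-frame-buffer assumption is a simplification, and the 65 fps / 34 ms inputs are illustrative numbers chosen to line up with the figures quoted above.

```python
# Back-of-envelope model of frame-generation latency (a simplification,
# not NVIDIA's published pipeline): FG interpolates between two rendered
# frames, so it must buffer one finished frame before presenting it.

def fg_tradeoff(render_fps: float, base_latency_ms: float):
    """Return (presented_fps, approx_latency_ms) with frame generation on."""
    frame_time_ms = 1000.0 / render_fps
    presented_fps = render_fps * 2                 # one generated frame per rendered frame
    latency_ms = base_latency_ms + frame_time_ms   # ~one render frame of extra delay
    return presented_fps, latency_ms

# Illustrative: 65 fps rendered, 34 ms baseline latency
fps, lat = fg_tradeoff(65, 34)
print(f"{fps:.0f} fps presented, ~{lat:.0f} ms latency")  # 130 fps presented, ~49 ms latency
```

It also shows why a higher base frame rate softens the penalty: at 100+ fps rendered, the buffered frame costs under 10 ms, which is why people aim to keep fps high with FG enabled.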
Today my focus is on the NVIDIA gimmicks: DLSS 2.3 and DLSS 3.0, together with NVIDIA Reflex and Boost, get to show what they can do. RTX is on today, without exception.
There is nothing more to say against DXR on, and when DLSS 3.0 comes into play, the whole thing is almost a no-brainer. Cyberpunk at UHD with maximum ray tracing and DLSS 3.0 Quality makes 111 FPS possible. Yes, the latency is a bit higher for this, but still far from becoming a problem in terms of input latency.
I can’t see any difference here. DLSS 2.3 and DLSS 3.0 can produce a better image than native thanks to NVIDIA’s own AA (anti-aliasing).
NVIDIA DLSS 3 “Frame Generation” Lock Reportedly Bypassed, RTX 2070 Gets Double The Frames In Cyberpunk 2077
That's what I'm currently trying to check. It did seem odd at first but this just seems to be CDPR's method to identify the video card. Also, all of the results I've shared come from a WIP version of Cyberpunk so it's likely they will change the video card checking method before the update goes live.
Sadly it looks like this workaround for CP2077 and FG may be patched.
Bets on how many will use their screen captures of the "fake frames" to say "DLSS 3/FG sucks and this is why" even though they have said "it is very hard to see these in normal gameplay and the gameplay itself is mostly good"
Would take their input latency results with a pinch of salt as they didn't mention whether vsync was on or off (unless I missed it?), which, as proven by DF, can massively increase input lag.
Defo seems you will want to keep fps above 100 with FG for the best experience anyway, which is what I try to aim for on my 3440x1440 display.