Essentially your GPU is lying to you.
100 FPS at 50ms latency is not at all a good experience.
Here is DLSS 2 in Cyberpunk: the game is doing 62 fps and latency is 58 ms.
Seems odd to me - there definitely wasn't anything like 50ms latency in CP2077 with DLSS on my setup.
I had one of the early 4K monitors, which had around 50-60 ms of latency (possibly even a little higher than that) between the start of frame rendering and the pixel fully changing on the display, and the experience for gaming was terrible.
It's probably measuring total system latency from mouse click to pixel completion, not just the screen's latency. That's how Nvidia has been measuring latency in all of its recent latency marketing material.
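For anyone wondering what that click-to-pixel number is actually made of, here's a rough back-of-the-envelope sketch; the stage names and millisecond figures are just illustrative assumptions, not Nvidia's published breakdown:

```python
# Rough sketch of end-to-end "system latency" (mouse click to pixel change).
# The stage names and example millisecond values are assumptions for
# illustration only, not measured or Nvidia-published figures.
latency_ms = {
    "peripheral (click -> USB poll)": 1.0,
    "game simulation + render submit": 20.0,
    "render queue + GPU render time": 16.0,
    "scanout + display response": 12.0,
}

total = sum(latency_ms.values())
for stage, ms in latency_ms.items():
    print(f"{stage:34s} {ms:5.1f} ms")
print(f"{'total (click-to-pixel)':34s} {total:5.1f} ms")
```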
It calculates based on the two previous frames to predict a new frame, not in between two existing frames, according to their description here. So if you move your mouse 1 pixel, then 1 pixel more and stop, Nvidia will move your mouse 1 pixel again before realising it is in the wrong place.
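If it really does predict from the two previous frames (and plain linear extrapolation is only my assumption here, not Nvidia's documented method), the overshoot argument is easy to show with a toy example:

```python
# Toy illustration of the overshoot argument, assuming the generated frame
# is linearly extrapolated from the two previous frames (an assumption on
# my part, not Nvidia's documented algorithm).
def extrapolate(prev2: float, prev1: float) -> float:
    """Predict the next position by continuing the last observed motion."""
    return prev1 + (prev1 - prev2)

# Mouse moves 1 pixel, then 1 more pixel, then stops at x = 2.
frames = [0.0, 1.0, 2.0, 2.0]

for i in range(2, len(frames)):
    predicted = extrapolate(frames[i - 2], frames[i - 1])
    print(f"real x = {frames[i]:.0f}, extrapolated x = {predicted:.0f}")
# The last prediction lands on 3 even though the cursor stopped at 2.
```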
What does the Nvidia perf overlay show render latency at for you? Even then it seems rather high, though in line with their video; my setup is around 9 ms for system plus whatever screen latency - usually sub-16 ms total when using a high performance gaming monitor.
At around 62 fps I see somewhere between 22 and 30ms in CP2077
If total system latency from click to pixel is what they are using for measuring DLSS 3, then it is a flawed method of measuring latency in this instance.
It will add latency, hence why it's tied directly to Nvidia Reflex to try and reduce the added latency.

People will need to understand that they will need a mouse that is also Reflex compatible for all this to work hand in hand. All this is adding up to an already expensive affair: crazy new GPU price, most likely a new PSU, and now a new gaming mouse too.
People need time to test before any formal conclusions can be made.
Why would anyone use DLSS on an esport? They usually select the lowest graphics settings and crack on.

To be fair, I think people will notice higher latency more than any difference on zoomed screenshots, and if latency is any higher than just not using DLSS 3 then you can imagine no one will use it for latency-sensitive games like esports.
To answer your question: yes, DLSS 3 increases latency, because it's adding in fake frames between real frames, and each fake frame is a frame that doesn't accept player input, therefore increasing the latency between the real frames that take input.
Suggest you go back and read that again, as that's not how it works. It interpolates between the current and previous frame to insert a new frame in between the two. Therefore the current frame has to be delayed so the newly rendered intermediate frame can be drawn on screen first. This is why it operates in conjunction with Reflex to minimise that latency hit.
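Here's a toy timeline of that, assuming a 50 fps source rate and that the real frame gets pushed back by roughly half a frame time (both assumptions on my part; the actual pacing logic isn't public):

```python
# Toy timeline showing why interpolating between the previous and current
# frame delays the current frame. The half-frame pacing is my own assumption
# for illustration; DLSS 3's actual frame pacing isn't public.
real_frame_ms = 20.0  # a real frame finishes every 20 ms (50 fps source rate)

for n in range(1, 4):
    ready = n * real_frame_ms                   # real frame n finishes rendering
    shown_without_fg = ready                    # normally shown as soon as it's ready
    shown_generated = ready                     # the (n-1 -> n) frame needs frame n first
    shown_with_fg = ready + real_frame_ms / 2   # real frame n follows ~half a frame later
    print(f"frame {n}: ready {ready:4.0f} ms | "
          f"no FG: on screen {shown_without_fg:4.0f} ms | "
          f"FG: generated frame {shown_generated:4.0f} ms, real frame {shown_with_fg:4.0f} ms")
```

So the displayed frame rate goes up, but the newest real frame reaches the screen a bit later than it would have without frame generation, which is the hit Reflex is there to claw back.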
My take on the OP: no, it won't add latency, but you also don't gain any improvement in latency from increased framerates (many people like improved framerates precisely because of the effect of reduced latency, but that won't be the case with these interpolated frames).
Are you sure game engines only poll player inputs at the same rate as the frame rate?

Without frame generation, the frame rate is 50 Hz: a frame every 20 ms and input polled every 20 ms. With frame generation, the frame rate is 100 Hz: a frame every 10 ms, but input still polled every 20 ms.
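Putting that example into rough numbers (the 50 fps base rate is just the assumed figure from above):

```python
# Rough arithmetic for the example above: frame generation doubles the
# displayed frame rate but not the rate at which new input reaches the
# screen. All numbers are the assumed example values, nothing measured.
base_fps = 50                     # real, input-sampling frames per second
real_frame_ms = 1000 / base_fps   # 20 ms between frames that carry new input

displayed_fps_with_fg = base_fps * 2                 # 100 frames hit the screen per second
displayed_frame_ms = 1000 / displayed_fps_with_fg    # one every 10 ms

print(f"frames on screen: every {displayed_frame_ms:.0f} ms ({displayed_fps_with_fg} fps)")
print(f"frames carrying new input: every {real_frame_ms:.0f} ms ({base_fps} fps)")
```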
Just seen the latter half of your post. I'm sure I heard something about games decoupling physics from frame rate, otherwise the game speeds up as the framerate increases.
It's not really about polling, but when the data is last updated to make use of said polling. Assuming you remove frames rendered ahead (i.e. Reflex), then you update the simulation each time just before you output the information - higher frame rates mean more updates per second, therefore it feels like it's polling more often. When you add in frames rendered ahead, or DLSS frame interpolation, you aren't changing the rate at which you update the simulation, despite (especially in the latter case) outputting more frames - though if it predicted accurately enough it might feel like it did!
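On the decoupling point, the usual pattern is a fixed-timestep loop along these lines - a generic sketch, not any particular engine's code - where physics and input sampling run on their own clock and the renderer just draws the latest state as often as it can:

```python
import time

# Generic fixed-timestep loop sketch (not any particular engine's code):
# the simulation/physics advances on a fixed clock and input is sampled once
# per simulation step, while rendering just draws the latest state as often
# as it can. Extra displayed frames don't add simulation or input updates.
SIM_DT = 1 / 60  # simulation/physics step: fixed 60 Hz regardless of fps


def poll_input():
    """Placeholder for reading mouse/keyboard state."""
    return {}


def simulate(state, inputs, dt):
    """Placeholder for advancing physics/game state by a fixed dt."""
    return state


def render(state):
    """Placeholder for drawing the most recently simulated state."""
    pass


def game_loop(run_seconds=0.1):
    state, accumulator = {}, 0.0
    previous = time.perf_counter()
    deadline = previous + run_seconds
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= SIM_DT:       # catch up in fixed-size steps
            state = simulate(state, poll_input(), SIM_DT)
            accumulator -= SIM_DT
        render(state)                      # happens as often as the GPU allows


game_loop()
```

In a loop like that, generated frames would only ever show up on the render side, so they change how often something hits the screen but not how often the simulation reads your mouse.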
They could add some transparent frames to fool the fps counters.

Nvidia adding fake frames in because they know they won't be able to keep up with the raw power of the 7900XT - couldn't make it up. They might as well just make their own fps counting software that reports double the frame rate.