Will DLSS 3 add latency?

I noticed that Nvidia actually put latency figures on a couple of its charts, one with DLSS 2 and one with DLSS 3. So this is our first look at how much latency DLSS 3 adds.

Short answer: latency is still lower than just using DLSS2 with a lower framerate.


Here is DLSS 2 in Cyberpunk: the game is doing 62 fps at 58 ms latency.

[Screenshot: Nvidia's DLSS 2 Cyberpunk 2077 latency chart]

Here is DLSS 3 turned on: the framerate jumps from 62 fps to 101 fps and latency drops from 58 ms to 55 ms.

[Screenshot: Nvidia's DLSS 3 Cyberpunk 2077 latency chart]
Based on this, I'm going to go out on a limb and suggest that in games where your framerate is already very high, you probably don't want to use DLSS 3; you wouldn't want it in Overwatch 2 when you're already at 400 fps. But in graphically demanding ray-tracing games, it looks like you can boost the framerate higher than DLSS 2 and still get lower latency at that higher framerate. The latency isn't as low as if you were getting 101 fps natively, but you're never going to get that in demanding games anyway.
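Just to lay the arithmetic out (the fps and latency figures are the ones quoted from Nvidia's charts above; the comparison itself is mine):

```python
# Numbers quoted from Nvidia's Cyberpunk 2077 charts in this post.
dlss2 = {"fps": 62, "latency_ms": 58}
dlss3 = {"fps": 101, "latency_ms": 55}

# Native 101 fps would mean roughly a 9.9 ms frame time, so DLSS 3's
# 55 ms is nowhere near "native 101 fps" latency - but it is still
# 3 ms lower than DLSS 2 running at 62 fps.
native_frame_ms = 1000 / dlss3["fps"]
print(round(native_frame_ms, 1))                   # 9.9
print(dlss2["latency_ms"] - dlss3["latency_ms"])   # 3
```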
 
Here is DLSS 2 in Cyberpunk: the game is doing 62 fps at 58 ms latency.

Seems odd to me - there definitely wasn't anything like 50ms latency in CP2077 with DLSS on my setup.

I had one of the early 4K monitors, which had around 50-60 ms latency (possibly even a little higher) between the start of frame rendering and the pixel fully changing on the display, and the gaming experience was terrible.
 
Seems odd to me - there definitely wasn't anything like 50ms latency in CP2077 with DLSS on my setup.

I had one of the early 4K monitors, which had around 50-60 ms latency (possibly even a little higher) between the start of frame rendering and the pixel fully changing on the display, and the gaming experience was terrible.


It's probably measuring total system latency from mouse click to pixel completion, not just the screen's latency. This is the way Nvidia has been measuring latency in all its recent latency marketing material.

 
It's probably measuring total system latency from mouse click to pixel completion, not just the screen's latency. This is the way Nvidia has been measuring latency in all its recent latency marketing material.

Even then it seems rather high, though in line with their video. My setup is around 9 ms for the system plus whatever screen latency, usually sub-16 ms total when using a high-performance gaming monitor.
 
It calculates based on the two previous frames to predict a new frame, not in between two existing frames, according to their description here. So if you move your mouse 1 pixel, then 1 pixel more and stop, Nvidia will move your mouse 1 pixel again before realising it is in the wrong place.

You'll need your sea legs for DLSS 3.0 :p
 
Even then it seems rather high, though in line with their video. My setup is around 9 ms for the system plus whatever screen latency, usually sub-16 ms total when using a high-performance gaming monitor.
What does the Nvidia performance overlay show for render latency on your setup?
At around 62 fps I see somewhere between 22 and 30 ms in CP2077.
 
It's probably measuring total system latency from mouse click to pixel completion, not just the screen's latency. This is the way Nvidia has been measuring latency in all its recent latency marketing material.
If that is what they are using to measure DLSS 3, then it is a flawed method of measuring latency in this instance.

When gaming, it is a closed loop: you see something on screen, you click, the input is processed, and something new pops up on screen.

The previous method has been fine, but with Nvidia now changing the rate at which game information is displayed on screen, it is no longer a good method. Hopefully reviewers will pick up on this.
 
It will add latency, hence why it's tied directly to Nvidia Reflex to try and reduce the added latency.

People need time to test before any formal conclusions can be made.
People will also need to understand that they'll need a Reflex-compatible mouse for all this to work hand in hand. It's all adding up to an already expensive affair: a crazy new GPU price, most likely a new PSU, and a new gaming mouse.


£££.
 
Reflex just reduces the render-ahead queue, doesn't it? You should be able to do the same thing in games without DLSS if latency is important (and you can afford not to render ahead), with no mouse requirements at all.

I hadn't picked up on DLSS frame interpolation working on previous frames rather than rendered-ahead frames. The latter would obviously be incompatible with Reflex, but if it's previous frames, then rendering ahead would actually add even more frames between inputs, so it makes sense that you actually want Reflex on when using DLSS frame interpolation; it might even become a requirement.

My take on the OP: no, it won't add latency, but you also don't gain any latency improvement from the increased framerate (many people like higher framerates precisely because of the reduced latency, but that won't be the case with these interpolated frames).
 
To be fair, I think people will notice higher latency more than any difference in zoomed screenshots, and if latency is any higher than just not using DLSS 3, you can imagine no one will use it in latency-sensitive games like esports.
Why would anyone use DLSS in an esport? They usually select the lowest graphics settings and crack on.
 
To answer your question: yes, DLSS 3 increases latency, because it's adding fake frames between real frames, and each fake frame is a frame that doesn't accept player input, therefore increasing the latency between the real frames that do take input.

This doesn't follow.

Without frame generation, frame rate is 50Hz, a frame every 20ms and input polled every 20ms.
With frame generation, frame rate is 100Hz, a frame every 10ms and input polled every 20ms.

The latency remains the same.
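A minimal sketch of that argument, assuming (as the post does) that input is only sampled once per rendered frame and generated frames carry no new input:

```python
def intervals(render_fps, generated_per_rendered=0):
    """Return (displayed frame interval, input sample interval) in ms.

    Assumes input is sampled once per *rendered* frame; generated
    frames add smoothness but carry no new input.
    """
    displayed_fps = render_fps * (1 + generated_per_rendered)
    frame_ms = 1000.0 / displayed_fps
    input_ms = 1000.0 / render_fps
    return frame_ms, input_ms

print(intervals(50))     # no frame generation: (20.0, 20.0)
print(intervals(50, 1))  # one generated frame per rendered frame: (10.0, 20.0)
```

Under these assumptions, frame generation halves the displayed frame interval while the input interval stays put.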

Suggest you go back and read that again, as that's not how it works. It interpolates between the current and previous frame to insert a new frame in between the two. Therefore the current frame has to be delayed so the newly generated intermediate frame can be drawn on screen first. This is why it operates in conjunction with Reflex, to minimise that latency hit.

Have a look at Nvidia's diagram:

[Nvidia diagram showing the DLSS 3 frame generation pipeline]

The generated frame comes after the rendered one, not before it.
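A back-of-envelope sketch of the delay this implies. This is my own simplification, not Nvidia's figure: assume the interpolated frame is shown at the midpoint between two rendered frames, so the later rendered frame is held back by half a rendered-frame interval:

```python
def added_delay_ms(render_fps):
    """Rough extra display delay for a rendered frame when an
    interpolated frame must be shown before it: half the rendered
    frame interval, under the midpoint assumption above."""
    rendered_interval_ms = 1000.0 / render_fps
    return rendered_interval_ms / 2

print(added_delay_ms(50))  # 10.0 ms at 50 fps
print(added_delay_ms(62))  # ~8 ms at the 62 fps from the OP's charts
```

Which is roughly the size of penalty Reflex can claw back by removing queued frames, consistent with Nvidia pairing the two features.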

My take on the OP: no, it won't add latency, but you also don't gain any latency improvement from the increased framerate (many people like higher framerates precisely because of the reduced latency, but that won't be the case with these interpolated frames).

I think this probably depends on how it is integrated with the game engine. If the engine has full control and access to what is happening with these generated frames then it can use the headroom allowed by half the frames being generated to run a physics update on every rendered frame, doubling the number of times it polls input each second. Obviously you won't see this on screen immediately, but you would get the faster response.
 
Without frame generation, frame rate is 50Hz, a frame every 20ms and input polled every 20ms.
With frame generation, frame rate is 100Hz, a frame every 10ms and input polled every 20ms.
Are you sure game engines only poll player inputs at the same rate as the frame rate?

Just seen the latter half of your post. I'm sure I heard something about games decoupling physics from frame rate, otherwise the game speeds up as the framerate increases.
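That decoupling is usually done with a fixed-timestep loop. A minimal sketch of the common pattern (integer milliseconds for clarity; the 100 Hz step rate is just an example, not any particular engine's):

```python
SIM_DT_MS = 10  # fixed simulation step: 100 Hz, regardless of fps

def simulation_steps(frame_times_ms):
    """Run the classic fixed-timestep accumulator loop over a list of
    rendered-frame durations and return how many simulation steps ran."""
    accumulator = 0
    steps = 0
    for frame_ms in frame_times_ms:
        accumulator += frame_ms
        # Step the simulation at a fixed rate, however long frames take.
        while accumulator >= SIM_DT_MS:
            steps += 1
            accumulator -= SIM_DT_MS
    return steps

# One second at 25 fps and one second at 100 fps: same simulation rate.
print(simulation_steps([40] * 25), simulation_steps([10] * 100))  # 100 100
```

Because the step count depends only on elapsed time, the game runs at the same speed whatever the framerate.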
 
Are you sure game engines only poll player inputs at the same rate as the frame rate?

Just seen the latter half of your post. I'm sure I heard something about games decoupling physics from frame rate, otherwise the game speeds up as the framerate increases.

Yeah, you're right, a lot of games do work that way, so frame rate won't affect latency at all.
 
The only way you'd get low latency is through just-in-time rendering, where the game doesn't maintain a queued buffer of frames, which also translates to low fps. I don't think any kind of increased activity in the game world can negate that.

Also, the absolute order of frame rendering shouldn't matter, whether it is the first or second frame, because when you look at it as a sequence there's always a frame buffer being queued, which would impact latency.

What's interesting is how quickly the new method can reorder/flush the buffer queue to deliver a better experience.
 
Are you sure game engines only poll player inputs at the same rate as the frame rate?

Just seen the latter half of your post. I'm sure I heard something about games decoupling physics from frame rate, otherwise the game speeds up as the framerate increases.
It's not really about polling, but about when the data was last updated to make use of that polling. Assuming you remove frames rendered ahead (i.e. Reflex), the simulation is updated just before each frame is output, so higher frame rates mean more updates per second and it feels like it's polling more often. When you add frames rendered ahead, or DLSS frame interpolation, you aren't changing the rate at which you update the simulation, despite (especially in the latter case) outputting more frames - though if it predicted accurately enough, it might feel like it did!
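A crude model of that point (my simplification, not a measured pipeline): latency is set by the simulation update interval plus the time spent behind any queued frames, and neither changes when extra frames are generated:

```python
def input_latency_ms(render_fps, queued_frames=0):
    """Approximate input-to-display latency: one simulation interval
    (input consumed once per rendered frame) plus the wait behind any
    frames queued ahead of display."""
    sim_interval_ms = 1000.0 / render_fps
    return sim_interval_ms + queued_frames * sim_interval_ms

print(input_latency_ms(50, 0))  # Reflex-style, no render-ahead: 20.0 ms
print(input_latency_ms(50, 2))  # two frames rendered ahead: 60.0 ms
```

Note that `render_fps` here is the rendered rate, not the displayed rate, which is why generated frames don't appear in the formula at all.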
 
Nvidia is adding fake frames because they know they won't be able to keep up with the raw power of the 7900XT :cry: You couldn't make it up. They might as well just make their own fps-counting software that reports double the frame rate.
They could add some transparent frames to fool the fps counters
 
Why would anyone use DLSS in an esport? They usually select the lowest graphics settings and crack on.


And in quite a few cases, DLSS will have better motion clarity than native/TAA, but obviously if you're setting everything to low4pro then native/TAA won't be an issue.
 