
Will DLSS 3 add latency?

It's obviously not going to be ideal for PvP FPS games; it will be for those who want to enjoy single-player games with maxed-out settings, including RT, at high resolutions such as 4K.

You can't deny, watching the above comparisons by DF, that the 120 FPS DLSS/frame-generation footage looks far smoother than the 60 FPS footage, which is one of the main reasons most PC gamers have high-refresh-rate displays.

The alternative is to simply reduce settings and/or run at a considerably lower resolution if you value latency that much, but obviously you aren't going to get the same visuals by doing that.
 
It's obviously not going to be ideal for PvP FPS games; it will be for those who want to enjoy single-player games with maxed-out settings, including RT, at high resolutions such as 4K.

You can't deny, watching the above comparisons by DF, that the 120 FPS DLSS/frame-generation footage looks far smoother than the 60 FPS footage, which is one of the main reasons most PC gamers have high-refresh-rate displays.
The reason I have high refresh was to get eye-hand-screen coordination that was closer to natural. I can play at 30 FPS fine with a controller; heck, CNC3 is locked at 30 FPS and is M+KB only. I see no use in it other than saying "we can do higher FPS than you", which is just gimmicky marketing. Oh look, a higher number, but it's not real!

Except hand-eye-screen coordination takes a huge hit; even on a controller you are seeing a higher latency number.

I don't see the point of it at all. It is in its infancy, though, so perhaps they can fix the huge latency nonsense.

Not for me though, the prices ensure I will stay salty.
 
The reason I have high refresh was to get eye-hand-screen coordination that was closer to natural. I can play at 30 FPS fine with a controller; heck, CNC3 is locked at 30 FPS and is M+KB only. I see no use in it other than saying "we can do higher FPS than you", which is just gimmicky marketing. Oh look, a higher number, but it's not real!

Except hand-eye-screen coordination takes a huge hit; even on a controller you are seeing a higher latency number.

I don't see the point of it at all. It is in its infancy, though, so perhaps they can fix the huge latency nonsense.

Not for me though, the prices ensure I will stay salty.

I also have a high-refresh screen for the same reason, but more so for the "smoothness" that comes from a high refresh rate AND high FPS (even when playing with a 360 controller). As said, if I'm playing something like RDR 2 or CP 2077, latency isn't an issue for my needs but FPS/smoothness is. If I'm playing something like CS:GO, BF or COD, I would be reducing settings in order to have a locked 170 FPS, obviously for smoothness and, most importantly, low latency, so there's no need for DLSS 3 frame generation.

They will probably be able to improve it over time, but going by Nvidia's comments a lot of it is down to the hardware: they stated that with lesser hardware such as Ampere, customers would complain about it feeling laggy or the IQ not looking good, which is apparently their reason for not bringing it to anything below the 40xx series.

Saw someone on another forum argue that DLSS3 will be incompatible with G-Sync. Any reason why this would be the case?

Link/source?

If true, that would be a massive oversight and would surely make this tech DOA. I'll be absolutely gobsmacked if that is the case.
 
How come power draw is lower with DLSS 3? Is it because the GPU is doing less work? Edit: I just read the article; the tensor cores are more efficient at this and offload work from the rest of the GPU (I'm not sure whether this is speculation on their part).

It's just one game, so it's hard to know for sure what's happening, but my guess is that the lower power draw with DLSS on is because the GPU is no longer running at full load due to a system bottleneck, either on bandwidth, RAM or CPU. I say this because I've never noticed lower power draw using DLSS on my own system, so the only explanation I can think of is that the framerate is high enough to hit a bottleneck somewhere else that prevents it going any higher.
 
Yes, exactly. Frame doubling in an FPS sense, but still the latency of the low FPS number.

This on top of screen latency and natural human latency.

LOL!

Good luck with any game requiring an ounce of accuracy from you; you will be too late to react.
But it's no worse than not using DLSS3. In short, if latency was too bad before with DLSS2+Reflex, DLSS3 isn't going to help you. But you'd be stuffed without DLSS3 as well, so you've not lost anything.
 
There's not a chance nVidia would have focused most of the presentation on DLSS 3 if it had some of the obvious flaws mentioned here. They know at this point that there are plenty of tech sites and YouTubers who will pull it apart if it's ****.

It's pretty much the only reason to buy a 4 series card so they'll no doubt have nailed it.
 
There's not a chance nVidia would have focused most of the presentation on DLSS 3 if it had some of the obvious flaws mentioned here. They know at this point that there are plenty of tech sites and YouTubers who will pull it apart if it's ****.

It's pretty much the only reason to buy a 4 series card so they'll no doubt have nailed it.

See Turing and DLSS 1... :p
 
You are all mostly missing the fact that the rendered frames are also DLSS2 upscaled, not native.

Using simple round numbers: if 4K native ran at 10 FPS, then rendering 1 frame takes 100 ms and 3 frames 300 ms. With DLSS 3, frames 1 and 3 may be rendered at 1080p and upscaled in 30 ms each. The intermediate frame is quicker to create, say 15 ms, so the total for frames 1 to 3 is 75 ms, less time than a single frame of 4K native.
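The arithmetic above can be sketched as a toy model. To be clear, the 100 ms, 30 ms and 15 ms figures are this post's round-number assumptions, not measured DLSS timings:

```python
# Toy timing model using the round numbers from the post above
# (assumed figures, not real DLSS measurements).

NATIVE_FRAME_MS = 100.0   # 4K native at 10 FPS -> 100 ms per frame
ANCHOR_FRAME_MS = 30.0    # assumed: 1080p render + DLSS 2 upscale
INTERP_FRAME_MS = 15.0    # assumed: interpolated intermediate frame

def three_frame_total() -> float:
    """Time to produce frames 1-3: anchor, interpolated, anchor."""
    return ANCHOR_FRAME_MS + INTERP_FRAME_MS + ANCHOR_FRAME_MS

total = three_frame_total()
print(f"DLSS 3, frames 1-3: {total:.0f} ms "
      f"(vs {NATIVE_FRAME_MS:.0f} ms for one native 4K frame)")
```

With these assumed figures, three DLSS 3 frames (75 ms) arrive faster than a single native 4K frame (100 ms), which is the point the post is making.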
 
But it's no worse than not using DLSS3. In short, if latency was too bad before with DLSS2+Reflex, DLSS3 isn't going to help you. But you'd be stuffed without DLSS3 as well, so you've not lost anything.
I'd like to see DLSS 2 + Reflex compared to DLSS 3 for latency, and the difference in FPS too, because I can very much bet that Nvidia wasn't showing graphs comparing DLSS 2 vs DLSS 3 with Reflex on for DLSS 2, since DLSS 3 has it baked in; again, misinforming the customer.

I would honestly think a lot of people would choose DLSS 2 + Reflex as it's a better balance: you might not get as high an FPS as with DLSS 3, but it's still a boost, while also making a reasonable reduction in latency to match the FPS boost.
 
You are all mostly missing the fact that the rendered frames are also DLSS2 upscaled, not native.

Using simple round numbers: if 4K native ran at 10 FPS, then rendering 1 frame takes 100 ms and 3 frames 300 ms. With DLSS 3, frames 1 and 3 may be rendered at 1080p and upscaled in 30 ms each. The intermediate frame is quicker to create, say 15 ms, so the total for frames 1 to 3 is 75 ms, less time than a single frame of 4K native.
I think the issue here is the latency of the input, not the latency of the frame.
 
I think the issue here is the latency of the input, not the latency of the frame.
Yes, but the total latency from input could be lower than native, depending on the speed of the DLSS rendering of the 2 anchor frames. The latency might not reflect the FPS, but there is no need to believe it is significantly worse than native, and visually you will get the smoothness of the higher FPS.
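The distinction being debated here can be made concrete with a rough sketch. This is purely illustrative and rests on two assumptions from the thread, not on Nvidia's published pipeline: one generated frame per rendered frame, and input sampled only on rendered frames:

```python
# Rough model of the "displayed FPS vs input latency" distinction
# (assumptions: 1 generated frame per rendered frame; input is only
# sampled on rendered frames, so latency tracks the rendered rate).

def displayed_fps(rendered_fps: float) -> float:
    """Displayed rate with one interpolated frame per rendered frame."""
    return rendered_fps * 2

def input_latency_ms(rendered_fps: float) -> float:
    """Crude lower bound: one rendered-frame interval."""
    return 1000.0 / rendered_fps

for fps in (30, 60):
    print(f"rendered {fps} FPS -> displayed {displayed_fps(fps):.0f} FPS, "
          f"input latency >= {input_latency_ms(fps):.1f} ms")
```

So under these assumptions, 60 rendered FPS displays as 120 FPS but still carries roughly the input latency of 60 FPS, which is why the smoothness and the latency numbers can diverge.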
 
You are all mostly missing the fact that the rendered frames are also DLSS2 upscaled, not native.

Using simple round numbers: if 4K native ran at 10 FPS, then rendering 1 frame takes 100 ms and 3 frames 300 ms. With DLSS 3, frames 1 and 3 may be rendered at 1080p and upscaled in 30 ms each. The intermediate frame is quicker to create, say 15 ms, so the total for frames 1 to 3 is 75 ms, less time than a single frame of 4K native.
We're talking about whether DLSS3 adds/reduces latency compared to DLSS2 - so upscaling is taken out of the equation (save for any gains in upscaling efficiency).


I'd like to see DLSS 2 + Reflex compared to DLSS 3 for latency, and the difference in FPS too, because I can very much bet that Nvidia wasn't showing graphs comparing DLSS 2 vs DLSS 3 with Reflex on for DLSS 2, since DLSS 3 has it baked in; again, misinforming the customer.

I would honestly think a lot of people would choose DLSS 2 + Reflex as it's a better balance: you might not get as high an FPS as with DLSS 3, but it's still a boost, while also making a reasonable reduction in latency to match the FPS boost.
Exactly. Unless your card can't keep up with Reflex/low latency (which is why it's not enabled by default) - in this case FRUC would visually compensate (but do nothing for input latency).
 
We're talking about whether DLSS3 adds/reduces latency compared to DLSS2 - so upscaling is taken out of the equation (save for any gains in upscaling efficiency).



Exactly. Unless your card can't keep up with Reflex/low latency (which is why it's not enabled by default) - in this case FRUC would visually compensate (but do nothing for input latency).
Just did a quick test myself. I've got a 5900X and a 3080 Ti; at 1440p with everything turned up to Ultra and Psycho settings and DLSS 2.x on Quality, I got just under 60 FPS on average... except my latency was around 22 to 28 ms.

There is a program called Special K which gives you a menu for all sorts of things, monitoring etc., but it also lets you switch on Reflex and Boost. I tried that and my latency just had a wider range of 14 ms to 28 ms.

So if the 4090 was using DLSS 3 and getting around 170 FPS on Quality, it was doubling the latency (of course, the settings in Cyberpunk 2077 could have been using the new RT Overdrive mode, I don't know).
 