
Will DLSS 3 add latency?

It's just one game, so it's hard to know for sure what's happening, but my guess is that the lower power draw with DLSS on is because the GPU is no longer running at full load due to a system bottleneck elsewhere, either on bandwidth, RAM or CPU. I say this because I've never noticed lower power draw using DLSS on my own system, so the only reason I can think of is that the framerate is high enough to hit a bottleneck somewhere else that prevents it going any higher.
In my experience running at a lower resolution results in lower power draw even where GPU usage is 99%+ in both instances. I see lower power usage with DLSS vs native for presumably the same reason: it's rendering at a lower resolution.
I've seen this consistently over the past few years, and it's therefore much easier for me to slam into my 3080 FE's power limits at 2160p compared to 1080p.
 
To be as green as possible you have to play at 24Fps for the ultimate green cinematographic cinematic XXX Sniper Elitez Vegan Muscle gaming experience.
 
Just did a quick test myself with a 5900X and a 3080 Ti: at 1440p with everything turned up to Ultra and Psycho settings and DLSS 2.x on Quality, I got just under 60 fps on average... except my latency was around 22 to 28 ms.

There is a program called Special K which gives you a menu to run all sorts of things (monitoring etc.) but also lets you switch on Reflex and Boost. I tried that and my latency just had a wider range of 14 ms to 28 ms.

So if the 4090 was using DLSS 3 and getting around 170 fps on Quality, it was doubling the latency (of course the Cyberpunk 2077 settings could have been using the new RT Overdrive mode, I don't know).
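
To put those numbers in perspective, here's a rough back-of-the-envelope conversion into "frames of lag" (just my own sketch in Python; the ~170 fps and roughly doubled latency are the figures quoted above, not anything I've measured):

```python
# Rough sketch: convert end-to-end latency into "frames of lag" at the displayed framerate.
# The figures below are just the ones quoted in this thread, not my own measurements.

def frames_of_lag(latency_ms: float, displayed_fps: float) -> float:
    """How many displayed frames pass between an input and the result appearing on screen."""
    frame_time_ms = 1000.0 / displayed_fps
    return latency_ms / frame_time_ms

# My 3080 Ti run: ~60 fps with ~25 ms latency
print(frames_of_lag(25, 60))    # ~1.5 frames of lag

# The 4090 DLSS 3 numbers being quoted: ~170 fps displayed, latency roughly doubled (~50 ms)
print(frames_of_lag(50, 170))   # ~8.5 frames of lag at the *displayed* framerate
```

In other words the screen updates far more often, but the image you're reacting to is many displayed frames old, which is why it can feel slower than the counter suggests.
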
Driver-level Reflex mode (i.e. low latency mode / reduced pre-rendered frames) won't work in DX12 games, as the game controls the frame queue in DX12, not the driver (that only applies to DX11 and lower). I'm guessing that program is running it at the driver level (which you can also do directly in the Nvidia control panel) rather than hooking into the game's engine.
 
Yeah, when I heard about how it worked I was worried about lag; I guess this will all be tested soon. I wouldn't be surprised if it basically feels horrible in FPS games but makes the numbers look good in the marketing. Other genres like RTS games might not be so bad, but then who cares about frame rates in those types of games?
 
Driver-level Reflex mode (i.e. low latency mode / reduced pre-rendered frames) won't work in DX12 games, as the game controls the frame queue in DX12, not the driver (that only applies to DX11 and lower).

Nvidia state that game level support is required for Reflex. I don't think this is a big deal for any major game studio.
 

[attached images: latency comparison figures]

Initial impressions suggest no serious issues with frame pacing either:

[attached image: frame pacing figure]



There will be more in-depth videos coming.
 
DLSS 3.0 *looks* like 120fps, *plays* like 60.

I don't like that nVidia has portrayed Reflex like it's something new with DLSS 3.0 - it's been around since the 30xx series - it's just that its adoption by game devs has been a bit patchy (mostly FPS games). If DLSS 3.0 prompts more devs to include it, that's a good thing, but clearly it's not able to do much to provide a true high-framerate experience when using frame generation.
 
DLSS 3.0 *looks* like 120fps, *plays* like 60.

I'm forgetting which, but there were some games which did that a while ago - I think one of the Wolfenstein ones maybe - they inserted duplicate frames to "double" FPS because of the 60 Hz locked physics, and you could tell immediately it wasn't doubled.
 
I'm forgetting which, but there were some games which did that a while ago - I think one of the Wolfenstein ones maybe - they inserted duplicate frames to "double" FPS because of the 60 Hz locked physics, and you could tell immediately it wasn't doubled.

Batman: Arkham Knight was another one. The game released locked to 60 fps IIRC, but all the cloth and ragdoll animations (or something like that) were locked to 30, and when PC players forced the game and those physics above 60 it broke some of the physics.




Regarding the artefacts that people highlighted in Spider-Man, it's worth watching Alex's explanation on this here:


Of course, people will say "shill", but his points are perfectly valid and backed up by footage with very good explanations.



Based on what we have atm, I think DLSS 3 looks very good. I get the "fake fps" and "soap opera" complaints, but if it works, looks good and provides the same look as "real" high fps, I don't see the issue.

- the latency isn't a huge increase "all the time", and in some cases is barely any higher than DLSS 2 + Reflex, and obviously far better than native; obviously for FPS/competitive shooters you won't be using it
- the artefacts produced by the generated frames look like they will be a complete non-issue when playing the game at normal speed, if Alex's explanation is anything to go by
- the "soap opera effect" is very different here to TV/films, which are shot at 24 fps; heck, people didn't even like The Hobbit when it was shot at 48 fps :p For gaming you want high fps on high refresh rate displays, so it won't be anything like watching a 24 fps film/TV show with motion interpolation on

And obviously this is the first version of it so naturally any kinks will get ironed out.

It absolutely sucks that it isn't coming to Ampere though. I can understand Nvidia's reasons of "customers will complain about it being laggy or having poor IQ", but they should at least give the choice; after all it is an option which can be turned on/off. Of course, we all know why it is being kept locked to the 40xx series...
 
I've seen worse: when people were asking for lower settings in some game, the devs replied that they didn't want their game to look bad, so they wouldn't offer lower settings. Nvidia's reason sounds plausible, even if I think the real reason is that there wouldn't be much of a difference between the two generations if you enabled the same features on the 3000 series.
 
DLSS 3.0 *looks* like 120fps, *plays* like 60.

I don't like that nVidia has portrayed Reflex like it's something new with DLSS 3.0 - it's been around since the 30xx series - it's just that its adoption by game devs has been a bit patchy (mostly FPS games). If DLSS 3.0 prompts more devs to include it, that's a good thing, but clearly it's not able to do much to provide a true high-framerate experience when using frame generation.

It's used on 900 series and up too...

[attached image]


It's all sounding like they have something to hide with these DF videos, and like they're trying to create fake hype.
 
It's used on 900 series and up too...

[attached image]


It's all sounding like they have something to hide with these DF videos, and like they're trying to create fake hype.

Yup, I'm still somewhat wary of the IQ and any other potential issues. There's too much focus on Spider-Man and CP2077 and nothing else... which is why I want to see a wider range of games outside the promotional titles for this new feature. As we all know from history, the PR showcases for new tech are generally great, but once it's out in the wild it ends up being a **** show, e.g. DLSS 1 and FSR 2 :p
 
- the latency isn't a huge increase "all the time", and in some cases is barely any higher than DLSS 2 + Reflex, and obviously far better than native; obviously for FPS/competitive shooters you won't be using it
The point is, DLSS 2.0 gets you latency appropriate to your actual framerate (or better, with Reflex) - DLSS 3.0 just doubles that framerate without halving the latency (and in fact actually worsens it compared with just DLSS 2.0 + Reflex).
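
A very simplified way to see why (my own toy model, not Nvidia's published pipeline; the only assumptions are that interpolation has to hold back one rendered frame and that generated frames never sample new input):

```python
# Toy model of why frame generation doubles displayed fps without halving latency.
# Assumptions (mine, not Nvidia's published pipeline): the interpolated frame needs the
# *next* rendered frame to exist, so one rendered frame is held back before display,
# and generated frames never sample new input.

def latency_estimate(base_fps: float, base_latency_ms: float, frame_generation: bool):
    frame_time = 1000.0 / base_fps
    if not frame_generation:
        return base_fps, base_latency_ms
    displayed_fps = base_fps * 2             # every rendered frame gets one generated frame
    latency = base_latency_ms + frame_time   # at least one extra rendered frame of buffering
    return displayed_fps, latency

print(latency_estimate(60, 35, frame_generation=False))  # (60, 35)      -> DLSS 2-style
print(latency_estimate(60, 35, frame_generation=True))   # (120, ~51.7)  -> looks 120, plays worse than 60
```

So the counter says 120, but the input-to-photon delay is the 60 fps figure plus roughly a frame, which lines up with the "looks like 120fps, plays like 60" comments.
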
 
The point is, DLSS 2.0 gets you latency appropriate to your actual framerate (or better, with Reflex) - DLSS 3.0 just doubles that framerate without halving the latency (and in fact actually worsens it compared with just DLSS 2.0 + Reflex).

Yup, I get that, but I think I'd rather have a more fluid/smoother-looking game with a slight hit to latency (especially since I use a controller more often than K+M, so the latency increase won't be as noticeable), but each to their own.

A good article by the best monitor reviewing site:


  • Class 1) Less than 16 ms / one frame of lag – should be fine for gamers, even at high levels
  • Class 2) A lag of 16–32 ms / one to two frames – moderate lag but should be fine for many gamers
  • Class 3) A lag of more than 32 ms / more than two frames – some noticeable lag in daily usage, not suitable for high-end gaming

If we're talking about 50+ ms for, say, 90% of the time, then obviously that won't be great, but if that's only for a small % of the time and the latency is under 40 ms 90% of the time, I can deal with that. As said, lag, fps, what input device you use, the games you play etc. is all very subjective.
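
Quick sketch of how I'm looking at it (the class boundaries are the ones from the article above; the sample readings are made up purely to show the percentile idea):

```python
# Classify latency samples using the article's classes and check what share stays under a target.
# The thresholds come from the list above; the sample values are invented for illustration only.

def latency_class(ms: float) -> int:
    if ms < 16:
        return 1    # Class 1: under one frame (at 60 Hz)
    if ms <= 32:
        return 2    # Class 2: one to two frames
    return 3        # Class 3: more than two frames

def share_below(samples, target_ms):
    return sum(s < target_ms for s in samples) / len(samples)

samples = [34, 36, 38, 39, 41, 37, 55, 36, 38, 40]   # made-up DLSS 3 latency readings in ms
print([latency_class(s) for s in samples])           # mostly Class 3 by the article's definition
print(share_below(samples, 40))                      # 0.7 -> under 40 ms for 70% of the readings
```

Even if the worst readings land in Class 3, the share of time spent under a given threshold matters more to me than a single spike.
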
 
Yup, I get that, but I think I'd rather have a more fluid/smoother-looking game with a slight hit to latency (especially since I use a controller more often than K+M, so the latency increase won't be as noticeable), but each to their own.
You'll notice that all of those latency numbers listed above for DLSS 3.0 fall into Class 3/'more than 32ms' - some are nearly double that.
 
You'll notice that all of those latency numbers listed above for DLSS 3.0 fall into Class 3/'more than 32ms' - some are nearly double that.

2 of the 3 DLSS 2 + Reflex results also fall in that category; well, one is 1 ms off from falling into the Class 3 category tbf :p

That's why I said I want to see more footage in areas outside their chosen testing spots, as well as other games ;) It could be worse, it could be better; plus this is still an early build they are using/testing, so there might be improvements. Not to mention I want to test it for myself on my setup.

Either way, it's good to have this "option"; as we have all said, there will be cases where you won't want to use it and cases where it will be most welcome (if there are no severe issues).



How long do we give AMD to catch up on providing a similar feature? :D
 
More latency than DLSS 2 is a massive fail for me; I suspect it will be particularly noticeable in M+KB shooters :)
 
So, the takeaway is: just use DLSS 2 and get the benefit of some performance uplift and lower latency :D

Depends on the game. Some games have lower latency with DLSS 3 and some with DLSS 2.

I suspect GPU-bottlenecked games are the ones where DLSS 3 has slightly lower latency than DLSS 2, and in CPU-bottlenecked games the latency is higher on DLSS 3 than 2. DF didn't show it, but they even mentioned that if your CPU is weak enough and the CPU bottleneck bad enough, DLSS 3 will actually make performance worse by amplifying stuttering (rough sketch of that at the bottom of this post).

So you definitely don't want to use DLSS 3 in competitive first person shooters; those games are all heavily CPU bottlenecked, DLSS 3 probably adds a ton of latency, and any stuttering or frame drops get doubled.

DLSS 3 is, once again, just a trick to boost heavy ray tracing/path tracing performance, where the GPU is the reason why FPS is low and DLSS 3 can help without hurting the pipeline.
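
Rough toy model of that stutter point (my own made-up numbers; the only assumption is that each generated frame lands halfway between two rendered frames, which is the best case):

```python
# Toy model: frame generation can't hide a CPU-side hitch, it just splits it in two.
# Assumption (mine): one generated frame is presented halfway between consecutive rendered frames.

rendered_ms = [16.7] * 5 + [100.0] + [16.7] * 5   # steady 60 fps rendering with one 100 ms hitch

displayed_ms = []
for interval in rendered_ms:
    displayed_ms += [interval / 2, interval / 2]  # each render interval becomes two display intervals

print(max(rendered_ms))    # 100 ms hitch in the rendered stream
print(max(displayed_ms))   # still a 50 ms gap on screen, i.e. a very visible stutter at "120 fps"
```

And that's the optimistic version; DF's point was that with a bad enough CPU bottleneck the pacing ends up worse than this, not better.
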
 