
Will DLSS 3 add latency?

Why on earth would we PC gamers on high-refresh-rate displays want that!? :mad:

:p

I hope someone does a comparison of DLSS 3 vs DLSS 2/FSR/XeSS without showing fps; it will be interesting to see which one people pick out as looking the best, i.e. smoothest ;) :p Sadly YouTube only goes up to 60fps, so you aren't going to see the benefit with footage over 60fps (unless it is slowed down, as DF did). The main benefit will be for games like CP 2077, Portal RTX etc. maxed out, i.e. where fps would be 30-40 without any upscaling tech.
I think that as long as you hit 60fps locked (with dlss 3), you'll be fine in almost all scenarios.
 
As far as I can tell, if you have a game that plays at 22fps, no matter how you manage to speed it up using DLSS 3, it will still "play" on the input side like it is 22fps. So it doesn't add latency, but nor will it improve it.
 
If you're getting 22fps after applying DLSS 2.x, then your initial fps was around 11-14 +/-, so without it you'd still need to lower details or upgrade. But some games with the input lag of 22fps and, let's say, the display/fluidity of animations at around 30-40fps can still be played.
Then there are the games where DLSS 2.x gets you to 40-50fps and DLSS 3 ups the frame rate to 60fps plus, and that's very playable even for shooters.
Is this for people who want the input lag of native or DLSS 2.x at 120fps+? Most likely not.
 

I mentioned 22fps because one of the 4090 demos shows it getting that fps without DLSS.
 
In that case DLSS 2 would give about 50fps +/-, so the input lag would be around that of 50fps. With DLSS 3 you're safely over 60fps, which should be plenty.
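To put rough numbers on that (a back-of-envelope sketch with my own assumptions, not NVIDIA's figures; `dlss3_estimate` is just an illustrative helper): frame generation inserts one AI frame between every pair of rendered frames, so displayed fps roughly doubles while input latency still tracks the rendered-frame rate.

```python
# Sketch: DLSS 3 frame generation roughly doubles displayed fps, but the
# input side still responds at the rendered-frame rate.

def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given fps."""
    return 1000.0 / fps

def dlss3_estimate(rendered_fps: float) -> dict:
    """Estimate displayed fps and input latency with frame generation on."""
    displayed_fps = rendered_fps * 2  # one generated frame per rendered frame
    return {
        "displayed_fps": displayed_fps,
        "displayed_frame_time_ms": frame_time_ms(displayed_fps),
        "input_latency_ms": frame_time_ms(rendered_fps),  # unchanged by generation
    }

# e.g. ~50fps after DLSS 2 upscaling: smoothness of 100fps, feel of 50fps
print(dlss3_estimate(50))
```

So the "safely over 60fps" part is about what your eyes see; your hands still get roughly the 50fps frame time.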

The game is the Portal that Nvidia showed off; it had been given full path tracing using Remix. The 4090 got 20-25fps at native 4K and 100fps with DLSS 3 on.
 

That's a very good increase, which hopefully can be replicated in other GPU-bottlenecked games.

50fps can be pretty pants for input latency in a shooter; it might be fine, though, depending on the user.

It depends on the game. Some that I've played had a weird combination of lag and stutter even at 40-50fps, while others were much better at 30. I'm good with 50fps input lag and 60+ v-sync gameplay.
Even 30fps input lag can be OK for plenty of people in RTS, TBS, RPGs, action RPGs and even shooters. Not all genres require that high an input rate.

Is DLSS Performance universally accepted as being OK now, being the go-to setting for 3.0, despite lower IQ than FSR 2* Quality? :p
I see what you did there, but if the performance is enough you can always use Quality or Balanced as well. If nVIDIA goes from 24 to something like 50fps with Performance in DLSS 2 (and way higher with DLSS 3), does AMD do the same with FSR Quality, 24 to 50? I guess not.
nVIDIA will have the upper hand when you do need that performance, not when native or higher-quality modes of DLSS or FSR are good enough. Just for that, the green team would be my first choice if I were to upgrade now (even if the price was a bit higher). ;)

You may "hate" as much as you want, but dlss gave me a free generational upgrade for games that support it. Radeon VII would have gave me a higher electricity bill. And 16GB of RAM, 'cause 8GB are not enough :p
 
Not to mention DLSS Balanced and Performance are very good now. Can't comment on FSR 2.1 Balanced and Performance modes though.
Plus you're not CPU-limited anymore. So even if, let's say, FSR can run about the same as DLSS 2 now, it will be limited in CPU-bound scenarios, while the competition would offer a better experience through higher fps.
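A toy model of that CPU-bound point (all numbers and the `displayed_fps` helper are made up for illustration): upscaling can't push fps past the CPU's simulation rate, but generated frames can, since they're produced on the GPU without any new CPU work.

```python
# Toy model: in a CPU-bound scene the renderer can't exceed the CPU's
# simulation rate, so upscaling alone stops helping; frame generation still
# doubles displayed fps because the extra frames don't need the CPU.

def displayed_fps(gpu_fps, cpu_fps, frame_generation=False):
    rendered = min(gpu_fps, cpu_fps)  # the CPU caps the rendered rate
    return rendered * 2 if frame_generation else rendered

# GPU could do 140fps after upscaling, but the CPU caps simulation at 70fps
print(displayed_fps(140, 70))                          # upscaling alone: 70
print(displayed_fps(140, 70, frame_generation=True))   # with generation: 140
```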
 
I feel this DLSS 3 is also going to be only useful on a 4090-level card.

The 4080 12GB (really a xx60 Ti) is probably going to be badly bottlenecked by its memory bandwidth, and trying to run any game at DLSS 3 Performance won't look very good either, as you will need to run it at a lower resolution.

All conjecture till we see independent tests, but I just have a feeling that only the top-end cards will make use of it.
 

Where does it say that DLSS scales to different resolutions based on what card it runs on? Quality, Balanced, Performance, whatever mode will be the same lower resolution upscaled, be that an RTX 2060 or a 4090.
 
DLSS used to do that back in v1; it would use different resolutions for its presets depending on what GPU you have, but I think from v2 onwards they got rid of it.

My only complaint is that DLSS seems to have issues on ultrawide panels. Even using the latest version of DLSS and selecting Quality mode, it still makes games look more like Performance mode on a 21:9 screen, whereas Quality looks sharp and clean on 16:9.

Because of that I don't use DLSS on my monitor, only my TV.
 
You may "hate" as much as you want, but dlss gave me a free generational upgrade for games that support it. Radeon VII would have gave me a higher electricity bill. And 16GB of RAM, 'cause 8GB are not enough :p
'hate' :cry:

Can you show some examples?

Considering the scrutiny FSR's IQ has had from a few posters in this thread, there have been zero thoughts on possibly using lower-than-FSR* Quality IQ.

Oh, and DLSS isn't and never has been 'free'. :p
 
Good video by DF on some of the DLSS 3 concerns, including people nitpicking at them for highlighting FSR 2 artifacts and not DLSS 3's. The short answer is that their approach is perfectly valid, as shown by testing: FSR 2 artifacts happen on every frame, whereas with frame generation/DLSS 3 it's only the "fake" frame in between the two real frames that shows the artifacts. At 120+ fps you just don't see this in gameplay; the only way is by slowing down the footage and/or capturing that frame by itself.


Also, apparently frame generation has been in the works for 6 years :eek: Intel are looking into a similar solution too (makes sense given their focus on machine learning/AI), and eventually AMD will follow....
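For a sense of why the generated frames are hard to spot at high refresh rates (my own arithmetic, not numbers from DF's video): at 120 displayed fps each frame is only on screen for one frame time, and only every other frame is interpolated.

```python
# Back-of-envelope: how long any frame-generation artifact stays on screen.
displayed_fps = 120
frame_time_ms = 1000 / displayed_fps   # ~8.33 ms per displayed frame
generated_fraction = 0.5               # one generated frame per rendered frame
artifact_on_screen_ms = frame_time_ms  # an artifact lasts a single frame

print(f"{frame_time_ms:.2f} ms per frame; "
      f"{generated_fraction:.0%} of displayed frames are generated")
```

Eight-ish milliseconds is well under what most people can consciously pick out, which fits DF needing slow-motion capture to show the artifacts at all.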
 