
Will DLSS 3 add latency?

'hate' :cry:

Can you show some examples?

Considering the scrutiny FSR's IQ has had from a few posters in this thread, there have been zero thoughts on possibly using lower-than-FSR* Quality IQ.

Oh, and DLSS isn't and never has been 'free'. :p
I was joking with the hate, hence the quotation marks :p.

I know it's not "free", but it is as "free" as it can be. I doubt they could have used that silicon better:

18fps native


vs.

54fps dlss 2 performance


This particular instance is on my RTX 2080 (so not even a 3xxx series card): a 3x increase with DLSS 2, and this is rasterization only, NO RT enabled.
Is it like that always? No, it can vary, but the difference between DLSS Performance and native is huge, and the artifacts you get here and there are well worth what you get in exchange. To me it's the "fine wine" AMD could not offer at the time with the Radeon VII - which turned out to be not such a great card after all for gamers.

Would I like to run it native? Sure, just let me know a GPU that can do native 60fps @ 5760*1080 for around $300 and I'll take it! :D
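Purely as a quick Python back-of-envelope for where that 3x can come from (a minimal sketch: the 0.5x-per-axis render scale for the Performance preset is the commonly cited figure, and the purely pixel-bound assumption is mine, not anything measured here):

```
# Back-of-envelope: DLSS Performance renders at ~0.5x per axis (commonly
# cited figure), so the GPU shades roughly a quarter of the output pixels
# before the upscale. A purely pixel-bound workload is an assumption.
output_res = (5760, 1080)   # triple-wide 1080p target from the post
scale = 0.5                 # assumed Performance-preset render scale

internal = (int(output_res[0] * scale), int(output_res[1] * scale))
pixel_ratio = (output_res[0] * output_res[1]) / (internal[0] * internal[1])

print(f"internal render resolution: {internal[0]}x{internal[1]}")  # 2880x540
print(f"pixels shaded vs native: 1/{pixel_ratio:.0f}")             # 1/4

# A perfectly pixel-bound game would predict up to 4x; the observed
# 18 -> 54 fps is 3x, which is plausible once the upscale's own
# per-frame cost is taken into account.
print(f"observed uplift: {54 / 18:.1f}x")
```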
 
I thought by 2023 we'd have GPUs with the performance of an RTX 3090 (a 4K 60fps GPU) selling for $300-350.

I was wrong; they are still trying to sell 1080p/1440p 60fps GPUs.
 
Saw this?



From the looks of this footage:


Seems like it is only noticeable when running at the slowest playback speed; at normal speed it doesn't look like you would notice it, especially when viewing at higher than 60 fps/Hz.
 
Didn't know which way to take it. :)

The bit about the artifacts being "well worth it" is partly why I'm asking questions on using the Performance preset for inserting frames.

Nvidia's promotion partner DF claims (on behalf of Nvidia) that DLSS doesn't artifact, if I'm reading Nexus' post correctly.

IQ is reduced in Performance mode; what's the possibility that this degradation causes and amplifies artifacts in games while going under the radar? Quality has been the talking point since FSR arrived; now it's out the window. :cry:


Wonder if DLSS will be getting 'gimped' - in the sense that it remains static - if we don't upgrade the way NV are accustomed to. It's not as if NV haven't dropped support in the past and left previous gens' performance lacking until they got called out.
 
Right, half the price or less for the same performance - since it can offer up to 3x the performance with the DLSS silicon. Which card does that? AMD sure doesn't, although to be fair such hardware isn't present on their cards.
I honestly look at people hyping this up (and FSR - I dislike both equally) and I wonder how you are falling for marketing so easily.

Most of the people here are buying high-end cards and changing them every generation; you ain't struggling to hit 60fps. Some of you probably haven't even seen a game running at sub-70fps in years, and you are out here talking about how badly you want DLSS in your next graphics card. For what, posting in the benchmark thread?

If you are at a stable 60fps then you are good as far as the gaming experience goes. Turning on DLSS (especially when you can't be bothered to optimise your settings) is just so you don't feel like you wasted your money by not using every feature the graphics card came with. That's all. You didn't need to turn it on; you wanted to, to get your money's worth.

IMO DLSS 2 and 3 are only useful if you are struggling to hit 60fps (after tweaking settings). Coincidentally, if you're not hitting 60fps then I reckon DLSS 3 will play like **** due to the latency, since the base frame rate before adding in fake frames will probably be in the low 30s.
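To put rough numbers on that latency point, a quick Python sketch (assuming the commonly described behaviour of one generated frame per rendered frame; the figures are illustrative, not measured):

```
# Sketch: frame generation inserts one generated frame between each pair
# of rendered frames (the commonly described DLSS 3 behaviour), so the
# displayed fps roughly doubles, but input is only sampled on rendered
# frames, so responsiveness still tracks the base frame rate.
def frame_generation_numbers(base_fps: float) -> dict:
    rendered_frame_time_ms = 1000.0 / base_fps
    displayed_fps = base_fps * 2  # one fake frame per real frame
    return {
        "displayed_fps": displayed_fps,
        "input_latency_floor_ms": rendered_frame_time_ms,
    }

for base in (30, 60):
    n = frame_generation_numbers(base)
    print(f"base {base} fps -> shows {n['displayed_fps']:.0f} fps, "
          f"input latency floor ~{n['input_latency_floor_ms']:.0f} ms")

# base 30 fps -> shows 60 fps, input latency floor ~33 ms
# base 60 fps -> shows 120 fps, input latency floor ~17 ms
# i.e. a low-30s base still *feels* like low-30s, whatever the counter says.
```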

Another thing that is funny: if a game runs poorly on high-end hardware, people say the devs are lazy for not optimising the game - unless Nvidia needs to sell you a graphics card, then it's fine and there are no issues. What does CP2077 need a new, higher RT mode for? They should be optimising what they have so it runs better. What is this nonsense about adding a new, higher RT level?
 

The fake frame produced in between the 2 real frames does artifact, but because it only happens on the single generated frame sandwiched between real frames, and because you're at 100+ fps, it is nigh on impossible to spot during normal gameplay. That is why Alex/DF etc. have to slow the footage down in order to show it, as well as because they can't show just how much better/smoother the game is due to YouTube's 60fps limitation.
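For a sense of scale, here's the per-frame display-time arithmetic (the fps values are just example figures, and the takeaway in the comments is my own framing):

```
# How long a single frame (generated ones included) stays on screen at
# various output rates; the fps values are arbitrary examples.
for displayed_fps in (60, 100, 120, 165):
    frame_time_ms = 1000.0 / displayed_fps
    print(f"{displayed_fps:>3} fps -> each frame visible ~{frame_time_ms:.1f} ms")

# At 120 fps an artifacting generated frame flashes for ~8 ms between two
# clean rendered frames, which is why frame-by-frame playback is needed
# to actually see it.
```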

FSR 2/2.1 and even DLSS 2 will show "artifacts" in the "real" frames, especially if the native image has flaws. However, the problem with FSR 2 is that it happens far more often, and it produces artifacts which aren't even there in the native image, let alone with DLSS (at least not to the same "extent") - e.g. the following, and this isn't even the worst-case frame either.....

[screenshot: Uri9InX.png]

And Deathloop is probably the best showcase for FSR. That's why they have more issues with FSR 2 than with DLSS 3 + frame generation, especially since the issues with FSR 2 are noticeable at normal gameplay speed.

DLSS Performance and even Balanced were very bad not that long ago, but in recent months - going by my own testing, a few others' and even the recent footage you see now - the Balanced and especially Performance presets for DLSS have come a long way, e.g.


I haven't looked too much into the FSR modes though, so maybe it is improved/better, especially with 2.1. Obviously Quality will always give the best IQ, especially at resolutions below 4K, but for games like CP's Overdrive mode, and probably Portal RTX, where native performance is about 30fps @ 4K, Balanced and Performance will have to be used in order to achieve an "acceptable" latency when using "frame generation".
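Roughly the chain that implies, as a sketch (the per-preset uplift factors are assumptions loosely based on the ~3x Performance-mode uplift quoted earlier in the thread, not measurements):

```
# Chain of reasoning above as arithmetic: upscaling raises the *rendered*
# frame rate (which latency follows), frame generation then doubles the
# *displayed* rate. Per-preset uplifts below are assumed, loosely based
# on the ~3x Performance-mode uplift quoted earlier in the thread.
NATIVE_FPS = 30  # e.g. a CP Overdrive / Portal RTX class load at 4K
UPLIFT = {"quality": 1.7, "balanced": 2.2, "performance": 3.0}  # assumed

for preset, uplift in UPLIFT.items():
    rendered = NATIVE_FPS * uplift
    displayed = rendered * 2              # frame generation
    latency_floor_ms = 1000.0 / rendered  # input follows rendered frames
    print(f"{preset:>11}: ~{rendered:.0f} fps rendered, "
          f"~{displayed:.0f} fps shown, latency floor ~{latency_floor_ms:.0f} ms")

# Under these assumptions, the lower-IQ presets buy a meaningfully lower
# latency floor - which is the "acceptable latency" trade described above.
```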

I think DLSS 3/frame generation looks very promising, but as was the case with FSR 2 and Deathloop, let's wait and see how it looks/performs outside of Nvidia's PR games. It sounds like it is pretty much confirmed Intel will be getting something out ASAP, and possibly AMD too. As Richard said, machine learning/AI isn't just a graphics thing, and it especially isn't about "cheating"; it is being utilised in every industry where possible now due to how powerful it is.

As for the last point, I wouldn't be surprised, but if I read it right, DLSS Super Resolution (i.e. DLSS 2) will still work exactly the same. Time will tell on this; I've taken a few screenshots of before and after DLSS 2 is turned on in CP, just in case they do try any dodgy **** :p


Personally I would love DLSS 3 frame generation for my 3440x1440 175Hz QD-OLED display. Obviously 60-80fps is entirely playable and a good experience, but once you have seen a locked 140+ fps on QD-OLED, it is a thing to behold, and no GPU can deliver that right now, especially without DLSS/FSR. Obviously it's not "needed", and certainly not worth an outlay of £800+ though.

As for ray tracing: as said a few times now, what we have seen with RT so far, even in Metro EE and CP 2077, is very much just the tip of the iceberg; so many compromises have had to be made to RT graphical effects due to the lack of grunt, even with DLSS/FSR. As long as we have sliders/options to control the settings, I don't see the problem. The main thing I would worry about is these SER-specific optimisations for 40xx hardware....
 
So you're agreeing with DF that DLSS doesn't artifact?

DF/Richard's comment was a very general observation/statement; you should know that it isn't as clear-cut as "they produce artifacts". There is far more to it than simply stating three words: the frame rate you're at, the resolution you're at, the type of game and the settings used all determine how good FSR/DLSS can look, e.g. motion blur, DOF, chromatic aberration, lens flare, sharpening and AA all impact the end result. Not to mention there are many kinds of artifacting too; FSR 2 suffers from several different kinds of artifacting.

If it's just a simple question of whether I agree that DLSS "never" artifacts? Obviously not, given this in my post:

FSR 2/2.1 and even DLSS 2 will show "artifacts" in the "real" frames, especially if the native image has flaws. However, the problem with FSR 2 is that it happens far more often, and it produces artifacts which aren't even there in the native image, let alone with DLSS (at least not to the same "extent") - e.g. the following, and this isn't even the worst-case frame either.....

The question raised to DF was why they nitpick FSR artifacts but don't call out DLSS 3/frame generation artifacts, and it's pretty simple, especially since we are now talking about DLSS Performance mode, as that is what Nvidia are using for their PR. As Alex said, literally the only way to capture the frame which has heavy artifacts, i.e. the fake frame, is by slowing the footage right down, i.e.

[screenshot: hxr1bfg.png]

FSR 2 produces all kinds of artifacts which are visible in the majority of frames, especially with "Performance" mode, compared to DLSS:

[screenshot: 2qF0Wd0.png]

[screenshot: ade5tvP.png]

[screenshot: W54MNd1.png]

Which one produces fewer artifacts there? Bearing in mind as well that Deathloop is by far the best showcase for FSR 2 so far...

As said, FSR 2.1 has come a long way, so I can't comment on how it looks/performs, since we have very little footage from any tech journalist currently (more so from ones who do proper in-depth analysis like DF).
 
I'm presuming you haven't watched the video, Tommy, if you're going by what I said in my post? If so, here it is summed up:

Richard - "a lot of criticism on artifacts and how the artifacts in fsr 2 were treated in a different way, thing with FSR 2 artifacts is you can see them persist in frame after frame after frame, what happens with dlss 3 is those artifacts are strobing between perfect frames and when you're talking about 120 fps and higher gameplay, they're really difficult to see and they're not continuous from frame to frame"

Based on that comment alone:

- I wouldn't say FSR 2 produces artifacts "frame after frame after frame"; there are definitely a few scenes where FSR 2 looks and plays fine, however those are very rare based on my own experience
- I wouldn't say DLSS (without frame generation) gives "perfect" frames "all" the time; heck, even in their own example above you can see an "artifact" in the 3rd frame
 
That's a better explanation: you don't agree with what DF are saying. :)

What I'm looking for in general is opinions on whether DLSS Performance > FSR 2* Quality.

Therefore, does DLSS 3 frame insertion get you higher fps + lower IQ?
 

I don't agree with certain comments, nor with how they have portrayed the situation, but I do agree with their "overall" sentiments/explanation as backed up by their footage. Worth keeping in mind as well that we are still waiting on the proper, complete review video on DLSS 3; so far, any footage/comments from DF have been based purely on Nvidia's PR format, i.e. only CP 2077 and Spider-Man under certain settings and test scenes/areas.


As for the overall IQ of DLSS Performance mode vs FSR 2 Quality, I will need to see footage of that. All I know is that DLSS Balanced + Performance are considerably better recently; at one time I could only use them when playing at 4K, but even at 3440x1440 they are certainly much more usable now. I have only ever tried FSR 2 Quality mode, and even then it left a lot to be desired imo.

I might do a couple of comparisons in CP 2077 and whatever other FSR 2 titles I still have installed later - just screenshots for now though. I suspect that in terms of clarity FSR 2 Quality will look better, and that when it comes to temporal stability, i.e. shimmering, aliasing and overall artifacts, DLSS Performance will look better than FSR 2 Quality.

Suppose you can sum it up as follows:

- DLSS 3 Performance mode + frame generation will give you a better-looking game "overall" at the cost of "clarity/sharpness", as you'll be able to dial up the graphical settings, especially RT
- DLSS 2/3 Quality mode and FSR 2/2.1 Quality mode will give you a sharper/clearer image, but if you want to achieve fps like the above setup, you'll have to sacrifice graphical settings and/or RT, which in turn will result in lesser visual quality "overall" (see the quick sketch below)
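One way to put numbers on the Performance-vs-Quality question (a sketch using the commonly published per-axis scale factors for both upscalers - roughly 0.667x for Quality, 0.58x for Balanced and 0.5x for Performance; treat these as assumptions rather than anything confirmed in this thread):

```
# Compare how many pixels each preset actually renders before upscaling.
# Scale factors are the commonly published ones (assumptions here):
# Quality ~0.667x per axis, Balanced ~0.58x, Performance 0.5x.
PRESET_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_res(output: tuple[int, int], preset: str) -> tuple[int, int]:
    s = PRESET_SCALE[preset]
    return round(output[0] * s), round(output[1] * s)

output = (3440, 1440)
for preset in ("quality", "performance"):
    w, h = internal_res(output, preset)
    print(f"{preset:>11}: renders {w}x{h} ({w * h / 1e6:.2f} MP)")

#     quality: renders 2293x960 (2.20 MP)
# performance: renders 1720x720 (1.24 MP)
# So DLSS Performance reconstructs from ~56% of the pixels FSR 2 Quality
# starts with - which is why the recent Performance-mode improvements
# matter for this comparison.
```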
 
OK, seems CP 2077 still doesn't have official FSR 2 integration.....

DLSS quality vs performance @ 3440x1440:




Personally, Performance looks to be holding up pretty well compared to Quality imo; both are at 0 sharpness too.

In motion, Performance isn't quite as clear as Quality, but it's certainly not bad, so this is where it will be interesting to see what happens with frame generation enabled: will frame generation make Performance mode look much better in motion, or will any artifacts in the frames be amplified? I'm going to say that if fps is above 90 you'll be hard pressed to notice, but if fps is around 60 the issues might be more apparent than when just using DLSS without frame generation.
 
Yeah, from the 20fps range you'll get to a stable 60fps by "optimizing settings". :D
Even with everything on low I still can't keep a locked 60fps at native resolution - granted, Cyberpunk can look surprisingly good like that.
 