
NVIDIA announces DLDSR (Deep Learning Dynamic Super Resolution), coming to GeForce drivers on Jan 14

Soldato
Joined
26 May 2014
Posts
2,944
I don't understand how they are able to render at 2.25x resolution on the card without any loss of performance? That doesn't sound right to me!
They have the framerate capped (probably with v-sync enabled on a 144Hz monitor). A card which only achieves 145fps in a game at 1080p isn't getting 108fps at 4K. That's not a possible outcome when you're pushing 4x the pixels, unless you were CPU limited at lower resolutions. However, the game is Prey, which isn't very demanding and doesn't suffer from CPU bottlenecking. From looking at some benchmarks, a 1080 Ti gets over 250fps at 1080p in that game on the highest preset, so they clearly have the framerate capped here to make it seem like the 2.25x option isn't losing you any performance. Incidentally, a 1080 Ti gets ~67fps at 4K in the same game, so they're testing with something much more powerful than that to achieve 108fps rendering at 4K.
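The scaling argument in that post can be sanity-checked with some back-of-the-envelope arithmetic. This is just an illustrative Python sketch of the "purely pixel-limited" assumption, not anything from a driver or benchmark tool:

```python
# If a game is purely GPU/pixel-rate limited, frame rate scales roughly
# inversely with the number of pixels rendered. This is a naive model
# (real games are rarely this clean), used only to sanity-check the post.
def naive_fps_at(fps_base, res_base, res_target):
    """Estimate fps at a new resolution assuming pure pixel-rate scaling."""
    pixels_base = res_base[0] * res_base[1]
    pixels_target = res_target[0] * res_target[1]
    return fps_base * pixels_base / pixels_target

# Figures quoted above for a 1080 Ti in Prey: ~250 fps at 1080p.
est_4k = naive_fps_at(250, (1920, 1080), (3840, 2160))
print(round(est_4k))  # 62 -- in the same ballpark as the ~67 fps benchmark cited
```

The estimate lands close to the ~67 fps benchmark figure mentioned, which supports the point: a card doing ~145 fps at 1080p cannot do 108 fps at 4x the pixels unless something else (a cap or a CPU limit) is holding the 1080p number down.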
 
Associate
Joined
8 Sep 2020
Posts
1,432
The way I see it, I'd rather use DLSS or anything else that gives me higher frames at the same image quality than insist on running native 4K for the same image quality with fewer frames.

The last game I tried DLSS on was Cyberpunk 2077. At 4K it looks great with Quality DLSS, though it definitely wasn't as sharp as native 4K, but to be honest it was either that or a slideshow.

This is more for those games where you have no DLSS/FSR option and have to run native, in which case this should come into its own, as it will give you an improvement in image quality for no hit in performance, or at least that's how it's being advertised. While DLSS is brilliant and gets better with every iteration, it still isn't quite as good-looking as running a game native, let alone one supersampled from a higher res. DLDSR and DLSS are both very good options to have depending on what you are after: higher FPS, or better-looking IQ with no hit to native performance :)
 
Associate
Joined
8 Sep 2020
Posts
1,432
They have the framerate capped (probably with v-sync enabled on a 144Hz monitor). A card which only achieves 145fps in a game at 1080p isn't getting 108fps at 4K. That's not a possible outcome when you're pushing 4x the pixels, unless you were CPU limited at lower resolutions. However, the game is Prey, which isn't very demanding and doesn't suffer from CPU bottlenecking. From looking at some benchmarks, a 1080 Ti gets over 250fps at 1080p in that game on the highest preset, so they clearly have the framerate capped here to make it seem like the 2.25x option isn't losing you any performance. Incidentally, a 1080 Ti gets ~67fps at 4K in the same game, so they're testing with something much more powerful than that to achieve 108fps rendering at 4K.

Tested with the 3090 Ti, the only card capable of reaching 108fps at 4K native :D
 
Associate
Joined
8 Oct 2020
Posts
2,282
It does sound promising, although I can see it being like DLSS in that it will require a second revision before it's widely used.

DLSS was revised because the original implementation was tedious (it had to be trained per game) and had a couple of other performance issues. This would likely use the current approach, where the training is generic.
 
Soldato
Joined
7 Oct 2003
Posts
3,886
Location
York
I'm looking forward to trying this. I run The Witcher 3 at 2.25x DSR on my 1440p monitor, so any improvement in either fps or IQ from DLDSR will be good :)
 
Associate
Joined
8 Oct 2020
Posts
2,282
I hope that is the case. I guess this is sort of a second revision anyway as an extension of DSR.

Yeah, although it's driver-level instead of being in the game, so I'm not sure whether full integration would make a difference. Between Intel, AMD and Nvidia, there's a benefit to not requiring games to implement your technology directly, unless doing so creates very little overhead and is an improvement over driver-level integration.
 
Soldato
Joined
19 Feb 2007
Posts
14,258
Location
ArcCorp
Wonder how much they paid him for that and whether it'll be an Nvidia exclusive from now on

From what I read on McFly's Discord, the Nvidia version will have limits, while his versions are more fully fleshed out and will continue to be available and updated.

Quote from him -

in essence, it's a customized RTGI version for Nvidia FreeStyle. It has reduced sliders (quality, effect radius, intensity and fade-out range) and is also embedded into the driver, so no transferring to ReShade. I don't think it's in competition with ReShade, as you tend to use one or the other and not both together, and it doesn't have the extended features that the new RTGI versions have.
However, it's working on games that ReShade isn't working on, and it works out of the box, if you have an Nvidia card.
 
Associate
Joined
8 Sep 2020
Posts
1,432
They have the framerate capped (probably with v-sync enabled on a 144Hz monitor). A card which only achieves 145fps in a game at 1080p isn't getting 108fps at 4K. That's not a possible outcome when you're pushing 4x the pixels, unless you were CPU limited at lower resolutions. However, the game is Prey, which isn't very demanding and doesn't suffer from CPU bottlenecking. From looking at some benchmarks, a 1080 Ti gets over 250fps at 1080p in that game on the highest preset, so they clearly have the framerate capped here to make it seem like the 2.25x option isn't losing you any performance. Incidentally, a 1080 Ti gets ~67fps at 4K in the same game, so they're testing with something much more powerful than that to achieve 108fps rendering at 4K.

I just downloaded Prey, as I got it free on the Epic Games Store, and it's not that they have v-sync enabled: the game itself is capped at 145fps. At 4K it sits at 145fps with around 85% load on the GPU, and the CPU just sits at 5%, so no CPU bottleneck either. It is somewhat misleading of them to use a game that has a hard cap in place, as that clearly doesn't paint the whole picture, although they are stating it increases performance over using DSR alone, just probably not as good as native. We shall find out tomorrow, but either way, as long as it improves over standard DSR and doesn't cost anything, then I will take that.
 
Associate
Joined
26 Jun 2015
Posts
668
Way I see it is I'd rather use DLSS or anything that gives me higher frames for the same image quality rather than insisting on running at 4k for same image quality with less frames.

Last game I tried DLSS on, at 4k Cyberpunk 2077 looks great with Quality DLSS, however it definitely wasn't as sharp than native 4k, but to be honest it was either that or slideshow.
Or actually go into the settings and play about with them? So many pc gamers don't seem to do this any more.
 
Associate
Joined
8 Oct 2020
Posts
2,282
Or actually go into the settings and play about with them? So many pc gamers don't seem to do this any more.

Some games have 20+ settings with varying degrees of influence, which is also inconsistent between games as implementations differ. Yes, I agree that you can get similar results by turning a few things down a notch or 2, but most people would rather spend their time actually playing the game.

If you can run max settings with DLSS and get close to native quality with playable FPS, then that's a really easy solution. If it's a less intense/well optimised title, and you have some FPS/GPU to spare, DLDSR is a nice option to have especially as an alternative to traditional resolution scaling.
 
Associate
Joined
17 Oct 2009
Posts
2,334
Is there some tech that uses DLSS to improve native without upscaling? The example they give of DLDSR still renders the game at higher than native (2.25x vs 4x for standard DSR).

I noticed No Man's Sky used DLSS AA, but it messed up some reflections.

Edit: Punctuation and spacing
 
Last edited:
Associate
Joined
8 Oct 2020
Posts
2,282
Is there some tech that uses DLSS to improve native without upscaling? The example they give of DLDSR still renders the game at higher than native (2.25x vs 4x for standard DSR). I noticed No Man's Sky used DLSS AA, but it messed up some reflections.

It's called DLAA; it basically cleans up / runs anti-aliasing against the native-resolution image. Not sure whether it's actually in any recent games, although I feel like I've seen the option somewhere.
 
Associate
Joined
17 Oct 2009
Posts
2,334
It's called DLAA, basically cleans-up/runs anti-aliasing against the native resolution image. Not sure whether it's actually in any recent games although I feel like I've seen the option somewhere.
Yep, I would have used it in NMS, but there was an issue running reflections at max, which was fixed by running them below max; turning on DLAA removed the reflections at those lower levels, so I turned it off.

I know DLSS can be used to upscale, so 1080p -> 4K for example, and it's supposed to improve the picture by doing so (rather than just running 1080p on a 4K display). DLDSR seems to be for running above native, hence rendering with DLDSR at 2.25x being pitched as equivalent to running DSR at 4x. I just want it to take my 1080p image, work the DLSS magic on it as if it were running at 4K, but still output a 1080p image, as that's what my screen is. Unless DLDSR comes with a 1x render resolution...
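One point of confusion in the factor naming is that DSR/DLDSR factors multiply the total pixel count, not each axis, so the per-axis scale is the square root of the factor. A small illustrative sketch (not any NVIDIA API, just the arithmetic):

```python
import math

# DSR/DLDSR factors multiply the TOTAL pixel count, so each axis is
# scaled by the square root of the factor (2.25x total => 1.5x per axis).
def render_resolution(native, factor):
    """Compute the internal render resolution for a given DSR/DLDSR factor."""
    scale = math.sqrt(factor)
    return (round(native[0] * scale), round(native[1] * scale))

print(render_resolution((1920, 1080), 2.25))  # (2880, 1620)
print(render_resolution((1920, 1080), 4.0))   # (3840, 2160) -- 4K from 1080p
print(render_resolution((2560, 1440), 2.25))  # (3840, 2160) -- 4K from 1440p
```

So DLDSR 2.25x on a 1080p screen renders at 2880x1620, while legacy DSR needs the full 4x (3840x2160) to get a comparable result, which is where the claimed performance advantage comes from. A hypothetical "1x" factor, as wished for above, would be DLAA: same render resolution as native, with only the AI clean-up pass.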
 
Man of Honour
Joined
23 Dec 2002
Posts
9,979
Location
London
Does anyone know if DLDSR will make some existing techniques such as MSAA redundant? Right now MSAA carries quite a large performance hit.
If DLDSR runs on the tensor cores with little performance impact, does that mean we can stop using MSAA and indirectly get a performance increase?
 
Associate
Joined
26 Jun 2015
Posts
668
Some games have 20+ settings with varying degrees of influence, which is also inconsistent between games as implementations differ. Yes, I agree that you can get similar results by turning a few things down a notch or 2, but most people would rather spend their time actually playing the game.

If you can run max settings with DLSS and get close to native quality with playable FPS, then that's a really easy solution. If it's a less intense/well optimised title, and you have some FPS/GPU to spare, DLDSR is a nice option to have especially as an alternative to traditional resolution scaling.

Having those settings is part of the PC gaming experience. By that merit, most of these guys should stick with consoles; this is probably the reason the FF7 remake has such barren settings.
 
Soldato
Joined
3 Jan 2006
Posts
24,945
Location
Chadderton, Oldham
Then using dlss isn't that....

Using DLSS while claiming to run at max graphics is hypocritical; if you are happy with DLSS, then you should be equally happy to notch other settings down.

No: max settings / max res. Running max settings with DLSS at almost equivalent image quality is much better than taking the framerate hit. Knocking settings down to reduce graphics/effects while keeping resolution at native? Nope, that's a worse deal.
 