NVIDIA announces DLDSR (Deep Learning Dynamic Super Resolution), coming to GeForce drivers on Jan 14

Soldato
OP
Joined
7 Dec 2010
Posts
8,249
Location
Leeds
How are the DLDSR resolutions shown in Windows worked out?

On a 4K monitor, the new DLDSR resolutions that are enabled for me are:
5120 x 2880
5760 x 3240

Mine runs like crap in The Witcher 3 on an RTX 3070, btw.

It seems like we need some more options to scale at a lower multiplier. Couldn't Nvidia add some options to enable DLDSR resolutions below the desktop resolution, or just above the desktop resolution?
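For what it's worth, the resolutions the driver adds fall out of simple arithmetic: the DLDSR factors are total-pixel-count multipliers, so each axis scales by the square root of the factor. A quick sketch (reading "1.78x" as exactly 16/9 and "2.25x" as 9/4 is my assumption, but it matches the listed resolutions):

```python
# DLDSR factors multiply the total pixel count, so each axis scales by
# the square root of the factor: "1.78x" = 16/9 -> 4/3 per axis,
# "2.25x" = 9/4 -> 3/2 per axis.
AXIS_SCALES = {"1.78x": (4, 3), "2.25x": (3, 2)}

def dldsr_resolution(native_w, native_h, factor):
    """Resolution DLDSR renders at for a given factor and native res."""
    num, den = AXIS_SCALES[factor]
    return native_w * num // den, native_h * num // den

for factor in ("1.78x", "2.25x"):
    w, h = dldsr_resolution(3840, 2160, factor)
    print(f"{factor}: {w} x {h}")
```

On a 4K panel that gives 5120 x 2880 and 5760 x 3240, matching the list above.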


You do realise at those settings you are hitting VRAM limits, and that's why you are getting such terrible performance? 8GB of VRAM is not enough at:

5120 x 2880
5760 x 3240


There would be no point if your native resolution is 4K; the only way to get better quality with that feature is to go higher than 4K and then downscale back to 4K.

What you want to use is DLSS. DSR can also impact your VRAM, as it again renders at a higher resolution and then downscales to your native one.

https://www.nvidia.com/en-gb/geforce/technologies/dsr/technology/

DLDSR does the same as DSR: it renders at a higher resolution (which will use more VRAM) and scales it back down with better tech than DSR. It still hurts VRAM if you don't have enough and can degrade performance; it really works best with a more powerful card at 4K.
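To put rough numbers on the VRAM point: render-target memory scales linearly with pixel count, so a 2.25x factor means 2.25x the framebuffer cost before textures and geometry are even counted. A back-of-envelope sketch (the 16 bytes-per-pixel figure is purely illustrative, standing in for HDR colour + depth + a couple of G-buffer attachments):

```python
# Back-of-envelope only: real engines allocate many buffers, but each one
# scales with width * height, which is what makes DLDSR expensive.
def rendertarget_mb(width, height, bytes_per_pixel=16):
    """Rough size in MiB of one full-resolution render-target set."""
    return width * height * bytes_per_pixel / (1024 ** 2)

for w, h in [(3840, 2160), (5120, 2880), (5760, 3240)]:
    print(f"{w} x {h}: ~{rendertarget_mb(w, h):.0f} MiB per target set")
```

The 5760 x 3240 figure comes out exactly 2.25x the native 4K one, which is why an 8GB card that is already near its limit at 4K falls over with DLDSR on.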

https://www.nvidia.com/en-us/geforc...&ranSiteID=kXQk6.ivFEQ-hnyfgmSIWK4_d8.gXJJP8g



What you could do is run your screen at 1440p, then have the game render at 4K and downscale to 1440p, for better performance and better-than-native-1440p image quality.
 
Last edited:
Caporegime
Joined
4 Jun 2009
Posts
31,039
Tried this for rdr 2 and wow, this is a killer feature! :cool: Especially when combined with dlss as you're able to render at a higher res. now.

PS.

I read somewhere that you won't see the true end result of DLDSR with screenshots taken outside of geforce experience because of the way the tensor cores/ai works, no idea if true or not???
 
Soldato
Joined
30 Jun 2019
Posts
7,875
Yup, I'm familiar with DSR.

Perhaps I was expecting too much of this tech. I'll probably just stick with DLSS where it's an option.

DLDSR did clear up the aliasing in The Witcher 3, but ruined performance. I don't know if it was related to a VRAM bottleneck or not though.

Nvidia could do with an upscaling tech like DLSS that can be applied to all games (assuming they are reasonably modern).
 
Last edited:
Soldato
OP
Joined
7 Dec 2010
Posts
8,249
Location
Leeds
Tried this for rdr 2 and wow, this is a killer feature! :cool: Especially when combined with dlss as you're able to render at a higher res. now.

PS.

I read somewhere that you won't see the true end result of DLDSR with screenshots taken outside of geforce experience because of the way the tensor cores/ai works, no idea if true or not???


Screenshots are taken from the screen buffer/frame buffer, from what I understand, so whatever is on screen should be the same in the screenshot, unless there is some strange layering that causes some screencap tools to show a black area where the screen shows something.


Maybe one for @Rroff .
 
Caporegime
Joined
4 Jun 2009
Posts
31,039
Yup, I'm familiar with DSR.

Perhaps I was expecting too much of this tech. I'll probably just stick with DLSS where it's an option.

DLDSR did clear up the aliasing in The Witcher 3, but ruined performance. I don't know if it was VRAM-related or not though.

Nvidia could do with an upscaling tech like DLSS that can be applied to all games (assuming they are reasonably modern).

An easy way to tell if it's VRAM is to look at frame latency, the 0.1%/1% lows, and dedicated + allocated VRAM usage using MSI Afterburner + RivaTuner. If frame latency is all over the place and/or fps dips low, it's most likely VRAM.
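A sketch of that check in code (assumption: you've exported per-frame frametimes in milliseconds, e.g. from RivaTuner/CapFrameX logging; the helper name is mine). The "1% low" style metric averages the slowest 1% of frames, so VRAM-thrashing spikes drag it down hard even when the overall average looks fine:

```python
# Compute an "x% low" FPS figure from a list of frametimes in milliseconds:
# average the slowest pct% of frames, then convert that average back to FPS.
def percentile_lows(frametimes_ms, pct):
    """Average FPS of the slowest pct% of frames (the "1% low" metric)."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, round(len(worst) * pct / 100))
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Mostly 60 fps frames with a handful of VRAM-style 50 ms spikes:
frametimes = [16.7] * 990 + [50.0] * 10
print(f"avg fps: {1000 * len(frametimes) / sum(frametimes):.1f}")
print(f"1% low:  {percentile_lows(frametimes, 1):.1f} fps")
```

A large gap between the average and the 1% low is the "frame latency all over the place" signature described above.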

Screenshots are taken from the screen buffer/frame buffer, from what I understand, so whatever is on screen should be the same in the screenshot, unless there is some strange layering that causes some screencap tools to show a black area where the screen shows something.


Maybe one for @Rroff .

Yeah not sure myself tbh, just read it here:

https://www.reddit.com/r/nvidia/comments/s4af9d/some_useful_information_about_dldsr_that_people/

  1. (Edit: This piece of information may be irrelevant if you're using GeForce Experience for screenshots.) If you take a screenshot of the game with DSR or DLDSR enabled to share it with the internet, the screenshot will be at the higher resolution you selected WITHOUT the processing of DSR or DLDSR applied. As such, if you take a screenshot comparing DSR 2.25x to DLDSR 2.25x and upload it for others to see, they will look identical as the processing of DSR and DLDSR have NOT been applied yet. As such if you take screenshots to compare DSR 4x to DLDSR 2.25x (Like Nvidia did), then most people will say that "DSR 4x looks better" because the DSR 4x image has more pixels than the 2.25x image and the extra "AI magic" in DLDSR hasn't been applied to the image yet.
 
Soldato
Joined
30 Jun 2019
Posts
7,875
What you could do is run your screen at 1440p, then have the game render at 4K and downscale to 1440p, for better performance and better-than-native-1440p image quality.

Not sure the driver allows this yet for DLDSR. You would have to remove the resolutions above 1440p on a 4K monitor for that to work, but this is basically what I was getting at.

Alternatively, just like standard DSR, Nvidia could just add a 1.2x multiplier setting for DLDSR (and maybe even just replace the DSR option). This would be worth enabling if it reduced aliasing significantly.
 
Last edited:
Associate
Joined
8 Oct 2020
Posts
2,326
Tried it out on RDR2, which I’ve found to be a bit blurry at native with TAA. Even at 1.78x it’s a big improvement in quality.
 
Soldato
Joined
1 Jan 2003
Posts
2,968
Location
Derbyshire
Tried it out on RDR2, which I’ve found to be a bit blurry at native with TAA. Even at 1.78x it’s a big improvement in quality.

I tried RDR2 with 2.25 on a 1440p monitor and couldn't believe how fantastic it looked. Fps took a hit, although every in game setting was maxed :D
Dropping tree tessellation got the Fps back up.
 
Associate
Joined
8 Oct 2020
Posts
2,326
I tried RDR2 with 2.25 on a 1440p monitor and couldn't believe how fantastic it looked. Fps took a hit, although every in game setting was maxed :D
Dropping tree tessellation got the Fps back up.

Haha yeah, it does hurt FPS, but it's a noticeable upgrade in quality. BTW, you can still enable DLSS on top of it and get some FPS back without losing quality.
 
Soldato
Joined
30 Jun 2019
Posts
7,875
I wonder who Nvidia is aiming at with DLDSR; it seems like it's only suitable for high-end cards and doesn't help plebs with RTX 3060 Tis/RTX 3070s, who struggle with 4K.

Couldn't they do a version of it designed to upscale 1440p to 4K?
 
Associate
Joined
23 Oct 2019
Posts
484
I wonder who Nvidia is aiming at with DLDSR; it seems like it's only suitable for high-end cards and doesn't help plebs with RTX 3060 Tis/RTX 3070s, who struggle with 4K.

Couldn't they do a version of it designed to upscale 1440p to 4K?

You literally described DLSS....
 
Soldato
Joined
30 Jun 2019
Posts
7,875
You literally described DLSS....
Yup, but DLSS has to be supported by each game. Tons of games lack DLSS support and always will.

DLDSR is quite limited because, at the moment, it only adds resolutions above the native resolution of the display, which isn't great for people with a 4K monitor (unless they have a high-end card like an RTX 3080 or 3090).

AMD has Radeon Super Resolution that works in all games, because it's applied through the Control Panel.

What's clever about that implementation is that all that's required is lowering the in-game resolution to the desired input level, and the driver will automatically upscale it to native resolution.

It does require games to be run in exclusive full screen mode though...
 
Last edited: