NVidia DLDSR (Dynamic Super Res+AI) Examples/Thoughts

Thought it would be cool to discuss DLDSR experiences here in one thread.
All my images will be DSR 1.78x from 4K (5461x2880)

What are these?
DLSS - takes a target output resolution (say 4K), then renders at a % of that resolution & upscales. DLSS Performance would render at 1080p & upscale to 4K - Quality would be 1440p -> 4K.

DSR - does the opposite: renders at a % higher resolution & scales the image down for better visuals.
DLDSR - does DSR but uses AI/Tensor cores for the calculations, to make the image better with less FPS impact.
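The resolution arithmetic above can be sketched out. Note the DLSS per-axis ratios here are the commonly cited ones (an assumption, not something from this thread), and DSR/DLDSR factors apply to the pixel *count*, so each axis scales by the square root of the factor:

```python
# Rough sketch of the resolution maths above. DLSS per-axis ratios are
# the commonly cited ones - treat them as approximate.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def dlss_render_res(out_w, out_h, preset):
    """Resolution DLSS renders internally before upscaling to (out_w, out_h)."""
    s = DLSS_SCALE[preset]
    return round(out_w * s), round(out_h * s)

def dsr_render_res(native_w, native_h, factor):
    """DSR/DLDSR factors (1.78x, 2.25x) multiply the pixel COUNT,
    so each axis grows by sqrt(factor)."""
    s = factor ** 0.5
    return round(native_w * s), round(native_h * s)

print(dlss_render_res(3840, 2160, "Quality"))      # (2560, 1440) -> 1440p
print(dlss_render_res(3840, 2160, "Performance"))  # (1920, 1080) -> 1080p
print(dsr_render_res(2560, 1440, 2.25))            # (3840, 2160) -> 4K
```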


https://www.nvidia.com/en-us/geforc...r-resolution-prey-dldsr-ssrtgi-comparison-01/



Snowrunner - works flawlessly, no FPS drop (maybe 4fps max)
God of War - does not detect the higher res (maybe due to being borderless only)



Snowrunner (screenshot)
 
I’ve tried it in two games, Diablo 3 and Destiny 2. This link provides a comparison within D2: https://imgsli.com/OTA4MDE

In D3 there is a definite improvement to image clarity at 1440p with 2.25x DL - the edges are smoother and details a little sharper. No improvement to FPS as it’s capped at 141.

In Destiny 2, again at 1440p, using resolution scaling of 150% (4K) I get an average of 79fps in my area of testing; using 2.25x DL that drops to 74fps, but again the detail is a little sharper. It’s most noticeable in fine details like grates, tangled wires and the like, which would otherwise look somewhat pixelated in places.

Overall it seems to be a nice tech and something I’m sure we’ll see more of in future generations of cards. Something I’m curious about is if there’s more of a performance uplift with the 3xxx gen cards and their additional tensor cores?
 
I haven't noticed any difference - as if it's turned off... and yes, it's turned on correctly in the Control Panel.

Didn't do anything in Battlefield 2042, for example.
 
From what I've seen, the steps are:
  1. Go to the NVIDIA Control Panel, under 3D Settings > Manage 3D settings > DSR - Factors. This is set globally, not per game.
  2. Choose between 1.78x or 2.25x.
  3. After applying the change, you should now have access to a higher resolution option on the desktop and in game, e.g. on a 1440p screen using 2.25x, I now have access to 4K (3840 x 2160).
  4. You don't need to change your desktop resolution, just the game.
  5. Open the game and set it to the new resolution.
  6. Seems to work fine with DLSS.
I tried it out with Shadow of the Tomb Raider, using the in-game benchmark.
  • 1440p Native - 109 FPS
  • 1440p DLSS Quality - 140 FPS
  • 4K DLDSR - 58 FPS
  • 4K DLDSR + DLSS Quality - 85 FPS
The last config would be my pick.
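For what it's worth, the pixel counts behind those four configs line up with the FPS ordering (assuming DLSS Quality renders at 2/3 of the target per axis - an assumption, not something the benchmark reports):

```python
# Pixels actually shaded per frame in each benchmark config (approximate;
# the 2/3 per-axis DLSS Quality ratio is an assumption).
configs = {
    "1440p Native":            (2560, 1440),
    "1440p DLSS Quality":      (round(2560 * 2 / 3), round(1440 * 2 / 3)),
    "4K DLDSR":                (3840, 2160),
    "4K DLDSR + DLSS Quality": (2560, 1440),  # 2/3 per axis of 3840x2160
}
for name, (w, h) in configs.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} MP")
```

Interesting that, if the ratio assumption holds, the last combo shades the same pixel count as native 1440p yet lands at 85 vs 109 FPS - the gap presumably being the overhead of the DLSS and DLDSR passes themselves.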
 
I haven't noticed any difference - as if it's turned off... and yes, it's turned on correctly in the Control Panel.

Didn't do anything in Battlefield 2042, for example.
But 'Resolution Scale' within the Battlefield series improves image quality, just FPS takes a big hit. I can play BF4 @ 150% on 4K and it looks awesome (3080, mind you). Will give this new tech a try.
 
From what I've seen, the steps are:
  1. Go to the NVIDIA Control Panel, under 3D Settings > Manage 3D settings > DSR - Factors. This is set globally, not per game.
  2. Choose between 1.78x or 2.25x.
  3. After applying the change, you should now have access to a higher resolution option on the desktop and in game, e.g. on a 1440p screen using 2.25x, I now have access to 4K (3840 x 2160).
  4. You don't need to change your desktop resolution, just the game.
  5. Open the game and set it to the new resolution.
  6. Seems to work fine with DLSS.
I tried it out with Shadow of the Tomb Raider, using the in-game benchmark.
  • 1440p Native - 109 FPS
  • 1440p DLSS Quality - 140 FPS
  • 4K DLDSR - 58 FPS
  • 4K DLDSR + DLSS Quality - 85 FPS
The last config would be my pick.
Your last one is a weird combo, but cool - and your instructions are correct.

So to break this down for 4K (now ~5K due to DLDSR):

DLSS - you're saying render at 1080p/1440p & upscale to 4K
DLDSR - then at the same time render at 5K and downscale to 4K

My guess is it renders at 5K, then DLSS uses 5K as its base %, so renders around 1600p (a % of 5K) - hence you get 85fps.

So DLSS Quality would be around 1600p-1800p.
 
Your last one is a weird combo, but cool - and your instructions are correct.

So to break this down for 4K (now ~5K due to DLDSR):

DLSS - you're saying render at 1080p/1440p & upscale to 4K
DLDSR - then at the same time render at 5K and downscale to 4K

My guess is it renders at 5K, then DLSS uses 5K as its base %, so renders around 1600p (a % of 5K) - hence you get 85fps.

So DLSS Quality would be around 1600p-1800p.

Well since you're telling the game engine to render at 4K, DLSS should be working from the 4K image, and not the DLDSR 1440p image.

I have no idea what the actual order of events is. The combo of DLSS and DLDSR worked flawlessly for me - gameplay was smooth.
 
Well since you're telling the game engine to render at 4K, DLSS should be working from the 4K image, and not the DLDSR 1440p image.

I have no idea what the actual order of events is. The combo of DLSS and DLDSR worked flawlessly for me - gameplay was smooth.
So, worked it out:

DLDSR would be set at 5K (4K + DSR), so the game is rendering at 5K.
DLSS would then use a % of that 5K image (1600p-1800p) for its calculations (at 4K it's 1440p).
So you're using around 2890x1600 as the basis for all DLSS calculations.
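If that order of events is right (a guess - nothing official), the chain for a 4K native screen at 2.25x would look like:

```python
# Hypothetical DLDSR + DLSS pipeline order, as guessed above - not confirmed
# by Nvidia. dldsr_factor scales the pixel count; dlss_scale is per-axis.
def pipeline(native_w, native_h, dldsr_factor, dlss_scale):
    s = dldsr_factor ** 0.5
    target = (round(native_w * s), round(native_h * s))  # game's "output" res
    render = (round(target[0] * dlss_scale),             # DLSS internal res
              round(target[1] * dlss_scale))
    return render, target, (native_w, native_h)

render, target, native = pipeline(3840, 2160, 2.25, 2 / 3)
print(render, "->", target, "->", native)
# (3840, 2160) -> (5760, 3240) -> (3840, 2160)
```

A neat side effect under these assumed ratios: DLSS Quality's 2/3 per axis exactly cancels 2.25x DLDSR's 1.5x per axis, so the internal render lands back on the native resolution.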
 
Sorry to be the thick turd and ask this question.
DLDSR? What is this? How does it differ from the infamous DLSS?

DLSS takes a lower resolution image and uses AI to upscale it to your native resolution. DLDSR takes a higher resolution image and uses AI to downscale it to your native resolution.

One (DLDSR) has more pixels to work with and the other (DLSS) has fewer. DLSS helps you get more FPS; DLDSR is useful if you have GPU power to spare and want a better quality image.
 
Sorry to be the thick turd and ask this question.
DLDSR? What is this? How does it differ from the infamous DLSS?
DLSS - takes a target output resolution (say 4K), then renders at a % of that resolution & upscales. DLSS Performance would render at 1080p & upscale to 4K - Quality would be 1440p -> 4K.
DSR - does the opposite: renders at a % higher resolution & scales the image down for better visuals.

DLDSR - does DSR but uses AI/Tensor cores to make the image better with less FPS impact.
 
I can only imagine this is possible because DLDSR is actually a combo of both DLSS and DSR.

e.g. the 1620 > 1080 with 3fps loss probably started with a 720 upscale to 1620 first, then the supersample to 1080 afterward.
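The numbers in that guess are at least self-consistent - sqrt(2.25) = 1.5 per axis takes 1080p to 1620p, and 1620/720 is exactly 2.25:

```python
# Sanity check on the hypothesised 720 -> 1620 -> 1080 chain (speculation
# from the post above, not a confirmed pipeline; 2.25x per axis is also not
# a standard DLSS preset ratio).
assert 1080 * 2.25 ** 0.5 == 1620  # 1620p is the 2.25x DLDSR target for 1080p
assert 720 * 2.25 == 1620          # 720p scaled 2.25x per axis hits 1620p
print("numerically consistent")
```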
 
Works in Diablo 3 (can't really notice the diff even with output resolution 6144x3240 :p) and Alien Isolation - both are games that run at good FPS with such settings though :)

Also tried GoW, but since it's running borderless I have to set the desktop to the DSR resolution first to get it to work. I get smallish black bars on the left/right though, as if the aspect ratio isn't correct. With DLSS in game set to Off it shows Output Res as 6144x3240 with Render Res showing as 5760x3240. With DLSS set to Quality, Render Res changes to 3840x2160.
 
I thought that perhaps my 4K TV's annoying 4096x2160 resolution that shows up in Windows could be messing with the DSR factors, so I removed them using CRU; the 2.25x resolution now shows as 5760x3240 with no black bars - looks sharper too :p
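A guess at why the 4096x2160 mode caused the bars: 2.25x of it gives a wider-than-16:9 target, which a 16:9 panel would show pillarboxed.

```python
# 2.25x DSR targets from the two candidate base resolutions. The bars
# presumably came from the 6144x3240 target's ~1.90:1 aspect on a 16:9
# panel - my guess, not confirmed.
def dsr_res(w, h, factor):
    s = factor ** 0.5
    return round(w * s), round(h * s)

for base in [(4096, 2160), (3840, 2160)]:
    w, h = dsr_res(*base, 2.25)
    print(base, "->", (w, h), f"aspect {w / h:.2f}:1")
```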
 
Sorry to be the thick turd and ask this question.
DLDSR? What is this? How does it differ from the infamous DLSS?

This feature is for systems/games that have GPU horsepower to spare and can afford to render the image at a higher resolution than what your monitor supports (native). The GPU then uses that higher resolution render to improve the image quality on your monitor at native, without it actually costing as much as gaming at that high resolution.

So for example you can game on a 1440p monitor at 1440p using a 4K image (DLDSR 2.25x) to improve the native 1440p image, without performance dropping by the amount 4K would cause.

I've done this in Doom Eternal, and it looks pretty good and runs over 100fps!
 
So for example you can game on a 1440p monitor at 1440p using a 4K image (DLDSR 2.25x) to improve the native 1440p image, without performance dropping by the amount 4K would cause.
Actually it would drop performance by the same/similar amount, since as far as the game is concerned it's rendering at the 4K resolution. You can see this by taking screenshots, as they generally happen before the downscaling takes place and come out at the full 4K resolution. Nvidia's claim is that when the image is later downscaled to be sent to your monitor, the newer DLDSR will produce a better quality image than DSR with the same scaling factor. They claim that 2.25x DLDSR will look as good as 4x DSR, which is where the performance claims come from :)

Of course, if a game supports DLSS then it gets somewhat awkward. Is native 3840x2160 quality better or worse than DLSS Quality using a base 3840x2160 resolution, upscaled to the 2.25x 5760x3240 before downscaling back to 3840x2160 for output to the monitor? :p
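Put in pixel terms, that "2.25x looks like 4x" claim is where the performance saving comes from - you shade far fewer pixels per frame for (claimed) similar output quality:

```python
# Render cost (pixels per frame) of 2.25x vs 4x on a 1440p native screen.
# The "looks as good" part is Nvidia's claim; only the pixel counts are facts.
def dsr_pixels(native_w, native_h, factor):
    s = factor ** 0.5
    return round(native_w * s) * round(native_h * s)

print("2.25x DLDSR:", dsr_pixels(2560, 1440, 2.25))  # 3840x2160 = 8,294,400
print("4x DSR:     ", dsr_pixels(2560, 1440, 4))     # 5120x2880 = 14,745,600
```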
 