
NVIDIA announces DLDSR (Deep Learning Dynamic Super Resolution), coming to GeForce drivers on Jan 14

Soldato
Joined
7 Dec 2010
Posts
6,136
Location
Leeds

https://videocardz.com/newz/nvidia-...ion-coming-to-geforce-drivers-on-january-14th

NVIDIA responds to Radeon Super Resolution upscaling with Deep Learning Dynamic Super Resolution downscaling tech
At CES 2022 AMD announced it is releasing a new feature based on FidelityFX Super Resolution called RSR (Radeon Super Resolution). The company confirmed that this technology will arrive with the new drivers somewhere in the first quarter. AMD FSR is a full-screen upscaling technology that does not rely on software implementation by game developers.

Meanwhile, NVIDIA made a surprising announcement today of Deep Learning Dynamic Super Resolution (DLDSR), an AI-powered version of the Dynamic Super Resolution (DSR) downscaling technology that has been available in GeForce drivers for a while now. DLDSR improves on DSR by adding an AI layer powered by NVIDIA Tensor cores, which also means the feature will be exclusive to NVIDIA RTX GPUs.

https://cdn.videocardz.com/1/2022/0...formance-image-quality-comparison-768x432.jpg



Advanced Freestyle Filters
Our January 14th Game Ready Driver updates the NVIDIA DSR feature with AI. DLDSR (Deep Learning Dynamic Super Resolution) renders a game at higher, more detailed resolution before intelligently shrinking the result back down to the resolution of your monitor. This downsampling method improves image quality by enhancing detail, smoothing edges, and reducing shimmering.

DLDSR improves upon DSR by adding an AI network that requires fewer input pixels, making the image quality of DLDSR 2.25X comparable to that of DSR 4X, but with higher performance. DLDSR works in most games on GeForce RTX GPUs, thanks to their Tensor Cores.


https://cdn.videocardz.com/1/2022/01/nvidia-control-panel-dldsr-configuration-1.png



Furthermore, NVIDIA has teamed up with the author of the popular ReShade tool and will be adding new custom ReShade Freestyle filters to its GeForce Experience software:

  • SSRTGI (Screen Space Ray Traced Global Illumination), commonly known as the “Ray Tracing ReShade Filter” enhances lighting and shadows of your favorite titles to create a greater sense of depth and realism.
  • SSAO (Screen Space Ambient Occlusion) emphasizes the appearance of shadows near the intersections of 3D objects, especially within dimly lit/indoor environments.
  • Dynamic DOF (Depth of Field) applies bokeh-style blur based on the proximity of objects within the scene, giving your game a more cinematic, suspenseful feel.
    With DLDSR and SSRTGI combined, you can enjoy a remastered experience in classic games like Prey.
Both features will be available in the upcoming drivers that will be released on January 14th.
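For reference, the DSR/DLDSR factors shown in the control panel multiply the total pixel count rather than the edge lengths, which is why 2.25x on a 1440p monitor lands on a 4K-class render resolution. A quick sketch (the helper function is illustrative, not an NVIDIA API):

```python
import math

def render_resolution(width, height, factor):
    """Scale each axis by sqrt(factor), since DSR/DLDSR factors
    multiply the total pixel count, not the width and height."""
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

# DLDSR 2.25x on a 1440p monitor renders at a 4K-class resolution
print(render_resolution(2560, 1440, 2.25))   # (3840, 2160)

# Classic DSR 4x doubles each axis
print(render_resolution(2560, 1440, 4.0))    # (5120, 2880)
```

This is why NVIDIA can claim DLDSR 2.25x image quality comparable to DSR 4x: the AI filter extracts more from roughly half as many rendered pixels per axis.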
 
Soldato
Joined
19 Oct 2004
Posts
3,662
Location
London
Interesting - so I can run games at 4K on my 1440p screen with no performance loss. I wonder if it will be all smeary, though? I think I'd still rather just have DLAA... though I suppose this is kind of it, isn't it?
 
Associate
Joined
8 Sep 2020
Posts
482
I am actually looking forward to this, as I use DSR to run both Dark Souls 3 & Sekiro at 6K (5760 x 3240) and notice an improvement in picture quality from that alone, so any extra this offers will be a benefit :D
 
Associate
Joined
8 Oct 2020
Posts
1,154
For the lay-person, what's the point of this? What does it do and should I care about it?

Say you game at 1440p.

DLSS renders your game at 1080p, which is obviously much easier on your GPU, and uses AI to "add more pixels" so that it becomes a 1440p image. Under perfect conditions, you get a 1440p image at the cost of rendering a 1080p image (plus some overhead for the AI component).

This new release (DLDSR) renders the game at a higher-than-native resolution (e.g. 4K), which means it has more pixels to play with - the AI then figures out how to condense that into your 1440p image, because that's what your screen can display. In this scenario, you have a beefy GPU that can handle higher resolutions, and you're using that extra power to make your native image look better.
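The pixel budgets make the contrast concrete. The numbers below are illustrative: DLSS Quality mode is commonly described as rendering at roughly two-thirds of the target resolution per axis, and the 2.25x figure is the DLDSR factor from the announcement:

```python
# Rough pixel-budget comparison for a 1440p monitor.
# Assumption: DLSS Quality renders at ~2/3 of target per axis;
# DLDSR 2.25x multiplies the total pixel count by 2.25.
native = 2560 * 1440

# Upscaling: render low, let the AI infer up to native
dlss_quality = round(2560 * 2 / 3) * round(1440 * 2 / 3)

# Downscaling: render high, let the AI shrink down to native
dldsr_225 = int(native * 2.25)

print(f"native:       {native:>10,}")
print(f"DLSS Quality: {dlss_quality:>10,}  ({dlss_quality / native:.0%} of native)")
print(f"DLDSR 2.25x:  {dldsr_225:>10,}  ({dldsr_225 / native:.0%} of native)")
```

So DLSS trades pixels for frame rate, while DLDSR spends spare GPU headroom on extra pixels for image quality - opposite directions, same AI toolbox.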
 
Soldato
Joined
29 Aug 2006
Posts
3,809
Location
In a world of my own
Say you game at 1440p.

DLSS renders your game at 1080p, which is obviously much easier on your GPU, and uses AI to "add more pixels" so that it becomes a 1440p image. Under perfect conditions, you get a 1440p image at the cost of rendering a 1080p image (plus some overhead for the AI component).

This new release (DLDSR) renders the game at a higher-than-native resolution (e.g. 4K), which means it has more pixels to play with - the AI then figures out how to condense that into your 1440p image, because that's what your screen can display. In this scenario, you have a beefy GPU that can handle higher resolutions, and you're using that extra power to make your native image look better.

I game at 4k with a 3080ti and use DLSS whenever I can. Will this be better?
 
Associate
Joined
8 Oct 2020
Posts
1,154
I game at 4k with a 3080ti and use DLSS whenever I can. Will this be better?

It will be better for quality because it has more data to work from, but worse for FPS as your card will need to render above 4K. DLSS renders below 4K and then figures out how to fill the gaps, which means it needs less GPU power.
 
Associate
Joined
8 Sep 2020
Posts
482
I game at 4k with a 3080ti and use DLSS whenever I can. Will this be better?

It will look better, yes, but it will not produce the same fps as you get with DLSS. You will get the same fps as if you were running 4K native, but the game will be rendered at a higher resolution via DSR (basically supersampling).

You can enable DSR on its own now, but it comes at a performance hit as your GPU will be rendering the game at a much higher resolution. If you have either Dark Souls 3 or Sekiro, you can run them both at 6K 60fps with no hit to performance on a 3090 (a 3080 Ti will be pretty much identical) - just set DSR in the Nvidia control panel :)
 
Soldato
Joined
19 Oct 2004
Posts
3,662
Location
London
I game at 4k with a 3080ti and use DLSS whenever I can. Will this be better?

By the looks of it you can get better IQ for no loss of frames, but it won't get you higher frames than native. What I'd really like is a boost of native FPS with the same IQ. Hopefully the 1.78x option will deliver that.
 
Associate
Joined
10 Aug 2010
Posts
134
Say you game at 1440p.

DLSS renders your game at 1080p, which is obviously much easier on your GPU, and uses AI to "add more pixels" so that it becomes a 1440p image. Under perfect conditions, you get a 1440p image at the cost of rendering a 1080p image (plus some overhead for the AI component).

This new release (DLDSR) renders the game at a higher-than-native resolution (e.g. 4K), which means it has more pixels to play with - the AI then figures out how to condense that into your 1440p image, because that's what your screen can display. In this scenario, you have a beefy GPU that can handle higher resolutions, and you're using that extra power to make your native image look better.

So a 1440p image that is scaled down from 4k by AI looks much better than a native 1440p image on a 1440p screen?
 
Soldato
Joined
15 Oct 2019
Posts
8,460
Location
Uk
That's the idea, 4k image on a 1440p screen without it costing 4k performance, and looking better than native.
Will the fps performance land somewhere between 4K and 1440p or will this work without reducing the fps at all?
 
Associate
Joined
8 Oct 2020
Posts
1,154
So a 1440p image that is scaled down from 4k by AI looks much better than a native 1440p image on a 1440p screen?

Yeah - the same image at 4K has more pixels than at 1440p, so the AI has more context for what each pixel should look like at 1440p.

It’s going to be pretty niche though.
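DLDSR's actual filter is a proprietary AI network, but even a naive box-filter downscale shows why the extra source pixels help: each output pixel is derived from several rendered pixels rather than one, so hard edges get intermediate shades instead of jagged steps. A toy sketch:

```python
def box_downscale_2x(pixels):
    """Naive 2x2 box-filter downscale of a grayscale image given as a
    list of rows. DLDSR's real filter is an AI network, not a box
    filter - this only illustrates that each output pixel draws on
    multiple rendered pixels."""
    out = []
    for y in range(0, len(pixels), 2):
        row = []
        for x in range(0, len(pixels[y]), 2):
            block_sum = (pixels[y][x] + pixels[y][x + 1] +
                         pixels[y + 1][x] + pixels[y + 1][x + 1])
            row.append(block_sum // 4)  # average of the 2x2 block
        out.append(row)
    return out

# A hard 4x4 black/white edge becomes a 2x2 result with an
# intermediate grey where the edge cut through a block
image = [
    [0,   0, 255, 255],
    [0,   0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
print(box_downscale_2x(image))   # [[0, 255], [127, 255]]
```

That intermediate 127 is the smoothing effect: the edge's position within the block survives into the smaller image, which is the anti-aliasing-like benefit people see from DSR.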
 
Associate
Joined
1 Jan 2003
Posts
2,401
Location
Derbyshire
I've played with DSR, and it's pretty cool running 4K on a 1440p screen, but it costs performance, so you need plenty of headroom, and it's not as sharp as native - but it is more detailed, if that makes sense.

Fingers crossed this bridges the gap and can be something great.
 
Soldato
Joined
15 Oct 2019
Posts
8,460
Location
Uk
Still waiting for RTX IO which was marketed at the 3000 series reveal yet not heard any news on it since.
 
Man of Honour
Joined
13 Oct 2006
Posts
82,513
The main gains will be in things like edges (without relying on anti-aliasing techniques), text, and areas that have anisotropic filtering issues at certain angles. There is more data to play with, so edges can be smoothed without blurring them, text edges gain detail, and so on, which makes for a nicer image than native rendering, at a performance cost. It looks like a lot of that can be achieved by rendering only slightly above native resolution, without necessarily taking the performance hit of 4x DSR. It can also reproduce a passable imitation of 4K on a 1440p screen (or other combinations, e.g. 1440p on a 1080p screen) if you don't mind the performance hit - though it isn't the same as having true 4K.
 
Associate
Joined
8 Oct 2020
Posts
1,154
Still waiting for RTX IO which was marketed at the 3000 series reveal yet not heard any news on it since.

Does that not rely on the Microsoft implementation (DirectStorage)? But yes, excited to see it and how it affects VRAM.
 