Does anyone know the difference between DLSS Quality, Balanced and Performance?
Don't care about the lowest one, Ultra Performance is it? Haven't even tried that one.
It's weird, but while actually playing it's hard for me to see the difference between DLSS Quality and Performance. I get close to 60fps by setting DLSS to Performance, whereas on Quality it's in the high 40s but dips into the 30s in the most demanding areas.
I haven't done a proper test, so I'm wondering what others think. By the way, I'm on 4K so that might be different to the experience people at lower resolutions have. But I'm surprised by how well DLSS Performance does at 4K relative to DLSS Quality.
Basically DLSS renders the game world at a lower resolution than your target output resolution and then upscales the result back up to your native resolution. The different DLSS levels set the ratio/multiplier between the internal render resolution and your target resolution. In order from worst quality to best they are:
Ultra Performance (3x per axis, 9x the area)
Performance (2x per axis, 4x the area)
Balanced (~1.72x per axis, ~3x the area)
Quality (1.5x per axis, 2.25x the area)
The multipliers apply to the linear pixel counts; square them to get the difference in total pixel area. So for example at 4K (2160p) the render resolutions look like this (there's a quick sketch of the arithmetic after the list):
Ultra Performance 720p
Performance 1080p
Balanced 1252p
Quality 1440p
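If it helps to see the arithmetic, here's a minimal Python sketch of the calculation above. The per-axis scale factors (roughly 67%, 58%, 50% and 33%) are the commonly cited DLSS values rather than anything pulled from a game, and real titles may round the final numbers slightly differently:

```python
# Approximate per-axis render scale for each DLSS mode (commonly cited values).
DLSS_SCALE = {
    "Quality": 2 / 3,             # ~67% per axis, 2.25x area
    "Balanced": 0.58,             # ~58% per axis, ~3x area
    "Performance": 0.5,           # 50% per axis, 4x area
    "Ultra Performance": 1 / 3,   # ~33% per axis, 9x area
}

def render_resolution(target_w: int, target_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    scale = DLSS_SCALE[mode]
    return int(target_w * scale), int(target_h * scale)

for mode in DLSS_SCALE:
    w, h = render_resolution(3840, 2160, mode)  # 4K (2160p) target
    print(f"{mode}: {w}x{h}")
```

At a 4K target this prints 2560x1440, 2227x1252, 1920x1080 and 1280x720, matching the list above.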
DLSS 2.0 does a good job of reconstructing images, but the relationship between performance and quality does seem to be non-linear. Subjectively, the higher the output/target resolution, the more aggressive you can be with the scaling without losing too much image quality. I think what's happening is there's a realistic minimum resolution you can upscale from and still get a good output, and that floor seems to be around 1080p.

The implication is that the sweet spot depends on your target resolution, and you see that reflected by Nvidia and developers. For example, Ultra Performance mode, with its massive 9x difference in pixels, was added explicitly to target 8K gamers, because even with such aggressive scaling the rendered resolution is still 1440p. The CoD Cold War DLSS settings actually recommend exactly this: Ultra Performance for 8K, Performance for 4K, Quality for 2K. And Cyberpunk looks like it's doing something very similar: the "auto" setting appears to use Performance at 4K, Balanced at 1440p and Quality at 1080p.
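To make the floor idea concrete, here's a small Python sketch of that heuristic: pick the most aggressive mode whose internal render resolution still clears roughly 1080p. The 1080p floor and the selection logic are just my reading of the behaviour above, not how Nvidia or the games actually implement their auto settings:

```python
# Ordered from most to least aggressive (per-axis render scale).
DLSS_MODES = [
    ("Ultra Performance", 1 / 3),
    ("Performance", 0.5),
    ("Balanced", 0.58),
    ("Quality", 2 / 3),
]

FLOOR_P = 1080  # assumed minimum vertical render resolution for a good result

def auto_mode(target_height: int) -> str:
    """Pick the most aggressive DLSS mode that still renders at or above FLOOR_P."""
    for name, scale in DLSS_MODES:
        if target_height * scale >= FLOOR_P:
            return name
    return "Quality"  # below the floor everywhere, fall back to the least aggressive mode

for height in (4320, 2160, 1440, 1080):
    print(f"{height}p target -> {auto_mode(height)}")
```

This reproduces the CoD Cold War recommendations (Ultra Performance at 8K, Performance at 4K, Quality below that); Cyberpunk's auto setting appears to be slightly more aggressive at 1440p, picking Balanced instead.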
I've been toying with different settings for Cyberpunk as I've been playing through, and my conclusion so far is that 4K gaming on a 3080 is best with the Ultra RT preset and DLSS in Performance mode. The bottom line is that DLSS Performance will not produce an output as good as or better than native 4K; you can get that with DLSS Quality for sure, but the frame rates are too low. I'm really trying to keep the game above 45fps at all times, which DLSS Performance manages. While DLSS Performance at 4K isn't quite as good as native 4K, it's definitely close enough that it's a worthwhile trade-off to enable ray tracing, and the game is really stunning.