FidelityFX Super Resolution in 2021

Why care about how much uplift DLSS gives? Those of us on AMD can't use DLSS, but we can use FSR, and that's what this topic is about. If you want to discuss Nvidia stuff, the Nvidia threads are the place for that.
 
Never said you do.

You know, you may be right. I think Ultra Quality FSR renders from higher than 1440p at a 4K output, so if DLSS Quality renders from 1440p it should be compared with Quality FSR (at least most of the time; I think Quality FSR upscales from roughly 1440p). So the average performance gains should be those of FSR Quality.

Yeah, I checked now: Ultra Quality FSR uses a 1.3x scale factor and Quality FSR 1.5x. Pretty aggressive tbh, but still decent. :D
But I think you are wrong; the perf uplifts are pretty much the same when upscaling from the same resolution.
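For reference, those scale factors are per-axis divisors of the output resolution. A minimal sketch of what each preset renders at for a 4K output (only 1.3x and 1.5x come from this thread; the 1.7x Balanced and 2.0x Performance factors are AMD's published FSR presets):

```python
# FSR quality presets as per-axis scale factors (1.3x and 1.5x are
# from the discussion above; 1.7x/2.0x are AMD's published values).
FSR_PRESETS = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w, out_h, scale):
    """Input resolution FSR upscales from, given a per-axis factor."""
    return round(out_w / scale), round(out_h / scale)

for name, scale in FSR_PRESETS.items():
    w, h = render_resolution(3840, 2160, scale)
    print(f"{name:>13}: {w}x{h}")
```

So Quality FSR at 4K upscales from exactly 2560x1440, which is why it lines up with the DLSS Quality input resolution mentioned above.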
 
Another factor to consider in the performance gain: whilst these aim for the same outcome, i.e. retaining close-to-native quality with a performance uplift, they are still rather different in that DLSS is also essentially doing AA work (which is arguably superior to the majority of current AA methods/implementations). So if DLSS were purely upscaling and nothing else, I suspect its performance uplift would be even larger.
 
How can it be larger when both render at close to the initial resolution? You get 100 FPS at 1440p, and you get around 100 FPS at 4K upscaled from 1440p with either DLSS or FSR. Will it render 1440p faster if you enable DLSS? I guess not, otherwise it would do that at native res too. :)
Maybe you only get 98 or 99 with FSR, but they are very close.
 
Thing is though, even ignoring that FSR doesn't currently have an output quality comparable to DLSS Quality, as things stand you need a higher input resolution to match the same output resolution/quality as DLSS. Without some difference analysis it would only be a guess how they stack up on that front, albeit with experience a close enough guess. It is easier just to look at the best efforts of each in a more general light, and that isn't going to magically change if/when games support both.

FSR also does a fair amount of anti-aliasing, as it concentrates mostly on lines and filling in the missing pixels along edges rather than simply scaling up each pixel and blurring it with its neighbours. Where FSR can't match DLSS is detail within textures, and where DLSS can guess at missing information, which can make things like text, or odd shapes that get decimated by LOD at lower resolution, clearer. Where FSR is possibly better is in areas where DLSS can suffer from trails, ghosting or grain, but the trade-off there is blurring (not intentional; it just doesn't have the information to produce any other output). So it isn't a great situation either way, and which bothers people less is somewhat subjective.
 
Actually, I don't know what that video I linked shows; it doesn't show a 50% gain from DLSS Quality at all. Go to the actual DLSS part and it's the same FPS for every mode.

Hang on, that appears to be normal: DLSS changes nothing for a 3080 in Watch Dogs: Legion at 1440p.

So I'm not sure why Rroff name-dropped Watch Dogs for a 76-80% uplift. Even if we add in RT, it's nowhere near that kind of uplift.

More comparable data is still needed.

I just tried WD: L on my RTX 3080 at 4K, mostly ultra settings: DoF and motion blur off, RT Ultra, TAA, and shadows on High. This was a quick look at the FPS in a static position to get an idea of performance at each setting.

DLSS off: 28 FPS
DLSS Quality (highest setting): 40 FPS - ~43% uplift
DLSS Balanced: 46 FPS - ~64% uplift

I tend to play at DLSS Balanced as it keeps me within my 4K monitor's Freesync range. The uplift between DLSS and FSR seems broadly similar, but without testing both in the same game it is hard to quantify. Either way, I have yet to find a single game where DLSS 2.0 gave me anywhere near an 80% uplift at otherwise identical settings to DLSS off.

EDIT: Adding some numbers for 1440p. This is a different area of the game, but identical settings to what I tested earlier other than resolution; just for reference on the performance difference between DLSS off and on.

DLSS off: 47 FPS
DLSS Quality: 61 FPS - ~30% uplift
DLSS Balanced: 65 FPS - ~38% uplift

For some reason, at 1440p changing DLSS from Quality to Balanced had no significant effect on performance. But once again, nowhere near the 76-80% uplift for DLSS at 1440p claimed earlier.
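The uplift percentages quoted in these posts are just the relative FPS change; a quick sketch reproducing them from the numbers above (note the 1440p Quality figure works out to about 30%):

```python
def uplift(fps_off, fps_on):
    """Percentage FPS gain of upscaled rendering vs native."""
    return (fps_on - fps_off) / fps_off * 100

# 4K numbers from the post above
print(round(uplift(28, 40)))  # DLSS Quality  -> 43
print(round(uplift(28, 46)))  # DLSS Balanced -> 64
# 1440p numbers
print(round(uplift(47, 61)))  # DLSS Quality  -> 30
print(round(uplift(47, 65)))  # DLSS Balanced -> 38
```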
 
Thing is though, even ignoring that FSR doesn't currently have an output quality comparable to DLSS Quality, as things stand you need a higher input resolution to match the same output resolution/quality as DLSS. Without some difference analysis it would only be a guess how they stack up on that front, albeit with experience a close enough guess. It is easier just to look at the best efforts of each in a more general light, and that isn't going to magically change if/when games support both.
Better pray that games won't start to support both soon, especially esports games. Not even 3090 owners will use DLSS anymore. :D
 
I just tried WD: L on my RTX 3080 at 4K, mostly ultra settings: DoF and motion blur off, RT Ultra, TAA, and shadows on High. This was a quick look at the FPS in a static position to get an idea of performance at each setting.

DLSS off: 28 FPS
DLSS Quality (highest setting): 40 FPS - ~43% uplift
DLSS Balanced: 46 FPS - ~64% uplift

On Godfall it gets 34% on Ultra Quality and 53% on Quality. This is without RT and without any driver-side support; it will probably gain more if Nvidia puts out optimised drivers.
 
I would expect DLSS to be more mature and have a better uplift at this point. Just testing WD: L showed how far DLSS has come since release, and I was very impressed with the quality even at Balanced. But FSR is genuinely very impressive from my tests so far, and the only way to tell the difference between 4K native and 4K Ultra Quality FSR is by pixel peeping.

Both do a great job, with DLSS being the slightly better option overall, but FSR has the potential to be far more widely supported. I don't give a crap how either works; they just do, and both are far better than standard upscaling plus sharpening.
 
Don't make me actually round up and break down all the numbers because it won't look pretty...
You can do it, but the games you mentioned aren't equivalent. The DLSS games have much heavier RT effects and overall demands on a system. The best we can do now is raster vs raster in something like Godfall. That's why I said what you said makes sense only from a DLSS Performance vs FSR Ultra Quality perspective, but I didn't think you'd make the error of using RT-heavier titles with DLSS to draw a conclusion.

In Godfall (RT off) at 4K, on a 6800 XT, you get +42% performance with FSR Ultra Quality vs native.
In what I would call a similar title, also on UE4, Mortal Shell, you can find a +40% improvement on a 3080 with DLSS Quality at 4K (RT off).

Even granting you that DLSS Quality looks better than FSR Ultra Quality (which in reality is still de gustibus), that difference cannot amount to what you claimed yesterday, that DLSS will give you 2-3x more performance than FSR. At least not if we compare quality presets.
 
There are obviously more DLSS games out there than the ones I mentioned; I just gave a few examples. Trying to compare games exactly at this point runs into a lot of issues, but you can broadly compare across a range of titles.

When I said 2-3x more performance, I was talking about the uplift: if FSR gives a +28% increase over native, then DLSS would be somewhere around, say, +76%. That will vary a lot from title to title, GPU to GPU, resolution, etc., hence why I said to take a broad look at it and linked to FSR results showing a spread of scenarios to use as a rough baseline.
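To make the distinction concrete, using the illustrative +28%/+76% figures above and a hypothetical 100 FPS native baseline: a 2-3x ratio of uplifts is a much smaller ratio of absolute framerates.

```python
# "2-3x more performance" read as a ratio of uplifts, not of framerates.
fsr_uplift, dlss_uplift = 0.28, 0.76   # illustrative figures from the post

print(f"uplift ratio: {dlss_uplift / fsr_uplift:.1f}x")   # ~2.7x

base = 100  # hypothetical native FPS
fsr_fps = base * (1 + fsr_uplift)    # 128 FPS
dlss_fps = base * (1 + dlss_uplift)  # 176 FPS
print(f"framerate ratio: {dlss_fps / fsr_fps:.2f}x")      # ~1.38x
```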
 
I dunno about FSR, but with DLSS the uplift depends heavily on the base framerate.

So the lower the framerate at native, the higher the DLSS uplift will be.

I suspect FSR's uplift is more of a fixed level.
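One toy model (an assumption for illustration, not measured data) of why the uplift shrinks as the native framerate rises: treat frame time as a resolution-dependent GPU part plus a fixed part (CPU, post-processing, the upscaler's own cost). Upscaling only shrinks the resolution-dependent part, so the relative gain is biggest when that part dominates, i.e. at low native framerates.

```python
def fps(gpu_ms, fixed_ms, pixel_ratio=1.0):
    """Toy model: only the resolution-dependent GPU time scales with
    the rendered pixel count; the fixed per-frame cost does not."""
    return 1000.0 / (gpu_ms * pixel_ratio + fixed_ms)

QUALITY = (1 / 1.5) ** 2  # ~44% of the pixels for a 1.5x-per-axis preset

# Heavily GPU-bound scene (low native FPS): large relative gain
u_gpu_bound = fps(30, 5, QUALITY) / fps(30, 5) - 1
# Lightly loaded scene (high native FPS): much smaller relative gain
u_cpu_bound = fps(5, 5, QUALITY) / fps(5, 5) - 1
print(f"GPU-bound uplift: {u_gpu_bound:.0%}, "
      f"CPU-bound uplift: {u_cpu_bound:.0%}")
```

The 30/5 and 5/5 millisecond splits are made-up numbers; the point is only the direction of the effect.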
 
I have only tested Godfall so far, but here are the performance uplifts I saw.

Native vs Ultra Quality: +50%
Native vs Quality: +79%
Native vs Balanced: +95%
Native vs Performance: +104%

I would only ever use Ultra Quality or Quality, most likely Ultra Quality, as it's not easy to tell the difference vs native unless you stop, take screenshots, zoom in, and pixel peep looking for differences, which you don't do whilst playing. In motion there are zero issues like ghosting or trails on Ultra Quality or Quality.

I will test The Riftbreaker and Terminator at some point over the weekend.
 
Like others, you are focussing on the wrong part of my point. Though I agree that nVidia should be making more of an effort to make DLSS widely available (well, sort of; at the same time I'm not really a fan of DLSS).

Better pray that games won't start to support both soon, especially esports games. Not even 3090 owners will use DLSS anymore. :D

None of my friends use DLSS in Warzone; they all complain about latency.
 
If they put FSR in Battlefield 2042 there will be no reason to use DLSS. Most people don't care about the quality; they just want frames, so they turn most of the junk off and just raise textures.

RT? Who cares. Shadows? Nah. Film grain? Nah. Motion blur? Nah. Everything turned off.

Also, DLSS has some latency issues in first-person shooters, but we have no way of knowing yet whether FSR will have the same issues.
 
I used to play Quake 3 and Battlefield 2 and 4 ;)

Most of these resolution scaling systems are less than ideal if you want ultra-precise aiming in competitive situations, due to the lower input resolution.

That said, in both Quake 2 RTX and CP2077 I didn't find it too bad except under very specific circumstances, though things like long-range sniping can suffer more.
 
Yeah, for competitive gaming you definitely do not want any image processing, upscaling, reconstruction, etc.; you want a clean native image with nothing extra, for maximum clarity and lowest latency.
 
I have an open mind on either solution

But I do believe you could be right about the lower input resolution
 