
FidelityFX Super Resolution in 2021

wish we had some comparisons

Did DLSS 1.0 actually use sharpening? Most of the magic of FSR seems to come from an intelligent use of sharpening. Heck, this is even the case for many DLSS 2+ titles out there.

When you get an image that has no detail at all (all lost in upscaling), there's nothing to sharpen. That was the issue with DLSS 1: a lot of early games (especially BF V) had such bad quality with DLSS 1 because it just completely removed texture detail, especially a bit further away from the camera. FSR doesn't do that, hence sharpening helps to bring the detail back.
 
Plus the fact that Nvidia did it very badly and lazily, it seems. AMD's solution looks much superior to DLSS 1.x.

From an objective standpoint FSR can never match DLSS 1 because it just has no way to recreate the lost details. AMD have a patent to add AI to FSR.

DLSS 1 is the least of FSR's problems. DLSS 2.2 is so good it's like voodoo witchcraft. So going forward we will get subjective arguments for FSR in the media, because any objective argument will show that image quality suffers at every FSR quality setting. Also, FSR upscaling from a 1080p internal resolution to 4K (performance mode) will show just how poor FSR is compared even to DLSS 1.x. It will have a nasty, blurred output image at 4K that shows the internal resolution, not the clear, sharp, near-native image that DLSS performance mode at 4K provides. It's not that DLSS 2.2 is better than FSR; it's that it's very easy to see how much better it is.

Basically, at ultra quality you have an internal resolution of 1662p, and at quality 1440p. An upscale from these resolutions would be decent but still blurred. The issue with spatial upscaling is that you can't hide the internal resolution. DLSS with a 1080p internal resolution at a 4K output is what's used in most games; that would be performance mode in FSR.
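For anyone wanting to check the numbers: FSR 1.0's quality modes use fixed per-axis scale factors (1.3x ultra quality, 1.5x quality, 1.7x balanced, 2.0x performance), so the internal resolutions above fall straight out of a division. A quick sketch in Python:

```python
# Per-axis scale factors for FSR 1.0's quality modes.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w, out_h, mode):
    """Internal render resolution FSR upscales from, for a given output size."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

# At a 4K (3840x2160) output:
for mode in FSR_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode}: {w}x{h}")
```

At a 4K output this gives 1662p for ultra quality, 1440p for quality and 1080p for performance, matching the figures above.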

Now compare 1080p upscaled to 4K with FSR against DLSS 1 and you will see the problem. Both have the same issues, but DLSS is better: it hides the lower internal resolution better and provides better image quality when scaling from 1080p to 4K. DLSS 2.2 upscales from 1080p to 4K in performance mode and looks close to native; FSR from 1080p to 4K looks like complete garbage.


This video shows that at every quality setting FSR destroys detail and becomes blurred. By the time you hit performance mode the image is nasty. The higher quality settings, like ultra quality and quality, have high enough starting resolutions to hide the low quality of the upscaling method.

AMD FidelityFX Super Resolution FSR Review: Big FPS Boosts, But Image Quality Takes A Hit - Digital Foundry.


With FSR you always lose more quality compared to DLSS 2.2 and native.
 
Fair enough, just thought I'd check. :)

I posted my comparison in Godfall using true native vs Ultra Quality, and when playing I can't really notice a difference unless I take screenshots and start pixel peeping at 4K.

If pixel peeping, I could see a very slight reduction in image quality, a softening of the image. There were no artifacts or issues with movement when playing either, other than a 50% increase in FPS.

A worthwhile trade for me given the graphical demands of Godfall max settings at 4K, but I can understand that others may see it differently.

FSR Native v Ultra Quality - Imgsli

Maybe it's worth mentioning that I do play in HDR and that makes a big difference, and I also sit close to a 55" 4K TV as my main display. I find that SDR looks flatter in general and changes the perspective on things. Now that I've read your post again I'm starting to think that might also play a big factor.
 
Interesting: the screenshots were taken with HDR off. However, prior to taking those I was using HDR and still didn’t notice a significant difference. I’ll recheck and see, though.
 
Basically, AMD hardware is less powerful and less able to deal with more divergent rays. At this point the RT performance of a GPU is more important than its raster performance. If you look at Unreal Engine 5, most of the raster processing is done by the engine in software because no GPU hardware can match that performance, so fast raster performance on the GPU matters less. If you turn off RT, then the raster performance of the GPU is very important. Nvidia cards can provide better DXR performance at higher quality settings and achieve higher frame rates; in every game Nvidia is faster. AMD designed a card that is faster at current games in raster but left DXR performance low. Nvidia gives good raster performance and far faster DXR performance.


This is why DLSS exists: to make DXR games able to run at high resolutions. DLSS 1 used spatial upscaling, an AI network and sharpening, which is better for image quality than just spatial upscaling and sharpening alone. The media attacked DLSS 1's image quality: we had zoomed-in images and pixel-by-pixel criticism of every deviation from native. So to get the image quality, Nvidia moved to temporal upscaling in DLSS 2.x. Temporal upscaling was the breakthrough for image quality in DLSS 2.x.
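To make the spatial vs temporal distinction concrete: a spatial upscaler computes every output pixel from the current frame alone, so detail that was never rendered cannot be recovered. The sketch below uses plain bilinear filtering for simplicity; FSR's real EASU pass is an edge-adaptive kernel, and DLSS 2.x additionally reuses samples accumulated across previous frames.

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    """Spatial upscaling: each output pixel is interpolated from the
    current (grayscale) frame only; no information from other frames."""
    in_h, in_w = img.shape
    ys = np.linspace(0, in_h - 1, out_h)   # output row -> source row coordinate
    xs = np.linspace(0, in_w - 1, out_w)   # output col -> source col coordinate
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, in_h - 1), np.minimum(x0 + 1, in_w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy
```

Every output value is a weighted average of pixels that already exist, which is why a large scale factor (1080p to 4K) can only smear what is there, not reconstruct what was lost.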

AMD was forced to create FSR. FSR uses the same method Nvidia rejected for image quality reasons in DLSS 1.x, except there is no AI network. You can see the image is not native: it lacks detail and is blurred compared to native. All the problems of DLSS 1. Meanwhile DLSS 2.2 is being called witchcraft, it's so good at increasing performance while keeping image quality.

Yet there has been no attack on FSR's image quality in the same way there was on DLSS 1, even though FSR is blurred and there is less detail in the image. The YouTube comments on a video talking about how good FSR is tell a different story. This is from a gamer using FSR; they tell the truth.



The massive performance lead in DXR is being ignored. FSR's image quality problems are not being addressed. DLSS is being ignored in benchmark results.
I don't think we needed that long an expansion of an obvious fact, i.e. that Nvidia is better at RT. I said that already in my post. I don't think the importance of raster has gone away: we have not even reached 1440p 144 FPS, let alone 4K 144 FPS, for the majority of AAA games.
 
That's where we'll see it kick up a gear, when they add it to hardware (as we're expecting with RDNA3) :cool:

Wow man, how did you get a 3080 Ti? I would buy one today.

Indeed, we want better features from both companies. If one falls behind, we want them kicked back into line. It would be nice to have competition and not price gouging. The only thing AMD wants to compete in is how much they can charge.
 
Digital Foundry are Nvidia employees; of course they are the only people who disagree with the consensus, it's what they are paid to do.
 

AMD on Ray Tracing

 
that dota 2 1080p 75% fsr is a pure lie though

There is so much wrong with this statement! Seriously, no punctuation or capitalisation?

:)

All joking aside, I have tried FSR in Dota 2, and at 1080p with a 75% render scale FSR sharpens the textures and adds back some missing detail, but it softens the edges a bit. So while it really is subjective, it does arguably look better, and his statement is not a lie.

Native.jpg


FSR-75.jpg
 
Sharpening means a lot for all these upscaling tricks: just try sharpening a native image and then comparing it with DLSS. Someone made a comparison between UE5 TAAU and DLSS; they look close, but if you sharpen the TAAU a little it looks much better than DLSS.
No wonder a lot of people on Reddit will tell you to sharpen the image after using DLSS. And it's why Alex sharpened the images before comparing them with FSR, because "FSR uses sharpening". So does DLSS, but he never sharpened the images before comparing them with DLSS. :D
It plays the biggest part in giving players that "better than native" feeling.
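For anyone wondering what the sharpening in these comparisons actually does: the textbook form is an unsharp mask, which boosts local contrast by adding back the difference between the image and a blurred copy. (AMD's CAS is a contrast-adaptive refinement of this idea; the sketch below is just the classic version, not AMD's or Nvidia's actual filter.)

```python
import numpy as np

def box_blur(img, radius=1):
    """Naive box blur of a grayscale float image (values in 0..1)."""
    k = 2 * radius + 1
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def unsharp_mask(img, amount=0.5, radius=1):
    """Sharpen: add back the detail removed by blurring, scaled by `amount`."""
    return np.clip(img + amount * (img - box_blur(img, radius)), 0.0, 1.0)
```

Flat regions are untouched (blurring changes nothing there), while edges get over- and undershoot, which reads as extra "detail"; push `amount` too far and you get the halos people complain about.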
 
But in many cases it hurts the natural look of the image. In some cases it's a compromise: you get rid of the softness, but then the image looks weird.

RDR 2 is one pure example: it looks hideous with any kind of sharpening. I tried CAS, Nvidia driver sharpening, GeForce sharpening, Luma sharpen. No matter what I do, the original look and feel of the game is ruined.

Some games do play better with sharpening; Cyberpunk is so-so.

Valhalla is crazy: I see literally no downsides, only upsides, when sharpening that game.

So it depends on the game engine or the graphical features of the game in question.

Dunno about Dota 2, but I watched Hardware Unboxed's review of the Dota 2 FSR test, and 1080p 75% clearly looked hideous. I will install the game today and try it myself. I don't have high hopes, though.
 

Check out the spoiler in my post above for an FSR 75% vs native comparison at 1080p in Dota 2. I would say which one is better is subjective, and this highlights the problem where we presume to have all the facts based on one video/opinion.
 
75% looks good, but I'll believe it when I see it myself.

It definitely didn't work like this in Riftbreaker, and it doesn't correlate with Hardware Unboxed's comparisons.
 