FidelityFX Super Resolution in 2021

FSR removes a lot of detail at lower input resolutions, which is bad news for gamers on older hardware, while DLSS can improve quality beyond native.

The difference between the two techniques is very clear. Even in the second example, with a 50% higher input resolution, native is still missing details, which FSR inherits and runs with, producing a rather poor image. DLSS improves the image beyond what native can do by filling in the missing detail.




 
That happens at all resolutions for FSR, because it cannot reconstruct fine detail. DLSS restores fine detail because it's temporal; there is more fine detail than native in the DLSS image precisely because of that temporal reconstruction. FSR is basically limited to the information in one frame and thus cannot add new detail. So when developers set up FSR, they need to make sure to preserve as much detail as possible, for example by using AMD's Cauldron TAA rather than a generic TAA.

Regardless, with FSR the small fine cables are the same data as the lower internal resolution. Native does better, but DLSS has multiple frames of information with which to reconstruct the fine detail of the cables, so it looks better than native. The bridge simply makes it impossible to objectively claim FSR is better than, or equal to, native. It can clearly be seen that DLSS has the best image quality here, beating native. That's not to say DLSS is perfect, that it truly beats native everywhere, or that a still image settles things versus a moving one; just that this scene plays to the strengths of DLSS in a way that cannot be denied.
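To make the single-frame vs multi-frame point concrete, here's a toy sketch in Python (assuming a static scene, nearest-neighbour upscaling and a plain exponential history blend; real TAA/DLSS also use motion vectors and far smarter filtering):

```python
import numpy as np

rng = np.random.default_rng(0)
truth = rng.random((128, 128))      # stands in for the full-resolution scene

def low_res_sample(frame_idx):
    # Each frame samples the scene at a slightly different (jittered) offset;
    # that jitter is what gives a temporal method new information every frame.
    j = frame_idx % 2
    return truth[j::2, j::2]

def upscale(img):
    return np.kron(img, np.ones((2, 2)))   # nearest-neighbour 2x upscale

# Spatial-only (FSR-like): one frame in, one frame out; lost detail stays lost.
spatial = upscale(low_res_sample(0))

# Temporal (TAA/DLSS-like): accumulate many jittered frames into a history.
history = upscale(low_res_sample(0))
for i in range(1, 16):
    history = 0.9 * history + 0.1 * upscale(low_res_sample(i))

print(np.abs(spatial - truth).mean())   # single-frame error
print(np.abs(history - truth).mean())   # accumulated history lands closer to truth
```

The spatial path can only ever reproduce what its one frame sampled, while the history has seen differently jittered samples and so recovers positions the single frame missed.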

DLSS is objectively better looking here than both native and FSR, with FSR clearly last in image quality. This needs saying because people are already claiming FSR is as good as native, that you can't see the difference, that it's on par with DLSS 2.2.11 and better than DLSS 1.x... that DLSS is dead, long live FSR.

That all makes sense. I suppose it means the quality of FSR depends solely on how much information can be preserved in the native frame, and developers who want FSR in their game need to alter it accordingly. That was already confirmed in a recent developer interview: the developer said FSR was easy to implement, but he had to completely change the game's rendering pipeline to deliver enough information to FSR, otherwise the output was bad.

So that's why, with what we have now (FSR being added to games after the fact), implementation quality varies massively from game to game. If FSR is to become an industry standard, developers will need to build their games from the ground up targeting FSR's image output.
 
That's put better than I managed. DLSS is trained with the 16K ground truth as the correct output; that's what the network uses to work out whether it is right or not and to create the feedback weights that improve the network. When you run the AI network, it tries to recreate the image as it was trained to do.
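A toy sketch of that training loop, assuming a made-up two-layer upscaler and an MSE loss (nothing here is NVIDIA's actual architecture or data; it just shows the ground truth acting as the "correct output" that generates the feedback):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical tiny super-resolution network: learns to 2x-upscale a frame.
class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 32, 3, padding=1)
        self.conv2 = nn.Conv2d(32, 3 * 4, 3, padding=1)  # 4 = 2x2 upscale factor

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.pixel_shuffle(self.conv2(x), 2)  # fold channels into 2x spatial detail

net = TinyUpscaler()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(50):
    ground_truth = torch.rand(1, 3, 64, 64)      # stands in for the 16K reference render
    low_res = F.avg_pool2d(ground_truth, 2)      # stands in for the game's low-res frame
    prediction = net(low_res)
    loss = F.mse_loss(prediction, ground_truth)  # "is the network right or not?"
    opt.zero_grad()
    loss.backward()                              # the feedback that adjusts the weights
    opt.step()
```

At inference time only the forward pass runs, which is the "recreate the image as it was trained to do" part.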

Yep, let's see it in action:

DLSS Ultra Performance (720p) vs FSR Ultra Quality (1662p)

[comparison screenshot]
 
These differences in thin-line textures and assets are something DLSS is very good at recreating.

A good example of this can be seen when comparing even close-up images of hair on characters and NPCs in various games: DLSS often retains more detail in the hair than native, and the same holds against FSR; hair lines look more natural and have more depth to them.
 
This thread reminds me of all the other NV vs. AMD OCUK 'post war' threads.

- Frame-pacing
- Adaptive Sync
- Mantle
- FXAA vs. MLAA

Every single time, Nvidia has the initial upper hand, but AMD gets it right eventually. They can't compete with the marketing muscle Nvidia has, but given time they do act in the right way for the whole of the community.

NV also have plenty of shills available to keep posting on forums like these. I'd give it 12 months and then call the victor. It won't be Nvidia, simply on the basis that pretty much every new console game will implement FSR.


AMD has the best OpenGL and Vulkan support now, and tessellation, right?

Consoles have had checkerboard upscaling for years, yet very few games even used it.
 
The clear summary of Avengers and the TAA used in it (which only looks worse with FSR, but has nothing to do with FSR as such), as Tim himself said, is that the TAA in that game is just horrible and the devs should fix it. Part of FidelityFX is a much better TAA implementation, and the devs have already added FSR, which means they have zero excuse not to use that TAA as well; it would improve image quality considerably on ALL GPUs, not just AMD's. And of course it would make FSR much better in motion too. This is a clear case of: the problem isn't FSR, the problem is devs implementing bad AA solutions in their games, which spoil image quality even at native.

That's what I've been saying for multiple posts. FSR requires the developer to put in more work than we were first led to believe, unless they're happy to answer to a horde of FSR fans making posts like "lazy devs, FSR isn't bad, this game is bad".

DLSS is a lazier integration that "just works" better than FSR and doesn't require bespoke changes to the game's render pipeline.

Maybe the horde of FSR fans will convince developers to make their games look better at native so that FSR doesn't look bad; hopefully, since that would benefit everyone, even those who don't use FSR. But I doubt they will, as they're lazy and time-restricted. It would be better if AMD just updated FSR to use a custom AA reconstruction.

If you want a quick example of this, the default anti-aliasing implementation in the Unity engine (which lots of games use) looks quite bad with FSR enabled on top of it. So now you're either asking developers to implement a new AA method in Unity or asking for the base engine itself to change; AMD would have better luck just doing their own AA.
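For reference, a rough sketch of where FSR sits in a frame per AMD's integration guidance (anti-aliasing and tonemapping before the upscale; sharpening, grain and UI after); the passes below are trivial stand-ins and every function name is made up:

```python
import numpy as np

def taa(frame, history, alpha=0.9):
    # Stand-in TAA: exponential history blend. Quality here bounds FSR's output.
    return alpha * history + (1 - alpha) * frame

def fsr_upscale(frame, factor=2):
    # Stand-in for the EASU pass (nearest-neighbour here, not edge-adaptive Lanczos).
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

def render_frame(low_res, history):
    aa = taa(low_res, history)   # 1. anti-alias (and tonemap) at internal resolution
    up = fsr_upscale(aa)         # 2. FSR consumes the anti-aliased, tonemapped frame
    # 3. sharpening (RCAS), film grain and UI follow at display resolution
    return up, aa

frame = np.random.rand(540, 960, 3)       # 960x540 internal render
output, history = render_frame(frame, frame.copy())
print(output.shape)                       # (1080, 1920, 3): display-resolution frame
```

The point is the ordering: FSR only ever sees what the AA pass hands it, which is why a bad default AA (as in Unity here) drags the upscaled result down with it.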
 
As for the motion complaints: they're not always relevant, because you don't actually see them. Image deficiencies are mostly only noticeable when you capture a still frame; you don't see any of that with your eye while you play. Still images you do notice, though, whenever the camera stops moving.

And HUB mentioned this: in practical testing you can't tell the difference between DLSS on or off in motion; the only way to see anything is to take a screenshot. That's even more true in games that use motion blur, which is most of them.
 
CP2077 is a bit disappointing in this respect: you have all the overhead of traditional rendering techniques, then some RT slapped on top, with all the extra overhead of that, to fix the bits where traditional techniques fall down, rather than using RT to its best effect. Although they'd really need to design the game for RT from the ground up to take advantage of that, which would exclude non-RT-capable hardware.

There are isolated bits of CP2077 where RT shines, but for 90% of the game you can barely tell it's enabled, other than when you see reflections under conditions where screen-space techniques fall down. I personally can't un-see that the RT reflections in CP2077 are downgraded versions of the actual scene, with some objects missing, such as the player.

The only time I'd personally accept the compromises of FSR or DLSS is to get a decent RT experience in a game, and nothing really delivers that yet. The closest is the path tracing in Quake 2, but that won't excite a lot of people and it still struggles with noise in high-contrast and dark areas.

I have no issue turning on DLSS 2.0 or newer in any game, and in fact I do because I like the look. Even if there's no RT, I still use DLSS when available.
 
FSR in Resident Evil Village analysed

In this game, for some reason, FSR produces a lower-quality image than checkerboard rendering and also performs worse than it; the difference is especially noticeable in the brighter outdoor areas of the game.
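For anyone unfamiliar with the checkerboard technique it's being compared against, a toy illustration: each frame shades only half the pixels in a checker pattern, and the holes are filled from the previous frame (real implementations also use motion vectors and ID buffers):

```python
import numpy as np

truth = np.random.rand(8, 8)              # the full frame we'd like to show
checker = (np.add.outer(np.arange(8), np.arange(8)) % 2) == 0

# Frame N shades the "even" checker cells, frame N+1 the "odd" ones;
# each frame therefore costs roughly half the shading work.
frame_n = np.where(checker, truth, 0.0)
frame_n1 = np.where(~checker, truth, 0.0)

# Reconstruction combines the current frame with the previous one.
reconstructed = frame_n + frame_n1
print(np.allclose(reconstructed, truth))  # True for a static scene
```

For a moving scene the old half is reprojected rather than reused as-is, which is where checkerboarding's own artefacts come from.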

 
Both look awful: DLSS has completely lost detail in the wood planks and the roof, and FSR has jaggies galore, with FSR's over-sharpening probably not helping matters.

Would defo be sticking to native there.

Something interesting: now that the SDK is out, it's easy to dig through the files, and it turns out nearly all implementations of DLSS in games don't use sharpening, even though DLSS supports it. Can't wait for sharpening sliders to become commonplace in games.
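A sketch of what such a slider would control, assuming a plain unsharp mask (the actual DLSS and FSR sharpening passes are different filters; `strength` here is the hypothetical slider value):

```python
import numpy as np

def sharpen(img, strength):
    # Cheap blur via neighbour averaging, then push the image away from the blur.
    blur = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
            np.roll(img, 1, 1) + np.roll(img, -1, 1) + img) / 5
    return np.clip(img + strength * (img - blur), 0.0, 1.0)

frame = np.random.rand(1080, 1920, 3)
subtle = sharpen(frame, 0.2)   # low slider setting
strong = sharpen(frame, 1.0)   # high setting: prone to halos and a jaggy look
```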
 
Thanks to AMD being all pro open source, the documentation and digging through the code confirm what a few peeps have suspected for months.

FSR is a slightly modified version of a Lanczos single-pass scaler.

What's more interesting is that a scaler of this kind has existed in some form inside the Nvidia driver for several years now.

So with just a few clicks in the Nvidia driver you can essentially enable FSR-style scaling for all games. Also, unlike FSR and DLSS, the Nvidia Lanczos pass gives you the freedom to scale from 1% of input resolution all the way up to 99%, instead of 4 or 5 presets.

https://wccftech.com/amd-fsr-based-...nvidia-gpus-using-control-panel-in-all-games/
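For reference, a minimal 1D Lanczos resampler using the textbook windowed-sinc kernel (FSR's EASU is a tuned, edge-adaptive variant of this idea, not this exact code):

```python
import numpy as np

def lanczos_kernel(x, a=2):
    # Standard Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0.
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

def lanczos_resample_1d(samples, new_len, a=2):
    # Each output sample is a normalised weighted sum of 2*a nearby input samples.
    n = len(samples)
    out = np.zeros(new_len)
    for i in range(new_len):
        src = i * (n - 1) / (new_len - 1)       # output position in input coordinates
        lo = int(np.floor(src)) - a + 1
        acc = wsum = 0.0
        for j in range(lo, lo + 2 * a):
            w = lanczos_kernel(src - j, a)
            acc += samples[min(max(j, 0), n - 1)] * w   # clamp taps at the borders
            wsum += w
        out[i] = acc / wsum                     # normalise so the weights sum to 1
    return out

signal = np.sin(np.linspace(0, np.pi, 16))      # 16-sample input
print(lanczos_resample_1d(signal, 32).round(3)) # ~2x upscaled output
```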
 
The relatively new open source multi-platform 3D game engine Flax Engine has added FSR support.
https://flaxengine.com/blog/flax-1-2-released/
[side-by-side comparison screenshot from the Flax blog post]


There's a significant reduction in image quality in that comparison, though; they need to put work into the engine to clean it up.

I was on my phone at first and it was hard to tell, but on a monitor it's rather obvious the one on the right is much lower quality.

Unless they got the labels wrong, it says "ultra quality" but it's closer to what I'd expect from "performance", i.e. the image on the right looks like it's 1080p.
 
Myst will be the first Xbox Series X/S game to use AMD FidelityFX Super Resolution

https://www.kitguru.net/gaming/matt...-game-to-use-amd-fidelityfx-super-resolution/

#Myst:

✅Xbox Play Anywhere
✅Ray Tracing
✅4K 60FPS
✅FSR

Thanks to #FSR, Myst can run in 4K at a smooth 60fps on #XboxSeriesX with the highest graphics setting across the board, and at 1440p at 60fps on #XboxSeriesS! https://t.co/o7ZB8gS764


If it's 4k then it doesn't need FSR

Classic Microsoft with the bait and switch again; bet it's running somewhere between 1080p and 1600p and they wanna call it 4K, haha.

But hey, it's not just them: Sony spent half the previous generation calling its games "4K" when they were all checkerboarding from 1440p.
 