
Fidelity Super Resolution in 2021

For example, look at this image:
And this one, which is native 720p vs 1440p DLSS Performance (because he thought it fair to compare the two).
And again, more info on that thing in the middle with DLSS on, but still useless. Meanwhile, the quality of the RT reflections in the water is atrocious with DLSS on.
https://www.capframex.com/assets/static/imageslider.html?url1=https://cxblobs.blob.core.windows.net/images/FSR_article/dlss/Metro_Exodus_EE/scene_B/DLSS%202.2/B_Metro_Exodus_DLSS_performance.png&url2=https://cxblobs.blob.core.windows.net/images/FSR_article/dlss/Metro_Exodus_EE/scene_B/DLSS%202.2/B_Metro_Exodus_720p.png&title1=DLSS performance&title2=720p
Is there any reason why the RT reflections can't be upscaled like the rest of the image? I understand they start from a much lower number of rays, and with a feature like FSR they will remain lower res no matter what. But I thought the temporal data could also help the RT reflections be reconstructed. Is it because the denoiser is also using temporal data, or what is the cause? Is it the same on Quake 2 with that TAA thing you keep talking about?
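For context on the comparison above, the internal render resolutions behind each quality mode can be worked out from the per-axis scale factors (DLSS Performance renders at half the output resolution per axis, which is why 1440p DLSS Performance starts from a 720p image). A quick sketch, using the commonly cited factors for DLSS 2.x and FSR 1.0:

```python
# Per-axis render scale factors, as commonly published for
# DLSS 2.x and FSR 1.0 quality modes.
SCALE = {
    "DLSS Quality": 1 / 1.5,        # ~0.667
    "DLSS Balanced": 0.58,
    "DLSS Performance": 0.5,
    "FSR Ultra Quality": 1 / 1.3,   # ~0.769
    "FSR Quality": 1 / 1.5,
    "FSR Balanced": 1 / 1.7,
    "FSR Performance": 0.5,
}

def internal_res(width, height, mode):
    """Internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(2560, 1440, "DLSS Performance"))  # -> (1280, 720)
```

So a 1440p DLSS Performance shot and a native 720p shot really do start from the same pixel count, which is the whole premise of that image slider.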
 

I'd need more examples of the quality of RT reflections; I've not noticed anything atrocious.

With Quake 2, reflections are the same quality as the rest of the world, though with either TAA or TU, for some reason, reflections of alpha/transparent surfaces shimmer like crazy (temporal hall-of-mirrors kind of stuff). I think that's because the developer hasn't really bothered to implement a solution there: in the stock maps you don't really have surfaces which show it, but if you create a custom map with highly reflective/mirror surfaces it is quite bad.
 
I posted this some pages ago:
[URL="https://www.overclockers.co.uk/forums/posts/34900811/"]Fidelity Super Resolution in 2021[/URL]

Yes, I know it is Performance mode, but it looks pretty bad, more like console quality.
 
The main use of FSR, as I see it, is people on lower-end hardware who'll just be happy to get a balance where they get reasonable frame rates and are resigned to the fact that they are going to be compromising somewhere.

Well, it is still a compromise of quality; it's just a different kind of compromise than, say, lowering other visual settings. In general, however, I am pro having more options as trade-offs. I'm just sceptical of how many people are really going to use this as a trade-off when others are available. As I said, I'd love to know for sure with some hard data; that'd be super interesting.

In part this comes back to my initial thoughts on upscaling techniques, which is that if you're just doing it via post-processing, then why was this not done 10+ years ago? Why now? FSR seems like a response primarily to DLSS, and it's obvious that Nvidia only bothered with the huge investment in that tech to enable RT; it wasn't fueled by a more general demand for better upscaling. It seems to me that AMD are preparing their suite of products for a move towards RT in the next generation, which will necessarily require good upscaling. But in a 1080p -> 4K upscaled comparison in an RT-supported game, I just see FSR losing hard to DLSS.

I'm more on the IQ-purist side of things anyway. I've been like a pig in **** playing old games like the Assassin's Creed series in 4K with real, genuine 8x MSAA; it makes me do this: https://www.youtube.com/watch?v=otwYnQ5TOh0&t=63s
 
I'd be really interested to see usage data for both DLSS and FSR (as it becomes adopted) and how many people are playing without RT enabled but using upscaling of some kind.

Be sure to check back once RDR2 gets DLSS.

Although I'm only targeting 1440p/60Hz, I will still use DLSS where possible, as it keeps the temps down. I was surprised to see the GPU fans at idle when playing Control Ultimate Edition maxed out, although it does take a cool night and the window open.

My old laptop has a 1070 with a 1080p/144Hz G-Sync panel which I cap to 60. I plan on using FSR on this just to keep the temps down, as long as the quality doesn't degrade too much.
 
Eesh, OK, so here are my thoughts on this after watching the Digital Foundry comparison video by Alex, who right now I consider to be one of the best people for an eye for detail and technical breakdown of visuals. https://www.youtube.com/watch?v=xkct2HBpgNY

Right off the bat, I think they do a better job than I was expecting, given the constraints of what they're dealing with: a single frame, using essentially post-processing upscaling, without access to temporal frames or much other data. It's certainly a win over general-purpose upscaling techniques in terms of quality, so that's very positive. It does seem like the focus is more on edges than something like texture detail, and this makes sense, because edges of straight lines have patterns you can easily detect/predict, whereas internal texture detail is complex and can vary so much that enhancing it with post-processing alone is going to be impossible without some form of additional data (from DL or subsamples or whatever). So it really feels like more of an aid to anti-aliasing than anything else.
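To illustrate that spatial-only constraint, here is a deliberately simplified two-pass pipeline in the spirit of FSR 1.0's upsample-then-sharpen structure. To be clear, this is not the real EASU/RCAS maths, just a toy stand-in (bilinear upsample plus a clamped unsharp mask) showing why such a pipeline preserves strong edges while having no extra information to restore fine texture:

```python
# Toy illustration of a spatial upsample-then-sharpen pipeline.
# NOT the real EASU/RCAS algorithms; a minimal sketch only.

def bilinear_upscale(img, factor=2):
    """Upscale a 2-D grayscale image (list of lists of floats in [0, 1])."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h * factor):
        sy = min(y / factor, h - 1)
        y0, y1 = int(sy), min(int(sy) + 1, h - 1)
        fy = sy - y0
        row = []
        for x in range(w * factor):
            sx = min(x / factor, w - 1)
            x0, x1 = int(sx), min(int(sx) + 1, w - 1)
            fx = sx - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

def sharpen(img, amount=0.5):
    """Clamped unsharp mask: push each pixel away from its 4-neighbour mean."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            mean = (img[y-1][x] + img[y+1][x] + img[y][x-1] + img[y][x+1]) / 4
            out[y][x] = min(1.0, max(0.0, img[y][x] + amount * (img[y][x] - mean)))
    return out

# A hard vertical edge survives upscaling plus sharpening well; fine texture
# below the input resolution simply isn't there to recover.
edge = [[0.0, 0.0, 1.0, 1.0]] * 4
up = sharpen(bilinear_upscale(edge))
```

The sharpening pass can only redistribute contrast that already exists in the low-res input, which matches the observation that edges hold up while inner texture detail doesn't.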

I do think that, vs DLSS 2.x, Nvidia has the edge most noticeably in the more aggressive modes; this is where the deep learning really helps clean up textures and not just edges. Whereas the higher-quality 1440p-to-4K upscaling looks somewhat comparable between the two, and seems to be the core strength of FSR right now.

This leads me into use cases, which is tricky... to me it was always obvious that DLSS was a sister technology to ray tracing. RT was the core goal for Nvidia, and they knew that rendering it at more than 1080p was an impossibility. Adoption of RT was only ever going to happen if gamers could maintain their lovely 1440p or 4K resolutions, so DLSS was really invented to ease RT adoption. That comes across in DLSS use cases: typically you're taking games with RT at an internal res of 1080p and getting them up to 1440p or 4K. At least with DLSS 2.x, the upscaling from 1080p is good enough to do this. It seems a sensible trade-off for most people because of the improvement RT brings. However, I'm not convinced it'll be commonly used anywhere else outside of getting RT playable; I certainly don't use it elsewhere.

This is where I think this'll be a problem for AMD and FSR. They did not push RT hard in the current gen, instead going for rasterization wins; they wanted reviewers to basically avoid it and treat the cards like more traditional rasterization cards, because the performance just wasn't there. But I do expect them to move towards more RT cores in the next generation to catch up with Nvidia, and that'll mean a push to more widespread RT adoption and tackling the same problem as Nvidia. But the weakness of FSR is that it's pretty bad at 1080p -> 4K in terms of final quality, barely above regular upscaling it seems. So what I believe will become the most common use case and primary reason for FSR in the next gen will struggle to compete. I'd be really interested to see usage data for both DLSS and FSR (as it becomes adopted): how many people are playing without RT enabled but using upscaling of some kind, what percentage of people, and which use cases are most common. My gut feeling is that it's close to zero, but it would be cool to see some real data on this rather than just speculating.

I could have sworn that FSR was aimed at general support across all games? Maybe I'm not remembering that correctly? That would have been a much bigger win for AMD over DLSS. But that opens up the question of integration into games: as Alex showed in his video, game-engine-specific upscaling can be significantly better for the same performance cost. So, for example, if you're making an Unreal Engine game, why would you bother integrating FSR when the engine's native upscaling is better?

The hope, of course, is that they do what Nvidia did, which is continue to increase the quality over time; we want competition in this space. However, I have a sneaky feeling this won't happen... Nvidia got wins because they could keep training the ML model over time, but with this kind of more basic post-processing it seems harder to get decent wins, in the same way we've not really seen something like FXAA or SMAA improve over the years.

DF are on Nvidia's payroll.

Tim from HUB also has a great eye for detail. He commented on the shimmering: it's not something FSR adds, it's something that's more clearly visible if it's already there, especially if you use the lower-quality FSR settings.

It's easy to exaggerate it and point at it if you want to make something of it. Nothing is ever perfect, and with that attitude you can always find something if you want it to be there. They don't make half as much fuss about the trails and ghosting in DLSS, if they ever bring it up at all.
A few other reviewers did, but didn't bang on about it like a drum, because the ghosting is an artefact that comes with the nature of DLSS and it's not that big of a deal. That's just being rational; it's not fulfilling the editorial will of a paymaster.
 

1440p/60Hz is a fairly easy standard for the 3080 to hit, so it doesn't surprise me that if the game is capped its usage is quite low, especially with good ambient cooling. But I'd argue this kind of standard is quite rare; people tend to get high-end hardware to push the envelope. This is why I'm interested in numbers on this, because to my knowledge there's no good data anywhere on it, and these new technologies are cool, but what is their real impact?

Maybe leave it a few months for adoption to pick up and then do a poll on the forums and try to gather some data. My expectation is strongly that almost no one cares, except for a handful of people trying to run RT and being forced to admit that 1440p/4K upscaled is better than 1080p native.
 

KitGuru did the direct comparisons, and he dismissed them out of hand on another forum:
https://www.overclockers.co.uk/forums/posts/34901077/
https://www.overclockers.co.uk/foru...lution-in-2021.18923620/page-56#post-34901162

DF were saying UE TAAU was better than FSR, but then proceeded not to test the latter at lower settings, and didn't even look at FPS properly. At the same time, when they tested DLSS 1.0, they didn't test UE TAAU alongside it, AFAIK.

The DF video is just a single data point, unless you actually read the rest of the articles on it.
 
Yep, Digital Foundry are going out of their way to ensure FSR has every opportunity to fail; it's almost like they have some agenda. If only we had some kind of proof of their close ties and funding from Nvidia... like their Nvidia-sponsored "preview" of the RTX 3080 being "80% - 100% faster than an RTX 2080", when subsequent unbiased reviews showed the real difference to be 50%. I simply cannot fathom how people would take the word of proven liars and paid-for shills.

I mean, you have to be a special kind of idiot to think DF are even remotely objective. What is even more idiotic is basing your opinion on only this one single outlier review, ignoring the other positive reviews, and even refusing to test the free FSR tech for yourself in a free demo.
Stop it, I’m running out of signature space for your nail-on-the-head quotes.
 

I'd be interested to know to what degree DF are on Nvidia's payroll, and the evidence for this; it's not something I've heard before. Either way, the depth of analysis that Alex provides is impressive. Most recently, going back to his WDL deep dive to address the claims of another forum user on ray tracing settings was invaluable. It's just a depth of analysis I can't find anywhere else. And his eye for things like pixel counting to determine the real resolutions of console games using dynamic resolution is extremely high quality and welcome.

I'm still watching other videos on FSR, and watching the HUB one now. They seem to concur that high-res upscaling is good but that lower resolutions end up blurry and not very good. They also confirm my suspicion that it's more of an aid for AA and edge detection than a balanced upscaler. This explains why it does an OK job at near-native resolutions, but large leaps from low to high resolutions take a huge hit. The edges remain quite good throughout, but the texture detail inside poly faces suffers, and that's the harder part of upscaling, and probably why Nvidia opted for an ML approach rather than a post-processing approach.
 
Regarding DF and Nvidia: people are hung up about the DF 3080 preview, in which Richard quite clearly stated "do not trust these figures". He stated many times that Nvidia provided the cards and the settings/games to test, and that people should wait for the full analysis.

Really not sure why people don’t see this.
 

The issue is that he is basically arguing, on multiple forums (ResetEra and Reddit) and in his video, that UE TAAU is better than FSR. But he didn't bother testing UE TAAU when DLSS 1.0 was out (it was there from 2018), then didn't bother testing Ultra Quality FSR, and seemingly did not see any added shimmering with UE TAAU. The whole video was rather weird WRT UE TAAU.

But KitGuru tested it in Godfall and saw that UE TAAU had a bit more sharpening but looked worse overall due to shimmering, and performed a bit worse. Then he dismissed it out of hand on another forum.

This is despite people pointing out that others had also done the comparisons. Then he called out HUB for some weird reason, and on Reddit basically called out the entire press, saying that they were getting things wrong. Yet it was a strawman, because watching many of the videos, nobody was calling DLSS and FSR the same. Considering you have people with engineering backgrounds and people with computing PhDs covering this, he is by far not the most qualified to make these kinds of hints.

DF have historically made mistakes too, which have been pointed out... yet they are poor at recognising their own limitations.

Just because he can spend time analysing stuff, or talk a lot of technobabble, does not mean anything, because in science and engineering that does not mean you will get a high-impact paper.

Too many people get stars in their eyes with them, and gloss over obvious flaws that have been shown in some of their analysis (or things they themselves have glossed over for some weird reason).

If that were the case, people such as Steve Burke at Gamers Nexus, Dr Ian Cutress at AnandTech, etc. would never be wrong. If anything, Dr Ian Cutress is quite a humble guy (and won't cover up any mistakes he makes).
 
Can you evidence these mistakes? I owe DF nothing, but I’ve always been impressed.
 
An example, exactly from Alex:

Native 4K: 4k.jpg

XboxX: xbox.jpg

DLSS perf: dlss.jpg

But in the video you will only see the XboxX compared with the native PC, never with DLSS.
WOW! Let's play a game of "is it DLSS or Xbox?" Lmao! :cry: That's brutal.

All that supercomputer time spent fettling DLSS, hardly worth it :p
Someone should do the maths so we can see how many trees died for DLSS. :D

The issue is simply this: all these technologies have strengths and weaknesses. You can focus more on the pros than the cons if you want, and that will showcase the tech as greater than it is (like they do with DLSS), or the opposite (like they do with FSR). So, for example, you can see how they focus on that one scene in an alpha game for the TAAU vs FSR comparison, but they didn't do UQ, they didn't test other scenes, and they didn't test other games (though they were available), and the people who did do that came away with a more positive impression of the technology. That's why we say they're very biased against AMD: because we can see what they're doing. And it's not even just with GPUs; they do the same for CPUs. In fact, they used a specific cutscene in Metro Exodus which was bugged and in which AMD faltered a little, and then used that as evidence for how "Intel had more stable frametimes", meanwhile if you look at the rest of the game and the other cutscenes, which aren't bugged, the opposite is true! So it's a systematic thing for them; don't ask me why, but it's very clearly there.
 
@Poneros has mentioned a few instances. Even in their early coverage of RT and DLSS 1.0, Gamers Nexus and HUB did more in-depth testing of the issues than DF did (and were less forgiving in their assessments). However, with the FSR coverage, what is being said on social media is interesting. Just re-reading one response he made on ResetEra, he literally said nobody did proper comparisons. Yet he tested UE TAAU and said it was better than FSR in one game?

Yet KitGuru tested both in the same game and concluded FSR was better due to less shimmering. In the past they were quite sensitive about shimmering, yet now all of a sudden UE TAAU having more of it doesn't matter? Then in the same post he says most reviewers said FSR was not close to native, criticises HUB, and implies they were the only ones who saw otherwise and were not looking at their own images. Yet plenty of places, such as TPU, literally said the same thing:

TPU said:
From a quality standpoint, I have to say I'm very positively surprised by the FSR "Ultra Quality" results. The graphics look almost as good as native. In some cases they even look better than native rendering. What makes the difference is that FSR adds a sharpening pass that helps with texture detail in some games. Unlike Fidelity FX CAS, which is quite aggressive and oversharpens fairly often, the sharpening of FSR is very subtle and almost perfect—and I'm not a fan of post-processing effects. I couldn't spot any ringing artifacts or similar problems.

Most reviews said it was close to native. Then there was the weird post on Reddit "correcting" the tech press about FSR and DLSS; most outlets made it clear they were not the same. The DF review is easily the most negative of all the FSR reviews out there, especially with them bigging up UE TAAU, and the reviewers were all testing the same games, after all. So do you take the odd review out (testing the same stuff), or a balance of reviews which come to similar conclusions?

It's almost like he prejudged the issue before doing the test, which might have worked with DLSS 2.0, which is superior, but UE TAAU? AFAIK, DF didn't even really bother to test it against DLSS 1.0 (which came out in 2018).
 
Conflict of interest is a thing, you know. That's why reputable outlets don't get into exclusive deals with specific vendors. Credibility is hard to create and easy to lose; DF is a great example of this.
 
Don't get why people are getting excited about it personally; I guess it's due to ignorance. At the same performance uplift as the ultra setting, a decent temporal implementation will give pretty much identical image quality to native, with none of the softening seen with FSR. With the same image-quality decrease as the ultra setting, a good temporal implementation will see around double the performance uplift compared to native.
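To illustrate why a temporal implementation can approach native quality in a way a purely spatial one can't: under idealized conditions (static scene, perfect reprojection, no history rejection, which real TAAU/DLSS have to handle), jittered low-resolution frames accumulated over time contain every sample of the full-resolution image. A toy 1-D sketch of that idea:

```python
# Toy temporal upsampling under idealized conditions: a "half-res" sampler
# whose sample offset alternates each frame can reconstruct the full-res
# signal after two frames. Real temporal upscalers add motion reprojection
# and history rejection on top of this basic accumulation idea.
truth = [0.1, 0.9, 0.2, 0.8, 0.3, 0.7, 0.4, 0.6]  # 8 "full-res" pixels

def capture(frame):
    """Half-res sampling: 4 samples per frame; jitter offset alternates 0/1."""
    offset = frame % 2
    return [(i * 2 + offset, truth[i * 2 + offset]) for i in range(4)]

# Accumulate two jittered frames into a full-res reconstruction.
recon = [None] * len(truth)
for frame in (0, 1):
    for pos, val in capture(frame):
        recon[pos] = val

assert recon == truth  # every full-res pixel recovered from half-res input
```

A spatial upscaler only ever sees one of those half-res frames, which is the fundamental information gap being discussed here.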

Only about 12-15% of the user base can use DLSS.
 

Don't know, but I'm almost certain that if it's implemented in some decent first-person shooters, the main use will be to cap out your monitor's FPS even if you lose some fidelity; most FPS players don't care about how pretty the games are

and whether that second brick on the left has the right grooves in it.
 