FidelityFX Super Resolution in 2021

There is a part two of that tweet:

Nvidia shill :)

My post was in response to oguzsosoo's incorrect statements singling out how bad DLSS supposedly is in motion and how it can't render details or be as clear, when the HU video shows the complete opposite (not to mention that, as per my own and the vast majority of people's experience, with evidence to back it up, TAA is considerably worse in motion than DLSS in RDR2). When it comes to native/TAA vs DLSS, there is no question that DLSS is the far better alternative on the whole; take your pick, TAA being awful or DLSS simply being a "superior" method of AA... Important point to remember: TAA is in the vast majority of games these days...

Did you even watch the video? And if you did, surely you would understand that you can't come to a conclusion about the overall picture/motion quality from a 10-line tweet... :o The HU video shows the motion issues/advantages of both DLSS and FSR perfectly well, and in Avengers it is clear that DLSS is better in that game, with evidence to show why... Unless of course HU are considered to be Nvidia shills now :cry:

And if people keep insisting that DLSS is so bad in motion, then I guess we can paint FSR's motion capability with the same brush too???? After all, HU stated it in a 4-line tweet... :o
 
Of course I watched it, and I was joking; I've always thought HU were fair compared with other reviewers.
And DLSS is doing TAA too, it's just its own implementation of it.
 
It's a fair video overall, and the part where he spots the particle changes (which I actually like, and which would look great in HDR) underlines something about DLSS I haven't seen talked about much: the change from the "reference" look, like when I showed the speaker cover in Avengers appearing opaque with DLSS compared to the reference image, where it was less opaque. I'm willing to bet there are many such instances which haven't been discovered, simply because people don't play the games looking for such differences. So it comes down to how much you care about 'artistic intent'.
Also, I haven't seen much focus on the softer look of DLSS (vs FSR) in many cases, such as with the snack vending machine in Avengers that I showed, which also happens elsewhere.

It will also be interesting to see Radeon vs GeForce GPU comparisons, because I'm willing to bet FSR will make RDNA 2 look better than Ampere, given how well those cards do at lower resolutions and how much better they use the CPU - so people who don't have a 5950X can actually fully utilise their GPUs as well.
 
you don't understand, Nexus

DLSS can produce the same native-like image from 720p, 1080p and 1440p inputs

the three modes - (720p, ultra performance), (1080p, performance) and (1440p, quality) - will all look the SAME, or appear the SAME, in still shots and static scenery

when you move the camera, the 1440p-input DLSS will be SUPERIOR to the others

Digital Foundry spinning the tale of a 900p input rivalling an 1800p native image is simply misdirection
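
For reference (my own rough numbers, not from DF or the HU video), those modes map to internal resolutions roughly like this at a 4K output, using the per-axis scale factors commonly reported for DLSS 2.x; the helper below is purely illustrative:

```python
# Per-axis scale factors commonly reported for DLSS 2.x modes (the exact
# values are NVIDIA's to define; these are the widely quoted ones).
DLSS_SCALE = {
    "quality": 2 / 3,            # ~66.7% per axis
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,  # ~33.3% per axis
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K output, {mode:>17}: renders at ~{w}x{h}")
# quality -> 2560x1440, performance -> 1920x1080, ultra_performance -> 1280x720
```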
 
FSR in that case (the vending machine) was overly sharpening the jagged edges while DLSS was trying to smooth them; I'm not sure either end result was great, but I'd rather have the DLSS result as it didn't lose too much detail.
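
For what it's worth, here's a toy 1-D unsharp mask (my own illustration, nothing to do with the actual RCAS or DLSS filters) showing why heavy sharpening makes jagged edges pop: the filter overshoots on either side of a hard step, and on stair-stepped geometry that overshoot is exactly what your eye picks up:

```python
# Toy 1-D unsharp mask: sharpen by adding back the difference between the
# signal and a blurred copy. Watch the values overshoot around the step.
def unsharp_mask(signal, amount=1.0):
    out = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, len(signal) - 1)]
        blurred = (left + signal[i] + right) / 3          # simple box blur
        out.append(signal[i] + amount * (signal[i] - blurred))
    return out

edge = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]                     # a hard edge
print([round(v, 2) for v in unsharp_mask(edge)])
# [0.0, 0.0, -0.33, 1.33, 1.0, 1.0] - the dip and overshoot around the step
# are the halo/ringing that exaggerates jaggies when you over-sharpen.
```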
 
The thing is, you can't easily compare FSR's benefit between Radeon and GeForce, because RDNA 2 behaves relatively worse at 4K. A lower internal resolution will by default give it bigger FPS gains (the same will be true for RT; once we see FSR in RT games, the gains on Radeon cards will look huge compared with Nvidia cards). But that isn't a merit of FSR, it's simply the weaker performance of Radeon cards at 4K or in RT, or their better performance at lower resolutions.
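
Just to illustrate the percentage-gain point with completely made-up numbers (these are not benchmarks from anywhere):

```python
# Hypothetical figures purely to show why percentage gains flatter the card
# that scales worse at 4K: the upscaler does the same thing on both, but the
# relative uplift is measured against a weaker 4K baseline.
cards = {
    "card weak at 4K, strong at 1440p": {"native_4k": 50, "upscaled_from_1440p": 95},
    "card strong at 4K":                {"native_4k": 60, "upscaled_from_1440p": 90},
}

for name, fps in cards.items():
    gain = (fps["upscaled_from_1440p"] - fps["native_4k"]) / fps["native_4k"] * 100
    print(f"{name}: +{gain:.0f}% from upscaling")
# The first card shows +90% vs +50%, which says more about its 4K scaling
# than about the merit of the upscaler itself.
```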
 
Consoles have had checkerboard upscaling for years, yet very few games even used it

Checkerboard has big issues with quality; it causes huge artefacts, especially on thin vertical lines, where in motion it literally turns them into a checkerboard. No wonder it was quickly ditched on PC after a short trial, and not many console games use it unless there's no other option. It's MUCH worse than FSR or DLSS in my opinion. Hell, DLSS 1.0 had fewer artefacts.
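
A hugely simplified sketch of why that happens (my own toy example; real checkerboard resolves reproject the previous frame with motion vectors rather than reusing it blindly): when a 1-pixel-wide line moves and the history fill misses it, alternate rows either lose the line or show it doubled, which is the checker/comb pattern you see in motion:

```python
# Toy checkerboard reconstruction: each frame renders only pixels where
# (x + y) % 2 == frame_parity; holes are filled from the previous frame.
# Naively reusing the old pixel stands in for what happens when motion
# vectors fail on sub-pixel-thin geometry.
def reconstruct(curr, prev, frame_parity):
    h, w = len(curr), len(curr[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == frame_parity:   # pixel rendered this frame
                out[y][x] = curr[y][x]
            else:                             # hole: fall back to history
                out[y][x] = prev[y][x]
    return out

# A 1-pixel-wide vertical line that moved from x=2 to x=3 between frames:
prev = [[1 if x == 2 else 0 for x in range(6)] for _ in range(4)]
curr = [[1 if x == 3 else 0 for x in range(6)] for _ in range(4)]
for row in reconstruct(curr, prev, frame_parity=0):
    print(row)
# Even rows lose the line entirely, odd rows show it doubled at x=2 and x=3;
# in motion that reads as the line breaking up into a checker pattern.
```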
 
Good video by HUB. Their tldr: DLSS is the best, if you disagree you're a salty fanboy

If that's all you took from their video, you're very biased. Yes, at lower resolutions or quality settings DLSS is better. At 4K there's pretty much no difference unless you like staring at static images at 300%+ magnification; not sure how you play games, but that's not how normal people do it, and HU mention as much in the video too. At 8K (for the two people who have such TVs) they would likely look even closer. At 1440p, even on the top quality settings, neither looks ideal, though both are still usable. And at 1080p both are a mistake and it's much better to just use native.
 
I think people should "fully" watch the video before commenting on FSR being better in motion or making incorrect statements again... That tweet doesn't cover everything, and notice that only "DLSS performance" is mentioned, i.e. this doesn't apply to the better DLSS settings such as quality and balanced...

TLDW, for 4K, with DLSS quality/balanced compared to FSR ultra quality/quality:

- Avengers: DLSS motion looks better, or rather has less obvious motion issues, because it can bypass/disable the game's TAA and thus get rid of the nasty ghosting/blur; in return, fine lines become doubled/shimmery. FSR still needs the game's TAA (see the rough pipeline sketch at the end of this post), so the nasty ghosting/blur is still there, possibly enhanced a bit, along with more shimmering because of how FSR works. Therefore DLSS offers better overall motion clarity in this game, or as Tim put it... "DLSS is superior"

Stated here - https://youtu.be/zDJxBykV1C0?t=752

- Necromunda: due to the fast pace of the game, Tim couldn't see a difference in motion between FSR ultra quality and DLSS quality

Stated here - https://youtu.be/zDJxBykV1C0?t=1060

The clear takeaway on Avengers and the TAA used in it (which only looks worse with FSR, but has nothing to do with FSR as such), as Tim himself said, is that the TAA in that game is just horrible and the devs should fix it. Part of FidelityFX is a much better TAA implementation, and the devs have already added FSR, so they have zero excuse not to use that TAA as well; it would improve image quality considerably on ALL GPUs, not just AMD, and of course it would make FSR much better in motion too. This is a clear case of the problem not being FSR, but devs shipping bad AA solutions that spoil image quality even at native.

Necromunda doesn't have such issues with TAA. However, it reveals a problem with DLSS and particle effects: it doesn't always handle them correctly (and not just in this game). Oddly, I've seen people praising DLSS for... messing up particle effects; that's just wrong. It's not better than native, it's a clear bug to be fixed.
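
On the point about FSR inheriting the game's TAA: roughly speaking, the two upscalers sit at different places in the frame. This is only a pseudo-stage listing I put together from the vendors' public integration guidance, not any engine's real code, but it shows why a bad TAA drags FSR down while DLSS sidesteps it:

```python
# Rough, runnable stage listing - not a real engine API, just to show where
# each upscaler sits in the frame.
def fsr_frame() -> list[str]:
    return [
        "render scene at internal resolution",
        "game's own TAA resolve",                   # any ghosting/blur happens here...
        "FSR 1.0 spatial upscale (EASU)",           # ...and is upscaled as-is
        "FSR sharpening pass (RCAS)",
        "film grain / UI composited at output resolution",
    ]

def dlss_frame() -> list[str]:
    return [
        "render scene at internal resolution with sub-pixel jitter",
        "DLSS temporal upscale (colour + motion vectors + depth)",  # replaces the TAA step
        "film grain / UI composited at output resolution",
    ]

for name, stages in (("FSR 1.0", fsr_frame()), ("DLSS 2.x", dlss_frame())):
    print(name)
    for step in stages:
        print("  -", step)
```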
 
lol. TAA is not a "fixable" thing. Its intensity varies from game to game

Avengers and RDR2 are complex games; they need heavier blur to mask their artifacts. There's no fix there

Necromunda just uses TAA for the giggles. Turn off TAA in that game and 99% of it will render fine. It's just a bad-looking game that happens to use a soft application of TAA. Square Enix and Rockstar developers would destroy, decimate and crush that small dev team. There are no fixes, errors or mistakes here. Avengers and RDR2 need heavier implementations of TAA, and AMD's special "TAA" is a really soft TAA that probably can't fix the rendering issues in either Avengers or RDR2

Funny to see Hardware Unboxed and all the other AMD warriors suddenly taking their pitchforks to TAA and its implementations, lol

It's a reality, you have to live with it. Most capable developers chose it, and its intensity is mostly determined by the game's graphical complexity

So Necromunda is not a good showcase for "good" TAA. There's no good TAA. It can only be good if you use as little of it as possible, and you can only apply it weakly if your game doesn't need it. That's the case for Necromunda: it's a horrible-looking game that mostly looks like a PS3/Xbox 360 era title. You can easily turn off TAA and the game will look 99% fine. Turn off TAA in RDR2 or Avengers and all hell breaks loose, because both games' engines, like many other recent "actual" AAA games, depend on HEAVY usage of TAA to function.
 
Damn, you have no clue about algorithms, do you? TAA is a generic name, and it can be implemented in a bunch of different ways; some games implement it well, some horribly, hence the difference in image quality. Earlier in this thread there was a quote from a dev of a game (Edge of Eternity) who said they had to replace Unity's garbage TAA with AMD's implementation, as it produced superior results and in doing so also made FSR look much better in the final image, though native gained from it too.

The main issue isn't the anti-aliasing quality the various implementations provide (though that varies too) but the amount of ghosting and blurring they introduce; you can have TAA with almost no ghosting, or the horrible version Avengers uses, or the full-screen blur Final Fantasy uses. Nothing to do with "complexity", simply a bad implementation.
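
To make that concrete, here's a toy single-pixel TAA resolve (my own sketch, not any particular engine's code). The two knobs in it, the history blend weight and whether the history sample gets clamped to the current neighbourhood, are exactly the kind of per-implementation choices that decide how much ghosting and blur you end up with:

```python
# Minimal greyscale TAA resolve for one pixel (no reprojection, history is
# passed in as a single sample). "TAA" is really a family of algorithms;
# these two parameters are where implementations differ the most.
def taa_resolve(current, history_sample, x, y, blend=0.1, clamp_history=True):
    h, w = len(current), len(current[0])
    if clamp_history:
        # Clamp history to the min/max of the current 3x3 neighbourhood so a
        # stale sample (a ghost of something that moved away) can't survive.
        neighbours = [
            current[yy][xx]
            for yy in range(max(y - 1, 0), min(y + 2, h))
            for xx in range(max(x - 1, 0), min(x + 2, w))
        ]
        history_sample = min(max(history_sample, min(neighbours)), max(neighbours))
    # Exponential accumulation: a small blend weight means a smoother image
    # but more smearing; a large one means crisper motion but more flicker.
    return blend * current[y][x] + (1.0 - blend) * history_sample

frame = [[0.0, 0.0, 1.0],
         [0.0, 0.0, 1.0],
         [0.0, 0.0, 1.0]]
stale = 1.0  # bright history left over from an object that has moved away
print(taa_resolve(frame, stale, x=0, y=1, clamp_history=True))   # 0.0 - ghost rejected
print(taa_resolve(frame, stale, x=0, y=1, clamp_history=False))  # 0.9 - ghost lingers
```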
 
Keep believing the lie you're telling yourself, then, that one of the best developers out there, Rockstar, somehow ended up with the "worst" TAA implementation ever
 
If that's your argument, then yeah, I'll stick to these "lies" described by actual game devs rather than the "truth" of a person who clearly doesn't have the slightest clue about it.
 
Yeah, you clearly have plenty of clue yourself, since you have the audacity to compare Necromunda, a game with PS3-era graphics, to RDR2, one of the games that pushed this generation's graphics to its limits thanks to heavy use of TAA for reconstructing lots of stuff

another one for the list

Don't bother replying; keep enjoying your PS2/PS3 mix of graphics in Necromunda. Even Black, a PS2 game, is comparable to that weak game that only 10 players seem to play
 
I really can’t fathom why this thread makes people so emotional on both sides.

I assume some people spent so much cash on their new GPUs at scalper prices that they need to justify the purchase any way they can, regardless of how emotional it gets; and it's not just this thread, it's all over the Internet, where other websites, forums and comment sections look even worse and far more emotional.
 