
FidelityFX Super Resolution in 2021

Resident Evil Village at 4K already runs great on a 6800 XT, but I will definitely give this a go later and see how far it can be pushed.

The stuttering fix is a nice addition, but I have already completed the game; thankfully, during my playthrough I didn't experience it enough to make me want to stop playing.
 
Suddenly that bridge doesn't look so great anymore if you take a screenshot at the right moment. :D
I wouldn't have done this if there weren't all those pictures circulating showing how great DLSS is compared with FSR. They did the same thing: took a screenshot at the worst FSR moment/best DLSS moment.


[Attached screenshot: Clipboard01.png]
 
Still shots, while useful, don't tell even 10% of the story. For example, I can enable FXAA in Chernobylite and it looks better than DLSS, but only in built-up areas or indoors; go into a forest and the shimmer is hard to bear.

Though it must be said, as a reminder, that most credible reviews were saying that in motion it was hard to tell DLSS and FSR apart, best case vs best case.
 
As for the motion complaints: they're not always relevant, because you don't see the issues. Image deficiencies are only noticeable when you capture a still image; you don't see any of that with your eye while you play the game. Still images, though, you do notice whenever the camera stops moving.

And HUB mentioned this: in practical testing you can't tell the difference between DLSS on or off in motion; the only way to see anything is to take a screenshot. That's even more true in games that use motion blur, which is most of them.
 

Disagree on that.

In Days Gone, RDR 2, Avengers and many other games where TAA is used, the motion issues with ghosting/blur are very noticeable in gameplay, and sadly you can't turn TAA off: either it's not an option, or if you do turn it off you get terrible aliasing, shimmering, jaggies, etc.

Maybe for really fast-paced games like Necromunda or Doom Eternal you would be hard-pressed to notice any motion issues, but for slower-paced games it is very noticeable.
 
Isn't that the point of FSR and DLSS: to give you a believable, native-looking image while actually playing the game?

If you need to stop and pixel-peep, then you are ignoring what this technology is offering.

The reason FSR 1.0 looks so good at this early stage is that you cannot see its downsides while playing, unlike DLSS 1.0 with its blur known as the "vaseline effect": you could see that just from watching some gameplay, and that is the technology failing to deliver.

If you cannot see this while gaming, then both of these techs are doing what they set out to do.

Simple as that really

Both have their ups and downs, and both are doing a good enough job.
 

This is it in a nutshell. When DLSS 1.0 was released alongside RT for the 2000-series GPUs it was an absolute joke. It achieved nothing of what you outlined above, and Nvidia had to go away and make massive improvements. It has taken them years to get DLSS to a much more acceptable level where it is genuinely viable. FSR has only been out for a month and it already achieves a level that is close to DLSS in many use cases.
 
I still don't get this urge to compare DLSS to FSR and pick a winner. If your game and your card support DLSS and you need a bump to the frame rate, use it. If you can't, but can use FSR, use that. There are no winners or losers.
I think FSR seems pretty good at doing what it tries to do, which is upscale your game while mitigating the downsides of doing so. We've always been able to just play the game at 720p or some such, but that kind of sucks. This is much better, so it's a good thing.
 

To be honest, I don't know why people are comparing the two. DLSS can only be used by people with certain hardware.

FSR can be used by a massive audience, including everyone who can use DLSS.

It all comes down to an elitist outlook on this forum.

Both are a good thing.
 

Agreed that both are a good thing (and that both have downsides), but then again, FSR doesn't help much at lower resolutions (like 1080p) and lower quality settings. Neither does DLSS at 1080p, really. That means, as even Tim from HUB said, both techs are mostly designed for 4K (1440p as the absolute minimum) and mid-to-high-end GPUs to achieve proper results. And if you have a new, expensive GPU... it's rare that you would actually need either at anything below 4K, unless the game is overloaded with RT - but that again means a new, expensive GPU.

In other words, on PC both of these techs work best for people who do not actually need them as much, and they either do not work at all (DLSS) or work badly (FSR) for the people who would like to use them the most (older/weaker GPUs at 1080p). On consoles it's another story: a mid-range GPU usually hooked up to a 4K TV, which means good upscaling is very much desired.
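
For context, here is a minimal sketch of the internal render resolutions involved, assuming the commonly published per-axis scale factors for FSR 1.0 and DLSS 2.x (individual games can deviate from these):

```python
# Minimal sketch: internal render resolution per upscaling mode, using the
# commonly published per-axis scale factors for FSR 1.0 and DLSS 2.x.
# Exact factors can vary per game, so treat these as assumptions.
FSR_SCALE = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
DLSS_SCALE = {"Quality": 1.5, "Balanced": 1.72, "Performance": 2.0, "Ultra Performance": 3.0}

def internal_res(out_w, out_h, scale):
    """Resolution the game actually renders at before upscaling to out_w x out_h."""
    return round(out_w / scale), round(out_h / scale)

for out_w, out_h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    for tech, table in (("FSR", FSR_SCALE), ("DLSS", DLSS_SCALE)):
        for mode, scale in table.items():
            w, h = internal_res(out_w, out_h, scale)
            print(f"{tech:4s} {mode:17s} @ {out_w}x{out_h}: renders {w}x{h}")
```

For example, "Quality" mode at 1080p output works from roughly 1280x720, while the same mode at 4K starts from 2560x1440, which is why both techs have so much more to work with at higher output resolutions.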
 
The use case for DLSS or FSR at 1080p is to allow lower-end GPUs, such as a 2060 or 5700 XT, to enable some level of RT in their games. This would be equivalent to someone with a 3080/6800 XT using DLSS or FSR at 4K for the same purpose.

FSR also has the benefit of allowing even older GPUs to play graphically demanding games at higher settings. There are cases where 1080p with FSR or DLSS is more than acceptable for IQ.
 

The problem is that both have big quality issues at 1080p, and it's better to just use native resolution or lower detail settings instead of enabling upscaling. At 1440p, sure, it's good enough (not as good as at 4K, but playable). Every test I've seen at 1080p just looks bad, for both DLSS and FSR (though DLSS does handle 1080p better, for sure).
 
I have used DLSS on a 2060 Super to play Watch Dogs: Legion and Control with RT at 1080p and they looked fine.
I have used FSR at 1080p in Terminator: Resistance and DOTA 2 and they looked fine (admittedly poor samples). In these cases, using DLSS or FSR looked better than turning the rest of the graphical settings down to low, for example. Admittedly I only tested FSR in T:R as a proof of concept, but the point remains that there are games that look perfectly fine with FSR and DLSS at 1080p.

I would say that if it looks bad, that's down to the devs and not the underlying tech, especially with DLSS.
 
I'm not sure it's worth using RT for some pretty reflections at 1080p when the general image quality takes such a hit from DLSS/FSR, although I do see the benefit for people with older cards using it to hit a playable 60 fps in modern titles when they wouldn't otherwise be able to.
 

That was my thinking too, especially with the APU in my laptop. But after trying it myself in DOTA with FSR (just an experiment), I'd rather lower the details than use FSR at 1080p. Same with DLSS at 1080p in CP2077 when I tested it on a 3060 Ti. The tech just needs enough pixels in the source frame to be able to produce good results, especially in motion.
At 1440p I had no issues with either - both looked fine in the games I checked. But this is a very subjective matter, and some people consider it good enough at 1080p too, which is very much fine by me - it's tech for people, and if it works well for them, all the better. :)
 
It needs a bit of sharpening at 1080p and mostly works fine.

You have to accept oversharpening artifacts, though.

The actual problem is that an RTX 3070 can't handle RT maxed out in Cyberpunk at native 1080p. It gets around 43-53 frames, and DLSS Quality reliably carries the performance upwards of 60 frames, usually ranging from 65-75. I literally need DLSS Quality to enjoy RT maxed out at 60 fps in this game with the supposed "1440p monster" RTX 3070/2080 Ti. If it were a TPS or slow-paced game, I'd have no trouble with 30-50 frames, but since it's an FPS game, I need a minimum of 60 fps for it to feel smooth.

Other games, such as Doom Eternal and Metro Exodus EE, have been fine at native 1080p, pushing way above 90+ frames with RT maxed out. In those cases I use DSR to 4K and then back to 1080p with DLSS Performance, and that results in a much sharper, better-looking image compared to native 1080p, with a bit of DLSS overhead cost (the IQ difference is worth it). As for maxing RT out: when I enable RT, I want all the RT bells and whistles, and here's the reason. You already pay a huge price for the BVH calculations, and all other RT effects benefit from that baseline BVH cost. So when I want to enable RT, I want to have it all: shadows, lighting and reflections in the case of Cyberpunk. Disabling these effects one by one shows that the performance increase is not that much; you sacrifice a lot of visual quality for small performance gains. Once you use any RT effect, the biggest cost comes from the BVH work itself, so if my GPU is going to do BVH calculations, it's best that I also enable the other RT effects to make it worth it. In this case, enabling a single RT effect in Cyberpunk was not worth it for me.
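
To make that argument concrete, here is a toy cost model with entirely made-up numbers (not measurements from any game): the first RT effect carries the BVH cost with it, while each further effect is relatively cheap.

```python
# Toy cost model with entirely made-up millisecond numbers, only to illustrate
# the argument: the BVH build/refit cost is paid as soon as any RT effect is
# on, so each additional effect adds comparatively little on top of it.
FRAME_RASTER_MS = 12.0   # hypothetical raster-only frame time
BVH_MS = 4.0             # hypothetical per-frame BVH build/refit cost
EFFECT_MS = {"reflections": 3.0, "shadows": 1.5, "lighting": 2.5}  # hypothetical

def frame_time_ms(enabled):
    """Frame time for a given set of enabled RT effects under this toy model."""
    rt_cost = (BVH_MS if enabled else 0.0) + sum(EFFECT_MS[e] for e in enabled)
    return FRAME_RASTER_MS + rt_cost

for combo in ([], ["reflections"], ["reflections", "shadows"], list(EFFECT_MS)):
    ms = frame_time_ms(combo)
    label = " + ".join(combo) if combo else "RT off"
    print(f"{label:40s} {ms:5.1f} ms  (~{1000 / ms:.0f} fps)")
```

With numbers like these, turning on the first effect costs 7 ms, but each extra one only adds another 1.5-3 ms, which is the shape of the trade-off being described.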

Another note to make here is that DSR 4K + DLSS Ultra Performance (internal 720p) looks way better than DLSS Quality at native 1080p. I don't know the exact reason why, but it's also a valid strategy. The DLSS overhead cost, however, again stops the RTX 3070 from getting 60 fps.
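
One plausible reading of why that combination can look better, assuming the same scale factors as in the earlier sketch: both paths start from roughly the same 1280x720 input, but the DSR route reconstructs to a 4K target and then downsamples that result to the 1080p display, at the cost of the extra overhead mentioned above.

```python
# Comparing the two paths (same assumed scale factors as before; hypothetical only).
def internal(out_w, out_h, scale):
    return round(out_w / scale), round(out_h / scale)

# Path A: DLSS Quality with a native 1080p output target.
a_in = internal(1920, 1080, 1.5)   # -> (1280, 720), reconstructed to 1920x1080

# Path B: DSR presents a 3840x2160 target, DLSS Ultra Performance renders the
# input at 1/3 per axis, and DSR then downsamples the 4K result to 1080p.
b_in = internal(3840, 2160, 3.0)   # -> (1280, 720), reconstructed to 3840x2160

print("A: DLSS Quality @ 1080p           ", a_in, "-> reconstruct to 1920x1080")
print("B: DSR 4K + DLSS Ultra Performance", b_in, "-> reconstruct to 3840x2160 -> downsample to 1080p")
```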

So it's a mixed bag. As I said, lots of people criticised AMD GPUs for not being able to handle RT in Cyberpunk, but if you ask me, most of the RTX GPUs can't either. Only the RTX 3080 can reliably put out 60 or so frames with a 1080p input upscaled to 4K; at 1440p input (DLSS Quality at 4K), even the 3090 hovers around 45-50 fps, which is funny.

So yeah, DLSS at 1080p/1440p is not something to scoff at. If it looks bearable, which it kind of does in Cyberpunk, it's worth it to max out RT in such cases. But I would prefer games to perform like Metro Exodus EE and Doom Eternal, so that most GPUs can push nice fps and RT at native 1080p and only need DLSS at 1440p and upwards.

But Cyberpunk literally needs DLSS even at 1080p with the 3070, so it's not a joke or a meme when DLSS works better at 1080p. This means that a person like me who chose a 3070 over a 6700 XT for 1080p will enjoy better upscaling quality for years to come if we want to enable RT, but then again, there's also the VRAM and longevity discussion to be had. I think even FSR Quality wouldn't do the 6700 XT many favours in the case of Cyberpunk, but I can reliably get 60+ frames with my 3070 with much better upscaling (again, this is not a good thing, and I'm aware it's a game that is really tough to run; just sharing my experience here).

I have no intention of upgrading for the foreseeable future, so DLSS being "viable" at 1080p lets me enjoy RT for many years to come instead of shelling out huge money to Nvidia every two years or so.
 