
AMD FidelityFX And Radeon Image Sharpening Tested vs DLSS

FidelityFX vs DLSS. Oh dear.....



UdgSPCQ.png

Oh dear lol

It begs the question: did Nvidia really go out of their way just to bring something that looks cool?

AI this, AI that, to sell GPUs? It sounds cool, doesn't it? In reality it's just extra work for developers and extra cost, for worse image quality.
 

HU responded saying they were surprised by how bad DLSS looked in the video, so they recorded the footage twice, yet it still looked just as bad in all the games in the video.
I mean, look at the tiger details, including the camo net. I would take 1800p + sharpening over native 4K any day. And that also works in all games, at all resolutions.
So we can activate it and play at native 1440p on a 2560x1440 monitor, and it will work on everything but DX11 games at the moment.
DLSS only works at 4K, in games coded for it.
 
I'm at 3440x1440. I could turn the sharpness right down/off, take a shot, then whack it right up and take another shot. The one with the sharpness turned down/off will be worse than the one with it turned up; it'll be blurry as ****, but I'll still be at 3440x1440! :p
 

There are a few methods you can use with RIS:
1. Native res (1440p in my case) + sharpening will give a better image and help remove TAA blur.
2. Using a non-native res on a 4K display (around 1800p seems to be the sweet spot), together with the GPU upscaling found inside Radeon Settings (it must be enabled), + sharpening will give you close to a 4K image with a much better frame rate.
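Option 2 above can be mimicked offline as a two-stage pipeline: upscale the lower-res frame to the display grid, then run a sharpening pass. A crude NumPy sketch of my own (nearest-neighbour upscale plus a Laplacian sharpen; the real driver path runs per frame on the GPU with far better filtering, and `amount` just echoes the 78% figure from the video):

```python
import numpy as np

def upscale_then_sharpen(frame, scale=2, amount=0.78):
    """Nearest-neighbour upscale followed by a simple sharpen pass.

    Crude offline stand-in for 'Radeon GPU upscale + RIS'; `frame` is
    a 2D float array in [0, 1], `amount` loosely mirrors the 78%
    sharpening setting mentioned in the video.
    """
    # Stage 1: scale each pixel into a scale x scale block.
    up = np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1)
    # Stage 2: add a fraction of the Laplacian back to boost edges.
    h, w = up.shape
    p = np.pad(up, 1, mode="edge")
    lap = (4 * up
           - p[0:h, 1:w + 1] - p[2:h + 2, 1:w + 1]
           - p[1:h + 1, 0:w] - p[1:h + 1, 2:w + 2])
    return np.clip(up + amount * 0.25 * lap, 0.0, 1.0)
```

Flat regions pass through unchanged (their Laplacian is zero), so only edge detail gets boosted after the upscale.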
 
Why are these two being pitted against each other, when one's sharpening and one's upscaling? :confused:

If you watch the whole video you will understand why.
To sum it up, there are three images:
Left - native 4K.
Middle - 5700 XT at 3200x1800 + sharpening, upscaled to a 4K monitor. (The video also includes 4K sharpened footage.)
Right - 4K DLSS on an Nvidia GPU.
HU even thought there was an issue causing the bad DLSS quality and ran the video benchmarks twice, but across all takes the quality remained that bad in every DLSS game.

 


Wow, I didn't know DLSS artifacted like that. (The mountain reflection in the water.)

What I don't get: Navi isn't advertised as having "tensor cores" and AI malarkey, yet it does a better job of reading the frame, identifying the areas where FidelityFX needs to be applied, and doing the job without a performance impact.

It doesn't apply colour and gamma filters to the whole frame like some dumb filter (such as those found in the Nvidia Experience settings) or the way ReShade post-processing injectors work.
It decides on the fly what and where to apply the "sharpening", without requiring per-game programming for its basic functionality.
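That "choice on the fly" is essentially contrast-adaptive sharpening: the sharpening strength is scaled per pixel by local contrast, so already high-contrast edges aren't over-driven. A minimal NumPy sketch of the general idea (my own toy version, not AMD's actual CAS shader):

```python
import numpy as np

def adaptive_sharpen(img, strength=0.5):
    """Toy contrast-adaptive sharpen on a 2D float image in [0, 1].

    Per pixel, the sharpening amount is reduced where the local 3x3
    contrast is already high, so edges are not over-driven. Only a
    rough illustration of the concept, not AMD's shader.
    """
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # 3x3 neighbourhood min/max via nine shifted views
    stack = np.stack([padded[y:y + h, x:x + w]
                      for y in range(3) for x in range(3)])
    lo, hi = stack.min(axis=0), stack.max(axis=0)
    contrast = hi - lo                    # local contrast in [0, 1]
    weight = strength * (1.0 - contrast)  # sharpen less where contrast is high
    # cross-shaped Laplacian as the sharpening term
    lap = (4 * img
           - padded[0:h, 1:w + 1] - padded[2:h + 2, 1:w + 1]
           - padded[1:h + 1, 0:w] - padded[1:h + 1, 2:w + 2])
    return np.clip(img + weight * lap, 0.0, 1.0)
```

The key difference from a dumb global filter is that `weight` varies per pixel, which is what avoids the ringing halos a fixed-strength sharpen produces on hard edges.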
 



All three images have the same artifact; 4K DLSS seems to have made it more pronounced, but it has also made the image textures the sharpest, as can clearly be seen here.


But then again, the 78% sharpening above looks better to me than native 4K or DLSS; indeed, native 4K looks blurry compared to it. I like it a lot. If I could downsample 4K to 1080p with unlimited frame rates and it looked as good as Nvidia's? Well, I would choose AMD, because I could use 78% sharpening.


I thought DLSS could do a lot more, though. Have you ever seen the deep-learning texture restoration for the older Final Fantasy? That was astonishing; I thought DLSS could do that, nearly doubling texture quality through deep learning.
 

On the DLSS image it feels like someone took sandpaper and smoothed the metallic bars. And that image is from Metro Exodus, the best DLSS implementation so far!

The RIS BFV image is oversharpened and looks horrible. Can it be toned down?

It's not an image filter like the ones found in ReShade and Nvidia Experience, where you can configure it. It's either on or off.
 

Marketing sells GPUs, and Nvidia is the best at it. They led people into thinking DLSS was the second coming.
It's all a gimmick: say "AI", slap "RTX" on it, and you have clever marketing.

I'd bet that when AMD release ray tracing they will do it in a way that makes sense.
From what I'm reading, the GPU cores will be able to do both ray tracing and rasterization.
Unlike Nvidia, where the cores are left sitting there doing nothing when not in use.
 

Yes, it seems to apply a smoothing effect while also making everything higher quality. In the middle image the bars look a bit grainy, like rust, but that area is clearly rendered more blurrily. I suspect the deep learning hasn't been tweaked enough, because it should realize that grain is part of the texture; to deep-learn and add more detail you need to learn that, but it seems to have missed that part. I still think both have merits. I want DLSS and sharpening. Chuck in Anti-Lag too, please.
 

False.

Developers didn't do extra work on DLSS; they did essentially nothing with it. The DLSS model was all trained on Nvidia's supercomputer to improve image quality, which saved developers lots of time. In reality, RIS is the one that's extra work for developers, who tweak it themselves to improve image quality.

It will be interesting to see whether DLSS gets better than before with the Monster Hunter World patch due next week, which will support DLSS.
 

There is a video on why Navi/RDNA is an interesting architecture, and why AMD decided to use GCN as a piggyback on top of the 5700s; otherwise it would have taken another year to write proper drivers for it.
Get ready, relax, and watch 30 minutes of technical analysis of how frames are made and how the different architectures process them :)

Yeah, an intelligent video :)
 

Erm? What?
RIS is a driver toggle and just works. FidelityFX is a game-engine feature that works on both AMD and Nvidia and adds extra sharpening.

DLSS isn't some magic toggle; the devs need to work alongside Nvidia to get it working. Why the hell do you think it took so long to arrive in Battlefield V?

You then need the game to train the AI, pass it on to Nvidia, and then patch it in. Lol, extra cost long after a game has released.

Why do you think Battlefield V looks even worse than Metro? Simple, really: either DICE don't want to keep paying Nvidia extra to train the AI, or DICE don't want to add more cost to its development.
 