Really getting fed up with the posts stating RTX/DLSS does not work this gen.

You disagree? What are you disagreeing with? These images are from review sites, and they do state which AA versions are being used. Most of the video reviews mention it, and most of them say that DLSS 2.0 and FidelityFX are better than native with or without TAA/FXAA.

There are better AA methods, but they are GPU intensive. Maybe in time DLSS and FidelityFX will surpass those methods, but at the moment they address the soft/blurry look that FXAA and TAA introduce while also improving performance.
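To put rough, assumed numbers on that cost difference (a back-of-envelope sketch, not measurements from any particular GPU): supersampling multiplies the shading work, while post-process AA like FXAA/TAA is roughly one extra full-screen pass, which is why it is cheap but can look soft.

```python
# Back-of-envelope sketch (assumed, simplified costs) of why "better" AA is
# GPU intensive while FXAA/TAA stay cheap but can look soft.
width, height = 2560, 1440
base_pixels = width * height

print(f"Native shading work:  {base_pixels:,} pixels")
print(f"4x SSAA shading work: {base_pixels * 4:,} pixels (every sample fully shaded)")
print(f"FXAA/TAA extra work:  ~1 post-process pass over {base_pixels:,} pixels")
```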

I was joking when I said that you think Nvidia controls everything. I only made the joke because you said it was marketing. How can it be marketing when the images are from review sites? Nvidia's own slides and info about DLSS 2.0 and DLSS 1.0 are compared against TAA/FXAA and labelled as such, although DLSS 2.0 images are mostly being compared against DLSS 1.0 images.

You continue to say it's dishonest, yet you had no problem whatsoever when DLSS 1.0 was being compared to Native, no matter how it was labelled.
Lol Melmec, not sure if you are playing silly or what. Go read my original post and you will see what I have a problem with. I do not have a problem with correctly labelled images; I bloody complimented the review for using the correct labels, for god's sake :p:D

Let me know if you still need clarification on what I have a problem with ;)
 
There was lots of bun fighting in this thread but I can say now, in 2020, that Death Stranding finally delivers on the promises made by AMD/NVidia regarding these technologies.

FidelityFX on a 5700XT: the difference with it on/off is pretty incredible on my rig/screen/frame-rate counter.

It's the first game I have played where it really makes a big difference.

Well done to AMD and Kojima Productions. Looking at the shots the Nvidia version does the same too.

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,4.html

Is it better than HDR as the OP says? Yes, having experienced them both now, I think DLSS/FidelityFX is a better effect than HDR.
 
Let me know if you still need clarification on what I have a problem with ;)

I still need clarification I guess!! You were complaining that it's dishonest and a marketing trick?

Not sure how it can be dishonest when DLSS 2.0 or Fidelity FX is better than Native with or without TAA/FXAA applied.
 
There was lots of bun fighting in this thread but I can say now, in 2020, that Death Stranding finally delivers on the promises made by AMD/NVidia regarding these technologies.

FidelityFX on a 5700XT: the difference with it on/off is pretty incredible on my rig/screen/frame-rate counter.

It's the first game I have played where it really makes a big difference.

Well done to AMD and Kojima Productions. Looking at the shots the Nvidia version does the same too.

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,4.html

Is it better than HDR as the OP says? Yes, having experienced them both now, I think DLSS/FidelityFX is a better effect than HDR.

So a sharpening filter with a small upscaling benefit (FFX), heck even DLSS which does way more, is better than HDR? Guess you haven't seen HDR on a proper TV and/or in games that use it properly then.

HDR vs SDR is the difference between certain things looking real and some not, color and contrast-wise. Gran Turismo, Death Stranding, Horizon Zero Dawn, God of War, Doom Eternal and many more games have amazing HDR implementations. Every SDR game I play on my OLED TV, or played on my previous XF90, looks flat compared to HDR. There's nothing more disappointing than not having HDR in a new game. Most PS4 exclusives get it nowadays (if not all), but there are some PC games which still don't (mostly multiplatform games, although it's better nowadays compared to a couple of years ago, when you could count PC HDR games on one hand, two max).

Every time I play something on my Acer XB271HU I hate it; I wish we had OLED monitors by now. There's something wrong when you have to pay €2,000+ for an HDR monitor that's worse than any of the high-end HDR TVs (IPS HDR monitors are the "best" at the moment, and contrast on an IPS is too weak to really give the impact HDR deserves, plus blacks are more gray than black). You can get an OLED for half that. Sure, you'll only have 4K 120 Hz instead of the higher refresh rates on monitors, but the image quality (not clarity) is way better.

Hell, resolution isn't even that relevant when a game is trying to convince you it looks real, or is going for that Pixar CGI look, with a vast range of color shades and contrast. You can have 1080p and great HDR, which will beat 4K 60 SDR any time.
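To put rough numbers on that (assumed, typical figures - they vary a lot by display), the contrast range is why HDR is visible no matter the resolution:

```python
# Rough, assumed figures (they vary by display): the contrast range is what
# makes HDR stand out regardless of resolution.
import math

displays = {
    "Typical SDR (~100 nits peak, LCD black)":     (100, 0.1),
    "HDR OLED (~1000 nits peak, near-zero black)": (1000, 0.0005),
}

for name, (peak, black) in displays.items():
    ratio = peak / black
    print(f"{name}: ~{ratio:,.0f}:1 contrast ({math.log2(ratio):.1f} stops)")
```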

Then again, if you're gaming on a low-end HDR TV or an "HDR" monitor, I can easily see where the issue lies. Those aren't real HDR-capable displays.
 
I don't understand the point of making a statement about FFX/DLSS and HDR, then when confronted with facts, the only answer is that.

You do know that FidelityFX is just a sharpening filter with slightly better upscaling, yes?
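For anyone wondering what "just a sharpening filter" looks like in code, here is a minimal sketch of the contrast-adaptive idea in Python/NumPy - an illustration only, not AMD's actual FidelityFX CAS shader.

```python
# Minimal sketch of a contrast-adaptive sharpen (illustrative only, not AMD's
# actual FidelityFX CAS shader). img is a 2D float array in [0, 1].
import numpy as np

def adaptive_sharpen(img, strength=0.2):
    p = np.pad(img, 1, mode="edge")
    up, down = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]

    # Local contrast from the cross-shaped neighbourhood.
    lo = np.minimum.reduce([up, down, left, right, img])
    hi = np.maximum.reduce([up, down, left, right, img])
    contrast = hi - lo

    # Sharpen less where contrast is already high, to avoid ringing on edges.
    amount = strength * (1.0 - contrast)
    cross_avg = (up + down + left + right) / 4.0
    return np.clip(img + amount * (img - cross_avg), 0.0, 1.0)
```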

What games have you played with HDR? What display did you play them on? What system (Win10/PS4/XbX)? Something has to have been wrong if you really think HDR is worse than those. I haven't met a single person who wasn't blown away by HDR on a proper display, whether it was movies or games. Going from 1080p to 4K, many were disappointed, but HDR is immediately visible no matter the resolution.
 
I still need clarification I guess!! You were complaining that it's dishonest and a marketing trick?

Not sure how it can be dishonest when DLSS 2.0 or Fidelity FX is better than Native with or without TAA/FXAA applied.
Ok, let me help you out. Here is the post that got you guys going:


Briefly looked at TPU's Death Stranding review. These guys did a good job and labelled things properly. They did not just say native, but labelled it TAA. It comes across as a lot more honest. If the word native is to be used for the resolution, one should add + TAA or + FXAA etc. That clarifies things and is a lot more honest and informative. Calling it just native is dishonest and a marketing trickery tactic.

Before I am labelled a hater or something else, I would like to clarify that I do like DLSS 2.0 and am currently swinging towards picking up an RTX 3070 on release (I will likely use it when playing Cyberpunk 2077). Sure, that may change, but it is unlikely.

So as you can see, I am praising TechPowerUp for doing it the right way by labelling the images correctly. They clarified what AA method was being used rather than hiding behind "native". I have zero issues with comparisons that explain what AA is being used at the native resolution. Saying native alone is not enough! You can say native and use FXAA, like many do. How is that a bloody honest example?? FXAA sucks!

Anyway, I know you know what I am saying. You are just choosing to be stubborn here ;)
 
You disagree? What are you disagreeing with? These images are from review sites, and they do state which AA versions are being used. Most of the video reviews mention it, and most of them say that DLSS 2.0 and FidelityFX are better than native with or without TAA/FXAA.

There are better AA methods, but they are GPU intensive. Maybe in time DLSS and FidelityFX will surpass those methods, but at the moment they address the soft/blurry look that FXAA and TAA introduce while also improving performance.

I was joking when I said that you think Nvidia controls everything. I only made the joke because you said it was marketing. How can it be marketing when the images are from review sites? Nvidia's own slides and info about DLSS 2.0 and DLSS 1.0 are compared against TAA/FXAA and labelled as such, although DLSS 2.0 images are mostly being compared against DLSS 1.0 images.

You continue to say it's dishonest, yet you had no problem whatsoever when DLSS 1.0 was being compared to Native, no matter how it was labelled.

AMD fanboy on DLSS 1.0 review comparisons: OMG haha look, it's a vaseline filter, native/TAA is so much better, FidelityFX ftw

AMD fanboy on DLSS 2.0: it's not fair, because reasons! These review websites must be Nvidia shills, Nvidia controls the media, DLSS 2.0 can't look that good, I won't believe it.
 
AMD fanboy on DLSS 1.0 review comparisons: OMG haha look, it's a vaseline filter, native/TAA is so much better, FidelityFX ftw

AMD fanboy on DLSS 2.0: it's not fair, because reasons! These review websites must be Nvidia shills, Nvidia controls the media, DLSS 2.0 can't look that good, I won't believe it.

Ah, the old nVidia paranoia - so nice to see after all these years. Anyone commenting anything that's not fully praising nVidia is an AMD fanboy ;)
 
Ah, the old nVidia paranoia - so nice to see after all these years. Anyone commenting anything that's not fully praising nVidia is an AMD fanboy ;)
Indeed. Cracks me up.

I even said multiple times that I like DLSS 2.0, but that is not good enough; unless I drool over it and promote it at every opportunity, I am classed as an AMD fanboy apparently :p

I will stick with what I am saying thanks. TechPowerUp did a good job and hopefully others will follow suit and explain what AA they are using to arrive at the image quality they refer to as native. There is no need to hide it.
 
I even said multiple times that I like DLSS 2.0, but that is not good enough; unless I drool over it and promote it at every opportunity, I am classed as an AMD fanboy apparently :p

I also noticed that even though you were objective, some still twisted the usual narrative. What amazed me was that it was actually badly implemented at first and took some time to get better - a thing that, if AMD did it, would get brushed off as useless tech, with a 'who's listening to those plebs' kind of attitude.
 
I also noticed that even though you were objective, some still twisted the usual narrative. What amazed me was that it was actually badly implemented at first and took some time to get better - a thing that, if AMD did it, would get brushed off as useless tech, with a 'who's listening to those plebs' kind of attitude.
Yea. I gave DLSS a hard time and rightly so, because it was poor on release. They improved it a lot with DLSS 2.0 and I have not said anything negative about it, but have said I like it. But that is not enough, one must clearly promote it at every opportunity apparently! :p

As you say I have been objective, yet due to this not fitting the narrative some have really gone out of their way to twist things. It is so obvious to see that it is laughable.
 
This is why, if you come across as 'defending' AMD or basically not sucking off Nvidia, you're a fanboy to a minority of members, but it's quite obvious who they are once you notice the same behavior over various threads. Guess you train yourself to ignore them or just develop a thicker skin. ;)
 
I'm in the same place. DLSS 1.0 was definitely not worthy of promotion, DLSS 2.0 is seemingly a good implementation of... what boils down to upscaling... and upscaling was bad... until nVidia did it, then it's good? It's a curious mindset we have :D

That said, I'm probably gonna have to pick up a 3080Ti when it comes because I'm not convinced AMD can match or be within 10% with Navi.. so DLSS being improved is a good thing for me :)
 
I have to agree: if the 3080 is a far superior card I would love to own one. Realistically though, my budget won't go anywhere near a grand, so it's likely the territory of a 3070 or whatever AMD can produce.

To be fair to the DLSS hardcore, they were championing it from 1.0 - and this is where they get the flak; preaching it is garlic bread well over a year later isn't quite the same, so that's the only difference between you and me there @Howling :)
 
Ok, let me help you out. Here is the post that got you guys going:




So as you can see, I am praising TechPowerUp for doing it the right way by labelling the images correctly. They clarified what AA method was being used rather than hiding behind "native". I have zero issues with comparisons that explain what AA is being used at the native resolution. Saying native alone is not enough! You can say native and use FXAA, like many do. How is that a bloody honest example?? FXAA sucks!

Anyway, I know you know what I am saying. You are just choosing to be stubborn here ;)

That's not the post that got me going. But there is a line in your post that I still don't understand and you haven't explained at all.

You say it's marketing and trickery and dishonest? Marketing and trickery by who? Dishonest by who? Show me. Review sites either state in the review what settings they are using or put it on the screenshots. Nvidia's blog on the subject shows that they are comparing DLSS to TAA.

When people say Native they mean the resolution, as DLSS and FidelityFX are not rendered at the native resolution. How hard is that to grasp?
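To illustrate the point (the scale factors below are the commonly quoted ones and are assumptions here, not guaranteed for every game): the upscalers render internally below the output resolution and then upscale, which is all "native" refers to.

```python
# Illustration only (commonly quoted scale factors, assumed here): DLSS and
# FidelityFX render internally below the output resolution and then upscale.
output_w, output_h = 3840, 2160  # 4K output

modes = {
    "DLSS 2.0 Quality":       0.667,
    "DLSS 2.0 Balanced":      0.58,
    "DLSS 2.0 Performance":   0.50,
    "FidelityFX @ 75% scale":  0.75,
}

for mode, scale in modes.items():
    w, h = round(output_w * scale), round(output_h * scale)
    print(f"{mode}: renders {w}x{h}, outputs {output_w}x{output_h}")
```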

Please answer this question. Where was your outrage when they were comparing DLSS 1.0 to Native? Why didn't you suggest that, since DLSS 1.0 is blurry, they use a sharpening filter? That's what you want them to do with FXAA before comparing it to DLSS 2.0.

No, I don't know what you are saying. You are complaining about the term "compared to native" as being dishonest. But it only seems to have become dishonest in the last few months. Before that, nobody had an issue with the term.
 
You say it's marketing and trickery and dishonest? Marketing and trickery by who? Dishonest by who? Show me. Review sites either state in the review what settings they are using or put it on the screenshots. Nvidia's blog on the subject shows that they are comparing DLSS to TAA.
It has been my experience that whenever I see a comparison, the AA method used for native is not stated. Here is one example directly from Nvidia: https://developer.nvidia.com/dlss

Here is a video from them as well, an example of how not to do it:



Here is an example from them with it done right:



I still remember when Grim5 did a comparison where the "native image" turned out to be one he had used FXAA on. Lol.

So yeah, maybe you can see where I am coming from now? I am not saying the whole web is doing this, just the places I came across. Hence, when I saw the TPU article it felt refreshing!


When people say Native they mean the resolution, as DLSS and FidelityFX are not rendered at the native resolution. How hard is that to grasp?
I definitely answered this one before. What you seem to be struggling to grasp is that most people buying graphics cards are not enthusiasts like you and me. To us it is obvious; to them, not so much. Again, how hard is it to just put down on each comparison image or video what AA is being used? Why let people assume?


Please answer this question. Where was your outrage when they were comparing DLSS 1.0 to Native? Why didn't you suggest that, since DLSS 1.0 is blurry, they use a sharpening filter? That's what you want them to do with FXAA before comparing it to DLSS 2.0.
A better question would be: why do I have to have had an outrage for my current point to stand?

I did not need to, DLSS on release was poor and I made that amply clear!


No, I don't know what you are saying. You are complaining about the term "compared to native" as being dishonest. But it only seems to have become dishonest in the last few months. Before that, nobody had an issue with the term.
Yes, because you do not want to see. Too invested in your own point of view to be impartial (disappointed to see that from you btw).

What I am saying is super simple, yet you look for things I said to strawman :rolleyes:
 
Ready for 8K?

[Attached images: Death Stranding 8K benchmark screenshot and chart]
 