
really getting fed up with the posts stating RTX/DLSS does not work this gen

TNA

Caporegime
Joined
13 Mar 2008
Posts
27,572
Location
Greater London
Sorry, what? Anyone with an RTX card can try both themselves, so what "game" are you talking about? The denial game? Also, again, I'm not someone else's alt account.

@Poneros Yes, rain and particles in general will look better without DLSS. It's one of the disadvantages of reconstructing an image using deep learning. Does that invalidate everything else? No. Does that make FidelityFX better? No. The screenshots show everything there is to show. Again, anyone with an RTX card can try both at 1080p and see how each tech works. DLSS actually reconstructs the image; FidelityFX only sharpens and upscales. The wires, and edges generally, are a mess without DLSS.

The same thing happens at 4k; both techs work the same as they do at 1080p. Obviously there are more pixels to work with, so the wires won't look broken up like they do at 1080p with FidelityFX, but the shimmering is horrible even there.
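For what it's worth, the difference between the two approaches is easy to sketch (a toy illustration only, nothing to do with either vendor's actual code): spatial upscaling plus sharpening, which is roughly the CAS-style approach, can only boost edges that already survived in the low-res frame, which is why thin features like wires stay broken. Reconstruction techniques instead feed in extra information (motion vectors, previous frames) to recover sub-pixel detail.

```python
import numpy as np

def upscale_nearest(img, factor=2):
    # Spatial-only upscaling: each low-res pixel is simply repeated,
    # so no new detail is created -- a wire that was broken in the
    # low-res frame stays broken in the output.
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def sharpen(img, amount=0.5):
    # Unsharp mask: boost local contrast around edges. This makes
    # existing edges pop, but it cannot reconstruct sub-pixel detail
    # that was already lost at the lower resolution.
    blurred = (
        img
        + np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0)
        + np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)
    ) / 5.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

low_res = np.random.rand(4, 4)       # stand-in for a 1080p frame
out = sharpen(upscale_nearest(low_res))
print(out.shape)  # (8, 8)
```

The point of the sketch: every output pixel here is a function of the single low-res frame, whereas a temporal reconstruction pass has previous frames and motion vectors as additional inputs, which is where the recovered detail comes from.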
Lol.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
I too would like to see a comparison of the game in the rain.

There is a Twitch streamer called dansgaming (2080 Ti), where you can clearly see that with DLSS set to quality it stutters, and he says so while streaming. So he uses DLSS performance, which is blurry.

https://m.twitch.tv/videos/679442728
25 minute mark. This is why we shouldn't depend heavily on still photos taken using DLSS quality, as they don't tell the whole picture... pun intended :D
 
Last edited:
Associate
Joined
13 Jul 2020
Posts
500
The stutter happens at the beginning of the game due to shader caching. It has nothing to do with DLSS quality or performance. I'm running DLSS quality at 4k 60 fps HDR on my 2080 (not a Ti) on an OLED and get no stutter anymore. I had it at the beginning of the game for like 10 minutes.
 
Soldato
Joined
8 Jun 2018
Posts
2,827
The stutter happens at the beginning of the game due to shader caching. It has nothing to do with DLSS quality or performance. I'm running DLSS quality at 4k 60 fps HDR on my 2080 (not a Ti) on an OLED and get no stutter anymore. I had it at the beginning of the game for like 10 minutes.
Caching for 10 minutes :eek::eek:

Sorry, that's not how caching works in games. If anything, Nvidia needs to fix their drivers... if possible.
As it stands now, the game stutters using quality DLSS 2.0.
 
Associate
Joined
13 Jul 2020
Posts
500
10 minutes :eek::eek:

Sorry, that's not how caching works in games. If anything, Nvidia needs to fix their drivers... if possible.

I'm pretty sure the same **** happens on AMD cards. But feel free to apply to Nvidia to help them out fixing their drivers. Remember Borderlands 3 in DX12 taking 2 minutes to start the game because of "shaders"? Well..

It's the Decima engine that's the issue.

PS: also, in case I wasn't clear, it only happens the first time you ever start the game, not every time.
 
Last edited:
Soldato
Joined
6 Feb 2010
Posts
14,594
I wonder if Nvidia is starting to worry that people won't buy high-end cards because of DLSS 2.0.
I think rather than that, it's more of a ploy to get away with offering graphics cards built from lower-cost components/cut-down GPUs (reducing the cost) and relying on DLSS to compensate, while maintaining the prices to increase profit margins (since they clearly have the mindshare and command over the market that allow them to do so).

They have already gotten away with branding what should have been the 660/660 Ti as the 670/680 and selling it as such, and that approach carried on, with all the cards released after the 780 being named one to two models higher than what the hardware implies, so this would be the next logical step (we've gotten to the ridiculous point of Nvidia selling the 2060 for £300+, and going out of their way to create a "16" prefix so they can call the slower cards "a 60 card" for the second time in the same gen). Soon we might see Nvidia force DLSS as the default for their graphics cards, selling 50/50 Ti hardware-level cards as 60 or even 70 cards (since it will mean reviewers can no longer do apples-to-apples comparisons of Nvidia cards against AMD).

BOSE on the audio scene is often criticised and looked down upon by enthusiasts for doing this: rather than investing in better components to improve audio quality for the user like other brands do, they use lower-cost components and rely heavily on artificial enhancement, which makes an audio track sound good but at the cost of bearing no resemblance to the original recording or a reference listening experience. BOSE don't care though, since their main source of profit is not enthusiasts but mainstream casual consumers who are none the wiser or don't care enough.

Apple is similar... it's not that their products are no good, but more a case of, for the prices they charge, they are not giving customers their money's worth in terms of the hardware and components they use.
 
Associate
Joined
13 Jul 2020
Posts
500
As it stands now, the game stutters using quality DLSS 2.0.

Nice edit. No, it doesn't. It's so funny seeing people who don't even:

1) own the game
2) own the hardware

make statements about either. I guess DigitalFoundry had a special version of the game, or maybe they had a super secret Ampere card with DLSS 3.0, since there's no stutter in their video either, eh? No wait, facts are stupid. Let's make random statements and present them as facts instead! That works.

It's somewhat amusing though, thinking how you're pretty much looking for anything to try and make meanie DLSS look bad, browsing through random Twitch streamers' VODs for just that one video. I mean, really?
 
Soldato
Joined
19 Dec 2010
Posts
12,031
I too would like to see a comparison of the game in the rain.

There is a Twitch streamer called dansgaming (2080 Ti), where you can clearly see that with DLSS set to quality it stutters, and he says so while streaming. So he uses DLSS performance, which is blurry.

https://m.twitch.tv/videos/679442728
25 minute mark. This is why we shouldn't depend heavily on still photos taken using DLSS quality, as they don't tell the whole picture... pun intended :D

The stuttering is a problem with the game. It's happening to a lot of people using a variety of cards. And solutions that work for some people don't work for others. This game needs a few patches. It's just another PC port not working quite right.
 
Soldato
Joined
19 Dec 2010
Posts
12,031
I think rather than that, it's more of a ploy to get away with offering graphics cards built from lower-cost components/cut-down GPUs (reducing the cost) and relying on DLSS to compensate, while maintaining the prices to increase profit margins (since they clearly have the mindshare and command over the market that allow them to do so).

They have already gotten away with branding what should have been the 660/660 Ti as the 670/680 and selling it as such, and that approach carried on, with all the cards released after the 780 being named one to two models higher than what the hardware implies, so this would be the next logical step (we've gotten to the ridiculous point of Nvidia selling the 2060 for £300+, and going out of their way to create a "16" prefix so they can call the slower cards "a 60 card" for the second time in the same gen). Soon we might see Nvidia force DLSS as the default for their graphics cards, selling 50/50 Ti hardware-level cards as 60 or even 70 cards (since it will mean reviewers can no longer do apples-to-apples comparisons of Nvidia cards against AMD).

Well, AMD must be right scammers so!! They were the ones charging $549 for the HD 7970. That's almost $200 over the HD 6970 and $50 more expensive than the GTX 580, for a paltry 15% increase over the GTX 580.

So if what you say is true and the Nvidia 680 should have been a 660, that makes AMD look like money-grabbing morons. The 680 was over 20% faster than the 7970 and $50 cheaper.

AMD tried to scalp us by selling their mid range card as high end. That left the door open for Nvidia to charge $499 for their mid range card.

Way to go AMD.
 
Soldato
Joined
14 Aug 2009
Posts
2,782
AMD tried to scalp us by selling their mid range card as high end. That left the door open for Nvidia to charge $499 for their mid range card.

Way to go AMD.

Nah, that was just the situation back then: AMD's best was only able to match Nvidia's mainstream card. Being first allowed them to set a much higher price than that GPU should have carried. With that said, yeah, AMD would charge a lot more too; case in point: the new "XT" Ryzen CPUs.
 
Soldato
Joined
19 Dec 2010
Posts
12,031
So I just had a little playthrough of The Division 2. I'm currently on a 1440p screen, and with a slight sharpness filter from the driver package I thought to myself, "darn, everything is razor sharp and detailed". Then I remembered that Death Stranding Nvidia demo, and after watching it again I'm astounded that the non-DLSS capture can look that blurry and lifeless at 4k compared to my 2k result, albeit from another game. Makes me wonder about that lovely bundled black-box software included in the game.

Then you have to blame AMD as they worked with the developers of this game. Remember it's a console game ported to PC? It was designed on AMD GPUs.

And for the PC port the developers had to continue working with AMD as the game includes FidelityFX.

So if there is a little black box ruining the game it's from AMD.

One reviewer even mentioned how much better the game runs on AMD GPUs than Nvidia.
 
Soldato
Joined
19 Dec 2010
Posts
12,031
Ummm, jump to around the 3:30 mark in his video; it's very clear in the waterfall scene that 4k native has way more detail when you look at the mountain/rocks and the waterfall itself. I wouldn't call a blur filter witchcraft.

It's the dynamic weather system the game has. When he was recording, the weather stayed clear during the native video, was cloudy while he was doing the DLSS quality video, and varied during the DLSS performance video. Watch the DLSS performance video from 4:17 on and you will see it happening.
 
Soldato
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
It's the dynamic weather system the game has. When he was recording, the weather stayed clear during the native video, was cloudy while he was doing the DLSS quality video, and varied during the DLSS performance video. Watch the DLSS performance video from 4:17 on and you will see it happening.

Not buying it. Both native and DLSS are rendering what appears to be foggy weather, and although the fog in the DLSS version seems slightly thicker, that shouldn't cause the kind of loss of detail the DLSS rendering shows in his video. It's just a bad excuse to justify what is clearly an inferior result. There are other places where long-distance geometry seems worse with DLSS quality, like at 10:37.

[Attached screenshot: Fc5f0fH.png]
 
Soldato
Joined
6 Feb 2019
Posts
17,594
There is a DLSS vs FidelityFX CAS video doing the rounds on Reddit.

CAS with FidelityFX seems to lose details in the far backgrounds.

 
Soldato
Joined
10 Oct 2012
Posts
4,421
Location
Denmark
There is a DLSS vs FidelityFX CAS video doing the rounds on Reddit.

CAS with FidelityFX seems to lose details in the far backgrounds.


Trying to redirect people's focus now? We are not talking about FidelityFX vs DLSS or AMD vs Nvidia; we are talking native vs DLSS. If you feel the need to talk about the supposed compromises of FidelityFX, then start a thread with that topic in mind and we can have the discussion there. But a note on your screenshot: the compression artifacts on that image are pretty bad, just look at your red circles.
 