Really getting fed up with the posts stating RTX/DLSS does not work this gen

Hardware Unboxed has started using DLSS 2.0 in titles that support it for hardware reviews, on the basis that there is no good reason not to use DLSS 2.0 when it's available. Good on them, I think they are the first mainstream reviewers to do so.


It'd be interesting to see what the power usage of those cards is when locked at say 60fps with DLSS on - purely to see what could be achieved if this tech ended up in consoles down the road. DLSS was a tech I was very interested in when it was first announced but it looked like it was just going to be ****. Now that it works well it's firmly in the interesting category for me.

So we are no longer comparing apples to apples... facepalm.

Also.. so.. what is it.. 4 games? Only 1 of which I will play. Wow... DLSS 2 is such a killer feature... Not. If it was game agnostic... great.

I'm in two minds, if it's completely indistinguishable then I almost have no issue. The problem then becomes what if High/Ultra/GODLIKE settings for something, say lighting, show zero difference visually but show varying performance gains across different vendors, so maybe Nvidia gains 20% by going from godlike -> high, but AMD gains 40% (rough numbers sketched below). Not really fair.

Personally, I think DLSS is fine so long as the results are separated like HUB have done, but I'd say any DLSS chart like the one above should always include the non-DLSS result too.
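To put made-up numbers on that 20% vs 40% scenario above, a quick sketch of how a settings drop with no visible difference can still flip a benchmark ranking (all figures hypothetical):

```python
# Hypothetical numbers only: shows how a settings change with zero visible
# difference can still skew a cross-vendor benchmark if the uplift isn't equal.

nvidia_godlike = 100.0   # fps at the "godlike" preset (made up)
amd_godlike    = 95.0    # fps at the "godlike" preset (made up)

nvidia_high = nvidia_godlike * 1.20   # +20% from dropping to "high"
amd_high    = amd_godlike * 1.40      # +40% from dropping to "high"

print(f"Godlike: Nvidia {nvidia_godlike:.0f} fps vs AMD {amd_godlike:.0f} fps")
print(f"High:    Nvidia {nvidia_high:.0f} fps vs AMD {amd_high:.0f} fps")
# Godlike: Nvidia leads; High: AMD leads -- same image, different "winner".
```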
 
I wonder what the performance impact will be though? There is a performance hit when you use it on Turing GPUs, what's it going to be like on GPUs that it isn't optimised for?

Hopefully someone tests it. RTX GPUs take up to a 10% performance hit. Hopefully someone with a GTX card here can run a test or two.
 
What a surprise.....

Tensor and RT core load is deferred to CUDA cores when no RT/Tensor cores are detected; Nvidia implemented this functionality in its drivers some time ago.
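For what it's worth, that kind of fallback is just a capability check at init time. A rough sketch of the pattern, nothing like Nvidia's actual driver code, and all names below are made up:

```python
# Rough sketch of a capability-based fallback; NOT Nvidia's driver code.
# has_tensor_cores() and the two processing paths are made-up placeholders.

def has_tensor_cores(gpu_name: str) -> bool:
    # Placeholder check: a real driver queries the hardware, not a name string.
    return gpu_name.startswith(("RTX", "Quadro RTX", "TITAN RTX"))

def process_audio_frame(frame, gpu_name: str):
    if has_tensor_cores(gpu_name):
        return run_on_tensor_cores(frame)   # dedicated matrix hardware
    return run_on_cuda_cores(frame)         # same math on general shaders, slower

def run_on_tensor_cores(frame):
    return frame  # stand-in

def run_on_cuda_cores(frame):
    return frame  # stand-in
```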

It was clear from the get-go they were just blocking other cards from this RTX Voice feature.

The only difference is performance. RTX cards take a 10% performance hit; the hit is much worse for a 1080 Ti.
 

I certainly don't mind technology that helps the user regain some performance while still looking good visually. My problem, however, has been with the comparisons I've seen, which to me haven't been done right. The native capture has been violated by TAA, one of the worst anti-aliasing methods available if you ask me, giving what would be a crisp 4K image a blurry look. Why a developer would force this AA method is beyond me. Let me be clear again, before some random poster starts drumming on a warpath: I don't mind DLSS, I would most likely use it if I owned a DLSS-capable card now that 2.0 is out and looking pretty decent. I just don't feel like the comparisons have been 100% fair. That is it.
 
Well said.

It is puzzling to me that people are happy to compare DLSS to what they call native resolution when TAA or FXAA has been applied, which ruins the image. If you want it done right, compare it with either no AA or a form of AA that does not blur the hell out of the image.

I still remember Grim5 embarrassing him with his dodgy comparison of DLSS 2.0 and "native resolution" trying to "trick us" by applying FXAA to the native image. Cracked me up as I said the image looked like it had FXAA on, which in the end it turned out it did. Lol. Own goal! Haha.

DLSS 2.0 does seem like very nice tech and soon I will be using it myself. But one cannot just compare it like that.
 
Let me be clear again, before some random poster starts drumming on a warpath: I don't mind DLSS, I would most likely use it if I owned a DLSS-capable card now that 2.0 is out and looking pretty decent. I just don't feel like the comparisons have been 100% fair. That is it.

Use it on what, though? I've no beef with DLSS v2 from what I've seen, but no games I play support it, ergo it's a tech demo on hardware asking a big premium. Not opposed to paying for these things where there is genuine use and gains to be had.
 
Why a developer would force this AA method is beyond me.

It's because it helps a lot with alpha transparencies, and it also scales well at 4K or thereabouts. Does very well at upscaling too. There's a lot of benefits to TAA really, I mean, let's face it the industry doesn't mass adopt it for no reason. It does suffer from a softer presentation but if you look at the alternatives you are faced with even worse trade-offs. I'm hopeful ray tracing is going to help out with AA in the future, and we'll get back some of that clear & crisp image of old.
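For anyone wondering where that softness comes from: TAA is basically an exponential blend of the current jittered frame with the accumulated history. A minimal sketch of just that blend (not any engine's real implementation; real TAA also reprojects the history with motion vectors and clamps it to avoid ghosting):

```python
import numpy as np

# Minimal sketch of the temporal accumulation behind TAA. Not engine code;
# it only shows the history blend that causes the characteristic softness.

def taa_accumulate(history: np.ndarray, current: np.ndarray, alpha: float = 0.1) -> np.ndarray:
    """Blend the new (jittered) frame into the history buffer."""
    return (1.0 - alpha) * history + alpha * current

# Usage: feed each new frame through the blend.
h, w = 4, 4
history = np.zeros((h, w))
for frame_index in range(8):
    current = np.random.rand(h, w)   # stand-in for a newly rendered, jittered frame
    history = taa_accumulate(history, current)
# 'history' now averages detail across frames: edges smooth out, but so does fine texture detail.
```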
 
I wonder what the performance impact will be though? There is a performance hit when you use it on Turing GPUs, what's it going to be like on GPUs that it isn't optimised for?

The issue with Tensor cores and slowdown here is likely that the processing is hanging off the end of the chain (due to lack of concurrency), so you take a small percentage hit regardless, rather than it stealing resources from other processing, so to speak. The interesting bit will be how much of a speed-up Tensor cores are for doing the task versus general compute, where you will be directly sharing resources.
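Back-of-envelope way to see why a fixed pass tacked onto the end of the frame shows up as a small but unavoidable percentage hit (all numbers below are guesses, not measurements):

```python
# Made-up numbers: a post-process pass appended to the end of the frame (no
# concurrency) costs a roughly constant amount of time, so its percentage cost
# depends on how long the rest of the frame takes.

def fps_with_serial_pass(base_fps: float, pass_ms: float) -> float:
    frame_ms = 1000.0 / base_fps + pass_ms   # pass is simply appended to the frame
    return 1000.0 / frame_ms

for base_fps in (60.0, 120.0):
    new_fps = fps_with_serial_pass(base_fps, pass_ms=0.5)   # 0.5 ms is a guess
    print(f"{base_fps:.0f} fps -> {new_fps:.1f} fps "
          f"({100 * (1 - new_fps / base_fps):.1f}% hit)")
# The same 0.5 ms pass is ~3% at 60 fps but ~6% at 120 fps; running it on general
# compute instead of Tensor cores would also steal time from the frame itself.
```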
 
It's because it helps a lot with alpha transparencies, and it also scales well at 4K or thereabouts. Does very well at upscaling too. There's a lot of benefits to TAA really, I mean, let's face it the industry doesn't mass adopt it for no reason. It does suffer from a softer presentation but if you look at the alternatives you are faced with even worse trade-offs. I'm hopeful ray tracing is going to help out with AA in the future, and we'll get back some of that clear & crisp image of old.

The other thing is that with a deferred rendering engine you simply can't apply MSAA anyway, so you have to use one of the alternatives, of which TAA is the best.
 
The other thing is that with a deferred rendering engine you simply can't apply MSAA anyway, so you have to use one of the alternatives, of which TAA is the best.
I get away with using no anti-aliasing at all on many games at 4K. Therefore I would prefer comparisons to be made properly.
 
It's because it helps a lot with alpha transparencies, and it also scales well at 4K or thereabouts. Does very well at upscaling too. There's a lot of benefits to TAA really, I mean, let's face it the industry doesn't mass adopt it for no reason. It does suffer from a softer presentation but if you look at the alternatives you are faced with even worse trade-offs. I'm hopeful ray tracing is going to help out with AA in the future, and we'll get back some of that clear & crisp image of old.

I just know that I absolutely hate the end result of TAA unless I then sharpen it up using a proper filter, which fixes a lot of the blurry mess. I'd rather use SMAA as it takes care of the jaggies that stand out to me. It may not be as "advanced" as TAA but it doesn't smear my screen either and does the job well enough. Again, this is personal preference I suppose, but TAA just stinks to high heaven IMHO. Seems like the type of AA you would use for couch gaming, not sitting 60cm from the monitor.

I'm using a 1080p monitor atm, jaggies are real on this thing :) real real. I'd rather live with them, however painful, than use TAA. Thank <insert random deity> for the sharpening filters built into the drivers now.
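Those sharpening filters are, at heart, variations on an unsharp mask: add back the difference between a pixel and a blurred copy of its neighbourhood. A tiny sketch of that idea (not any vendor's actual driver filter):

```python
import numpy as np

# Minimal unsharp-mask sketch: not any vendor's driver filter, just the basic
# idea of adding back (image - blurred image) to counteract TAA softness.

def box_blur3(img: np.ndarray) -> np.ndarray:
    """Cheap 3x3 box blur with edge padding."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + img.shape[0], 1 + dx : 1 + dx + img.shape[1]]
    return out / 9.0

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    detail = img - box_blur3(img)            # high-frequency part lost to blurring
    return np.clip(img + amount * detail, 0.0, 1.0)

# Usage on a made-up grayscale image with values in [0, 1]:
frame = np.random.rand(8, 8)
crisper = sharpen(frame, amount=0.5)
```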
 
The issue with Tensor cores and slowdown here is likely that the processing is hanging off the end of the chain (due to lack of concurrency), so you take a small percentage hit regardless, rather than it stealing resources from other processing, so to speak. The interesting bit will be how much of a speed-up Tensor cores are for doing the task versus general compute, where you will be directly sharing resources.

Some people on different forums have done tests and there seems to be more of a performance hit when using Tensor cores than when using CUDA. (I am presuming it works on older cards because it falls back to CUDA when it doesn't detect Tensor cores.)

It was less than a 3% performance hit.
 
Since DLSS is really a form of AA, I guess that's why it's nearly always compared to a native image with TAA applied.
I still disagree with the claim that it is better than the native image, though. Not many will know DLSS is a form of AA.

I really like the tech. I just can't get behind it being compared to native with TAA/FXAA and then saying it has better image quality. The reason it is more of an issue for me, I guess, is because I play at 4K and have played many games without AA just fine.

I will have to see for myself. But it does look very good and I may very well end up using it in every game even at 4K. Certainly would love the benefit from the boost of fps. It will mean I can keep the card I get much longer and skip a gen.
 