So we are no longer comparing apples to apples... facepalm.
Also... so... what is it, 4 games? Only 1 of which I will play. Wow... DLSS 2 is such a killer feature... Not. If it were game agnostic... great.
Interesting, NVIDIA RTX Voice works without RTX GPUs with a simple modification - https://www.guru3d.com/news-story/n...hout-rtx-gpus-with-a-simple-modification.html
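For reference, the "simple modification" that was doing the rounds (paths and exact file contents from memory, so treat them as approximate): let the installer extract itself to C:\temp\NVRTXVoice, then open NvAFX\RTXVoice.nvi in a text editor and delete the constraints block that fails the install on non-RTX cards, something along the lines of:

<constraints>
    <property name="Feature.RTXVoice" level="silent" text="${{InstallBlockedMessage}}"/>
</constraints>

Then run setup.exe from that folder again and it should install on older GeForce cards.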
I wonder what the performance impact will be though? There is a performance hit even when you use it on Turing GPUs, so what's it going to be like on GPUs it isn't optimised for?
It was a gimmick: people went from paying 700 quid for the latest graphics card to over a thousand pounds for something that wasn't even ready. A whole load of marketing ****** imho.
What a surprise.....
It'd be interesting to see what the power usage of those cards is when locked at, say, 60fps with DLSS on - purely to see what could be achieved if this tech ended up in consoles down the road. DLSS was a tech I was very interested in when it was first announced, but it looked like it was just going to be ****. Now that it works well it's firmly in the interesting category for me.
I'm in two minds: if it's completely indistinguishable then I almost have no issue. The problem then becomes what happens if High/Ultra/GODLIKE settings for something, say lighting, show zero difference visually but show varying performance gains across different vendors - maybe Nvidia gains 20% by going from GODLIKE -> High, but AMD gains 40%. Not really fair.
Personally, I think DLSS is fine so long as the results are separated like HUB have done, but I'd say any DLSS chart like the one above should always include the non-DLSS result too.
Well said. I certainly don't mind technology that helps the user regain some performance while still looking good visually. My problem, however, has been with the comparisons I've seen, which to me haven't been done right. The native capture has been violated by TAA, one of the worst anti-aliasing methods available if you ask me, giving what would be a crisp 4K image a blurry look. Why a developer would force this AA method is beyond me. Let me be clear again, before some random poster starts off on a warpath: I don't mind DLSS, and I would most likely use it if I owned a DLSS-capable card now that 2.0 is out and looking pretty decent. I just don't feel the comparisons have been 100% fair. That is it.
Why a developer would force this AA method is beyond me.
It's because it helps a lot with alpha transparencies, and it also scales well at 4K or thereabouts. It does very well at upscaling too. There are a lot of benefits to TAA really; let's face it, the industry doesn't mass-adopt it for no reason. It does suffer from a softer presentation, but if you look at the alternatives you are faced with even worse trade-offs. I'm hopeful ray tracing is going to help out with AA in the future, and we'll get back some of that clear & crisp image of old.
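For anyone curious where that softness comes from, here's a minimal sketch of the usual temporal resolve step (simplified and hypothetical: single channel, no motion-vector reprojection): each frame gets blended into an accumulated history, with the history clamped to the current frame's local neighbourhood to limit ghosting.

import numpy as np

def taa_resolve(current, history, alpha=0.1):
    # Simplified per-pixel TAA resolve: clamp the accumulated history to each
    # pixel's 3x3 neighbourhood in the current frame, then blend.
    h, w = current.shape
    padded = np.pad(current, 1, mode="edge")
    shifts = [padded[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    clamped = np.clip(history, np.minimum.reduce(shifts), np.maximum.reduce(shifts))
    # Mostly history, a little of the new (jittered) frame - this accumulation
    # is what smooths edges and also what softens the image.
    return alpha * current + (1.0 - alpha) * clamped

# usage: history = taa_resolve(new_frame, history), called once per frame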
I get away with using no anti-aliasing at all on many games at 4K. Therefore I would prefer comparisons to be made properly. The other thing is that with a deferred rendering engine you simply can't apply MSAA anyway (lighting is computed from the G-buffer, so the per-sample edge coverage MSAA relies on is effectively lost), so you have to use one of the alternatives, of which TAA is the best.
The issue with Tensor cores and slowdown here is likely that the processing hangs off the end of the chain (due to the lack of concurrency), so you take a small percentage hit regardless, rather than it stealing resources from other processing so to speak. The interesting bit will be how much of a speed-up Tensor cores give for doing the task versus general compute, where you would be directly sharing resources.
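To put some purely made-up numbers on that (all values hypothetical, just to show the shape of the trade-off): the upscaling pass is a fixed cost appended to the end of every frame, so the relative hit depends on how cheap the rest of the frame is.

native_ms    = 16.0  # hypothetical full-resolution frame time
lower_res_ms = 10.0  # hypothetical frame time at the DLSS input resolution
dlss_pass_ms = 1.5   # hypothetical upscale pass bolted onto the end (no concurrency)

dlss_ms = lower_res_ms + dlss_pass_ms
print(f"native: {1000 / native_ms:.0f} fps, DLSS: {1000 / dlss_ms:.0f} fps")
# The pass is paid every frame regardless; on general compute it would take
# longer and also compete with the rest of the frame for shader resources,
# which is where Tensor cores should show their advantage.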
I still disagree with the claim that it is better than the native image though. Not many will know DLSS is a form of AA. Since it really is a form of AA, I guess that's why it's nearly always compared to the native image with TAA applied.