What?

The only thing I'm surprised is missing from your Nvidia slide is the 4X higher frame rate on the right image....
Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
"Beyond Native" I can see it now... Reminds me of the "Better than Life" episode
*Cough*, *cough*... is it really native, or is it using another, inferior anti-aliasing method like TAA or some other post-processing filter?

Nobody said textures don't play a big part in the quality of the presentation. What I'm saying is that DLSS Quality + High textures looks better than native + Ultra textures, and much better than FSR, in Hogwarts.
Erm, no. If you wanted better frame rates you'd just lower the resolution. DLSS does what all upsampling techniques do: the image is processed at a lower resolution, then upscaled, and then gets enhanced by DLSS so it doesn't look as bad as if you had just upscaled it.

The goal of DLSS isn't to improve image quality, it's to improve your frame rate.
I wouldn't go that far, but I do think that HUB video is highly misleading. Tim starts out by saying that 'native' is 4K render on a 4K display, but then goes on to say that they're using the game's built-in anti-aliasing technology, which is typically TAA.

People like Tim from HUB declaring "this is better than native" is ######. He says that because he thinks science fiction is now fact, and because he's on screen saying it with such confidence and authority, people believe it; but those people are as clueless as he is. I'm not casting any aspersions here, aside from my frustration with the quality of journalism by clueless people who think they know everything, or who implicitly trust anything Nvidia tells them.
What? For what reason would you use upscaling if not to increase your frame rate?

*Cough*, *cough*... is it really native, or is it using another, inferior anti-aliasing method like TAA or some other post-processing filter?
Erm, no. If you wanted better frame rates you'd just lower the resolution. DLSS does what all upsampling techniques do: the image is processed at a lower resolution, then upscaled, and then gets enhanced by DLSS so it doesn't look as bad as if you had just upscaled it.

Granted, it does a better job than other image-enhancement techniques, but it's still doing essentially the same thing.
Image scaling - Wikipedia (en.wikipedia.org)
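The speedup from rendering at a lower resolution and upscaling comes down to pixel counts. As a rough sketch (assuming DLSS Quality's published internal resolution of 2560x1440 for a 4K output; real frame cost is not purely proportional to pixel count):

```python
# Rough pixel-count arithmetic behind upscaling's speedup (illustrative only).

def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)     # render and display at 4K
dlss_quality = pixels(2560, 1440)  # DLSS "Quality" internal resolution for 4K output

# Fraction of the native shading work done before the upscale step:
ratio = dlss_quality / native_4k
print(f"Internal render is {ratio:.0%} of native 4K pixels")  # ~44%
```

So the shader workload is cut by more than half before the upscaler and enhancement pass run, which is where the frame-rate gain comes from.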
I wouldn't go that far, but I do think that HUB video is highly misleading. Tim starts out by saying that 'native' is 4K render on a 4K display, but then goes on to say that they're using the game's built-in anti-aliasing technology, which is typically TAA.

While that's native resolution, it's not what I'd consider natively displayed, because you're applying a technology that's used to make upscaled images look better when there's no need to do that at 4K and you're not even upscaling the image. Native, IMO, would be rendering at 4K, displaying at 4K, and not using any anti-aliasing or other post-processing algorithms.
To not make a 720p or 1080p image that's been upscaled look like crap.

What? For what reason would you use upscaling if not to increase your frame rate?
Yes, jaggies that are only there because you upscaled an image.

TAA has got nothing to do with scaling; it just uses frame data over time and motion vectors to smooth out jaggies. Some engines don't have particularly good implementations of this, as it's complex to get right. DLSS 2, XeSS and FSR 2 all use this same information for upscaling, which in some cases can produce results that look better than a more naive engine-level implementation.
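The "frame data over time" idea can be sketched for a single pixel. This is a hypothetical simplification: real TAA reprojects the history buffer with per-pixel motion vectors and clamps it against neighbouring colours, none of which is modelled here.

```python
# Minimal sketch of TAA-style temporal accumulation for one pixel.
# (Hypothetical simplification: reprojection and neighbourhood clamping omitted.)

def taa_accumulate(history, current, alpha=0.1):
    """Blend a small fraction of the current sample into the history.

    Over many frames the jittered samples average out, which is what
    smooths jagged edges without extra shading work per frame.
    """
    return (1.0 - alpha) * history + alpha * current

# Jittered samples of an edge pixel that is half covered (true coverage 0.5):
samples = [1.0, 0.0] * 32
history = samples[0]
for s in samples[1:]:
    history = taa_accumulate(history, s)
print(history)  # settles near 0.5, the true edge coverage
```

The upscalers named above feed this same history-plus-motion-vector data into their reconstruction, which is why a good TAA pipeline and a good temporal upscaler share so much plumbing.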
What? How can that be the answer to the question "why would you use upscaling in the first place?" LOL

To not make a 720p or 1080p image that's been upscaled look like crap.
Do you even know what upscaling is? Like I said, if you just wanted higher frame rates, then render and display the image at 320x200.

What? How can that be the answer to the question "why would you use upscaling in the first place?" LOL
But WHY would you upscale in the first place? That is the question you are not answering, because answering it will prove your whole point wrong. LOL, this is nuts.

Do you even know what upscaling is? Like I said, if you just wanted higher frame rates, then render and display the image at 320x200.
Edit: How about this: go and buy a 32-inch 320x200 display and tell me how good the images look on it.
Second edit: if my maths is right, that means each pixel would be 2.5mm in size.
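That estimate can be checked with basic geometry (assuming the 32 inches is the diagonal of a panel with the same 16:10 aspect ratio as 320x200):

```python
# Sanity check of the pixel-size estimate for a 32-inch 320x200 display,
# assuming "32 inch" means the diagonal of a 16:10 screen.
import math

diag_in = 32.0
res_w, res_h = 320, 200

aspect = res_w / res_h                       # 1.6 (16:10)
width_in = diag_in / math.sqrt(1 + 1 / aspect**2)
pitch_mm = width_in * 25.4 / res_w           # one pixel's width in millimetres

print(f"{pitch_mm:.2f} mm per pixel")        # about 2.2 mm, close to the 2.5 mm guess
```

So the back-of-envelope figure is slightly high but in the right ballpark: each pixel would be a visible square over 2mm across.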
Dude, one picture is worth a thousand words, I guess. You can't seriously tell me it's just Nvidia marketing that makes me think the right AI image is higher quality than the left native image.
For what games? And try not to say Cyberpunk.

This encapsulates the reason behind DLSS very well. On a 4K display, when you don't have the raw power to drive that screen, DLSS would be better than lowering the resolution to 1080p in order to get about the same frame rate.
You can throw as much power as you want at a GPU; eventually it will run out. The only thing that matters (and of which I've seen nothing so far) is whether the area used for DLSS, had it been used to increase the performance of the card "naturally", would offer at least the same performance. In other words, if 10-20% of the chip is used for DLSS and that provides a 50-100%+ performance increase, then it's well worth it.
You're talking about a different situation. What good is downsampling from an even higher resolution than native, or using MSAA, when the graphics card already doesn't have enough power for 60fps at native?

For what games? And try not to say Cyberpunk.
Arguably the game with the highest and most consistent box sales on PC, World of Warcraft, doesn't use DLSS.
Using your argument, the DLSS portion of the chip does nothing there; and where it does work, there is image degradation and inconsistency in various games, which has been evident.
Every time I've used DLSS it has been very easy to spot the degradation in image quality. I don't use TAA, but where MSAA or in-game SSAA isn't available I will supersample via a custom resolution, to get sharper quality 100% of the time with no inconsistency.
Every game needs a render-scale option and a comprehensive AA option, like games used to have.