RTX 4070 12GB, is it Worth it?

Nobody said textures don't play a big part in the quality of the presentation. What I'm saying is that DLSS Q + High textures looks better than native + Ultra textures, and much better than FSR, in Hogwarts.
*Cough*, *cough*, is it really native, or is it using another, inferior, anti-aliasing method like TAA or some other post-processing filter?
The goal of DLSS isn't to improve image quality; it's to improve your framerate.
Erm, no. If you just wanted better framerates you'd simply lower the resolution. DLSS does what all upsampling techniques do: the image is rendered at a lower resolution, then upscaled, and then enhanced by DLSS so it doesn't look as bad as if you had just upscaled it.

Granted, it does a better job than other image-enhancement techniques, but it's still doing essentially the same thing.
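(To make the disagreement concrete, here's a minimal Python/NumPy sketch of the shared idea behind all upscalers: render fewer pixels, then fill the output in. The function names are hypothetical placeholders and the last comment only gestures at the reconstruction step; this is not any vendor's actual pipeline.)

```python
import numpy as np

def render(width, height):
    # Hypothetical stand-in for the game's renderer: cost scales with
    # pixel count, which is where the framerate win comes from.
    return np.random.rand(height, width, 3)

def upscale_nearest(img, out_h, out_w):
    # The "naive" upscale being argued about: each output pixel simply
    # copies the nearest input pixel, so the result looks blocky.
    h, w, _ = img.shape
    ys = np.arange(out_h) * h // out_h
    xs = np.arange(out_w) * w // out_w
    return img[ys[:, None], xs[None, :]]

low = render(2560, 1440)                   # internal render, ~44% of the 4K pixels
blocky = upscale_nearest(low, 2160, 3840)  # plain upscale to 4K output
# DLSS/FSR2/XeSS replace that last step with a reconstruction pass fed by
# motion vectors and previous frames, hiding the usual upscaling artifacts.
```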
 
..people like Tim from HUB declaring "this is better than native" is ######. He says that because he thinks science fiction is now fact, and because he's on screen saying it with such confidence and authority, people believe it; but those people are as clueless as he is. I'm not casting any aspersions here, aside from my frustration with the quality of this journalism by clueless people who think they know everything or implicitly trust anything Nvidia tells them.
I wouldn't go that far, but I do think that HUB video is highly misleading. Tim starts out by saying that 'native' is a 4K render on a 4K display, but then goes on to say that they're using the game's built-in anti-aliasing technology, which is typically TAA.

While that's native resolution, it's not what I'd consider natively displayed, because you're applying a technology that's used to make upscaled images look better when there's no need to do that at 4K and you're not even upscaling the image. Native, IMO, would be rendering at 4K, displaying at 4K, and not using any anti-aliasing techniques or other post-processing algorithms.
 
*Cough*, *cough*, is it really native, or is it using another, inferior, anti-aliasing method like TAA or some other post-processing filter?

Erm, no. If you just wanted better framerates you'd simply lower the resolution. DLSS does what all upsampling techniques do: the image is rendered at a lower resolution, then upscaled, and then enhanced by DLSS so it doesn't look as bad as if you had just upscaled it.

Granted, it does a better job than other image-enhancement techniques, but it's still doing essentially the same thing.
What? For what reason would you use upscaling if not to increase your framerate?
 
TAA has nothing to do with scaling; it just uses frame data over time and motion vectors to smooth out jaggies. Some engines don't have particularly good implementations of this, as it's complex to get right. DLSS2, XeSS and FSR2 all use this same information for upscaling, which in some cases can produce results that look better than a more naive engine-level implementation.
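(For anyone unclear on the mechanism being described, below is a bare-bones sketch of the temporal accumulation idea, assuming a per-pixel motion-vector array; real engine TAA adds history clamping, sub-pixel jitter and much more.)

```python
import numpy as np

def taa_resolve(history, current, motion, alpha=0.1):
    # Reproject last frame's resolved image along per-pixel motion
    # vectors, then blend in a small amount of the new frame.
    # Averaging samples over time is what smooths out the jaggies.
    h, w, _ = current.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Assumed layout: motion[..., 0] is the x offset, motion[..., 1] the y.
    src_x = np.clip(xs - motion[..., 0], 0, w - 1).astype(int)
    src_y = np.clip(ys - motion[..., 1], 0, h - 1).astype(int)
    reprojected = history[src_y, src_x]
    return (1 - alpha) * reprojected + alpha * current
```

DLSS2, FSR2 and XeSS consume these same inputs (colour, depth, motion vectors) but produce a higher-resolution output, which is why they can double as anti-aliasing.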
 
I wouldn't go that far, but I do think that HUB video is highly misleading. Tim starts out by saying that 'native' is a 4K render on a 4K display, but then goes on to say that they're using the game's built-in anti-aliasing technology, which is typically TAA.

While that's native resolution, it's not what I'd consider natively displayed, because you're applying a technology that's used to make upscaled images look better when there's no need to do that at 4K and you're not even upscaling the image. Native, IMO, would be rendering at 4K, displaying at 4K, and not using any anti-aliasing techniques or other post-processing algorithms.

People need to be asking if native resolution plus modern AA methods (such as DLAA, etc.) would be better than DLSS/FSR2/XeSS upscaling.
 
What? For what reason would you use upscaling if not to increase your framerate?
To not make a 720p or 1080p image that's been upscaled look like crap.
TAA has nothing to do with scaling; it just uses frame data over time and motion vectors to smooth out jaggies. Some engines don't have particularly good implementations of this, as it's complex to get right. DLSS2, XeSS and FSR2 all use this same information for upscaling, which in some cases can produce results that look better than a more naive engine-level implementation.
Yes, jaggies that are only there because you upscaled an image.

e: Is it that some people just don't understand what happens when you upscale an image or something?
 
To not make a 720p or 1080p image that's been upscaled look like crap.

Also, it's to upsell: like with consoles, Nvidia/AMD/Intel are trying to upsell the RT capability of dGPUs which are too weak to do a proper job. This way they can sell weaker and weaker dGPUs for more money, then eventually lock "new versions" to the latest hardware and force people to upgrade more often. This is also partially why VRAM increases have slowed down hugely... even on the AMD side.
 
I wouldn't go that far, but I do think that HUB video is highly misleading. Tim starts out by saying that 'native' is a 4K render on a 4K display, but then goes on to say that they're using the game's built-in anti-aliasing technology, which is typically TAA.

Yeah, now you mention it, I remember Tim blathering on and thinking, as the video went on, that he was deluding himself enough that alarm bells went off about my confidence in him.
 
What? How can that be the answer to the question "why would you use upscaling in the first place?" LOL
Do you even know what upscaling is? Like I said, if you just wanted higher frame rates then render and display the image at 320x200.

e: How about this: go and buy a 32-inch 320x200 display and tell me how good the images look on it.

2nd e: If my maths is right, that means each pixel would be over 2mm in size. :cry:
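(For what it's worth, a quick sanity check of that pixel-size estimate, assuming a 16:10 panel since 320x200 is a 16:10 resolution:)

```python
import math

diag_mm = 32 * 25.4     # 32-inch diagonal in millimetres
w_px, h_px = 320, 200
aspect = w_px / h_px    # 1.6, i.e. 16:10
width_mm = diag_mm * aspect / math.sqrt(aspect ** 2 + 1)
print(width_mm / w_px)  # ~2.15 mm per pixel
```

So roughly 2.15mm per pixel: comfortably over 2mm.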
 
Do you even know what upscaling is? Like I said, if you just wanted higher frame rates then render and display the image at 320x200.

e: How about this: go and buy a 32-inch 320x200 display and tell me how good the images look on it.

2nd e: If my maths is right, that means each pixel would be over 2mm in size. :cry:
But WHY would you upscale in the first place? That's the question you're not answering, because it will prove your whole point wrong. LOL, this is nuts..
 
Dude, a picture is worth a thousand words, I guess. You can't seriously tell me it's just Nvidia marketing that makes me think the right AI image is higher quality than the left native image.

[Image: DLSS-power.png, a side-by-side comparison with the native image on the left and the DLSS/AI image on the right]

This encapsulates the reason behind DLSS very well. On a 4K display, when you don't have the raw power to drive that screen, DLSS would be better than lowering the resolution to 1080p in order to get about the same frame rate.

You can throw as much power as you want at a GPU; eventually it will run out. The only thing that matters (and of which I've seen nothing so far) is whether the area used for DLSS, if it were used to increase the performance of the card "naturally", would offer at least the same performance. In other words, if 10-20% of the chip is used for DLSS and that provides a 50-100%+ performance increase, then it's well worth it.
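(Rough pixel-count arithmetic behind that trade-off; the ~0.667-per-axis render scale for DLSS Quality mode is the commonly cited figure and is assumed here:)

```python
native_4k   = 3840 * 2160   # 8,294,400 pixels shaded per frame
dlss_q      = 2560 * 1440   # ~0.667 x 3840 by ~0.667 x 2160
plain_1080p = 1920 * 1080   # 2,073,600 pixels

print(native_4k / dlss_q)        # ~2.25x fewer pixels shaded than native 4K
print(native_4k / plain_1080p)   # 4x fewer, and cheaper still, but displayed
                                 # at 1080p it looks far worse on a 4K panel
```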
 
This encapsulates the reason behind DLSS very well. On a 4K display, when you don't have the raw power to drive that screen, DLSS would be better than lowering the resolution to 1080p in order to get about the same frame rate.

You can throw as much power as you want at a GPU; eventually it will run out. The only thing that matters (and of which I've seen nothing so far) is whether the area used for DLSS, if it were used to increase the performance of the card "naturally", would offer at least the same performance. In other words, if 10-20% of the chip is used for DLSS and that provides a 50-100%+ performance increase, then it's well worth it.
For what games? And try not to say Cyberpunk.

Arguably the game with the highest and most consistent box sales on PC, World of Warcraft, doesn't use DLSS.

Using your argument, the DLSS portion of the chip does nothing here, and where it does it will have image degradation and inconsistency in various games, which has been evident.

Every time I've used DLSS, it has been very easy to spot the degradation in image quality. I don't use TAA, but where MSAA or in-game SSAA isn't available I will supersample via a custom resolution to get sharper quality 100% of the time with no inconsistency.

Every game needs a render-scale option and a comprehensive AA option like games used to have.
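(The custom-resolution supersampling described above boils down to something like this minimal sketch; driver features such as DSR/VSR do the same job with better downsampling filters:)

```python
import numpy as np

def downsample_2x(img):
    # Average each 2x2 block of the oversized render down to one output
    # pixel: ordered-grid supersampling (SSAA) in its simplest form.
    h, w, c = img.shape
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

hires = np.random.rand(2 * 1440, 2 * 2560, 3)  # render at 2x the target per axis
clean = downsample_2x(hires)                   # sharper, well-antialiased 1440p
```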
 
For what games? And try not to say Cyberpunk.

Arguably the game with the highest and most consistent box sales on PC, World of Warcraft, doesn't use DLSS.

Using your argument, the DLSS portion of the chip does nothing here, and where it does it will have image degradation and inconsistency in various games, which has been evident.

Every time I've used DLSS, it has been very easy to spot the degradation in image quality. I don't use TAA, but where MSAA or in-game SSAA isn't available I will supersample via a custom resolution to get sharper quality 100% of the time with no inconsistency.

Every game needs a render-scale option and a comprehensive AA option like games used to have.
You're talking about a different situation. What good is downsampling from a resolution even higher than native, or using MSAA, when the graphics card already doesn't have enough power for 60fps at native?
 