
Really getting fed up with the posts stating RTX/DLSS doesn't work this gen


deliver-us-the-moon-fortuna-nvidia-geforce-rtx-ray-tracing-dlss-3840x2160-performance.png


https://www.nvidia.com/en-us/geforce/news/deliver-us-the-moon-fortuna-ray-tracing-dlss/
https://www.nvidia.com/en-us/geforce/news/deliver-us-the-moon-nvidia-dlss/

deliver-us-the-moon-fortuna-nvidia-dlss-comparison-002-740x261.png


deliver-us-the-moon-fortuna-nvidia-dlss-comparison-003-v2-740x458.png


deliver-us-the-moon-fortuna-nvidia-dlss-comparison-005-740x529.png


It looks like DLSS is improving Image Quality there. :p

(Wonder if it's DLSS + Sharpening)
 
Well, the sharpening feature is in the driver and only costs about 1% performance if you use it, so it would make total sense for them to auto-enable sharpening when the user enables DLSS.
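To illustrate what a post-process sharpening filter of that sort actually does, here's a minimal unsharp-mask sketch in Python. This is just the common "add back the high frequencies" idea, not a claim about what NVIDIA's driver filter really runs, and all the function names are made up for illustration:

```python
import numpy as np

def box_blur(img: np.ndarray, radius: int = 1) -> np.ndarray:
    """Average each pixel with its neighbours (edges wrap; fine for a demo)."""
    acc = np.zeros_like(img, dtype=np.float64)
    taps = 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            taps += 1
    return acc / taps

def unsharp_mask(img: np.ndarray, amount: float = 0.5, radius: int = 1) -> np.ndarray:
    """Sharpen by adding back the detail the blur removed, then clamp to [0, 1]."""
    blurred = box_blur(img, radius)
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# Quick demo on a random greyscale "frame" with values in [0, 1].
frame = np.random.default_rng(0).random((8, 8))
sharpened = unsharp_mask(frame)
print(sharpened.shape)  # (8, 8)
```

The whole thing is a handful of adds and multiplies per pixel, which is why enabling it barely dents the frame rate.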
 
It looks like DLSS is improving Image Quality there. :p

(Wonder if it's DLSS + Sharpening)

These images are all taken at 1080p. Let's see the 1440p and 4K screenshots. There are also three DLSS modes in use: Quality, Balanced and Performance. At 4K they chose to show the Performance mode for the fps numbers, which I'm guessing is the blurry version. Quality mode most likely incorporates other settings to help it achieve a better image. For those on 1080p/1440p it could be a nice toggle. I think an unbiased source needs to do some testing to see what's really going on here.
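For a rough idea of what those modes mean in practice, here's a sketch using the per-axis render-scale factors commonly quoted for the DLSS Quality/Balanced/Performance presets; the exact ratios this game uses may well differ, so treat the numbers as illustrative:

```python
# Commonly quoted per-axis render-scale factors for the DLSS presets
# (assumption: this particular game may use different ratios).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the game actually renders at before DLSS upscales it."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K output, {mode}: renders at ~{w}x{h}")
```

Performance mode at 4K renders at roughly 1920x1080 internally, which is exactly why it's the blurriest of the three and why quoting its fps numbers flatters the result.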
 
1080p w/ integer scaling - RT just got a lot more viable on big 4K screens compared to the start of the year. Honestly, the main problem for them is the same as it ever was: too few games with RTX, the implementations come months later, and the games are ultimately meh. That's why getting it in Cyberpunk is such a big deal: it's the perfect type of game for RT, and going for GI is the best choice possible, but I just doubt it will be there at release. In the end, it will slowly take over with next-gen consoles having the capability as well, but it didn't exactly make upgrading to Turing urgent. In fact, only with Amid Evil getting it within the next month or so will I start itching to try it out. I guess that says everything about the current state of RTX.
 
Honestly, the main problem for them is the same as it ever was: too few games with RTX, the implementations come months later, and the games are ultimately meh... I guess that says everything about the current state of RTX.

AMD would get panned by both camps, especially the Nvidia boys, if any feature came out in such fashion. The same applied to Mantle: because it only featured in a few games, it was slated. Yet a couple of years on it spawned Vulkan and forced M$ to improve with DX12, so it benefited everyone, all things considered.
 
Honestly, the main problem for them is the same as it ever was: too few games with RTX, the implementations come months later, and the games are ultimately meh...

Problem is, games that Nvidia advertised with ray tracing 14 months ago still don't have it. Wolfenstein: Youngblood? Assetto Corsa?
 
Problem is, games that Nvidia advertised with ray tracing 14 months ago still don't have it. Wolfenstein: Youngblood? Assetto Corsa?
Yep. Very poor showing. But they don't care, they have people's money now. They knew RTX was not ready; I guess people got a little demo and did a bit of beta testing. Hopefully the 2020 titles will get it right at release, and the 3000 series will actually dedicate more than 5% of its transistors to RT, not to mention bring improvements from architectural changes.
 
Yep. Very poor showing. But they don't care, they have people's money now. ...

Problem is, Youngblood's RT was going to be implemented by Nvidia directly, not the developers, because Vulkan didn't support ray tracing at the time...
 
Problem is, Youngblood's RT was going to be implemented by Nvidia directly, not the developers, because Vulkan didn't support ray tracing at the time...
Wonder what the issue is? Maybe they'll release it with the new cards as an extra title to help market them :D
 
If Nvidia got rid of the tensor cores completely and had, say, 4x the RT cores, I'd be very up for buying a 3 series. It's mental, but I really want to play Minecraft with it.
 
If Nvidia got rid of the tensor cores completely and had, say, 4x the RT cores, I'd be very up for buying a 3 series. It's mental, but I really want to play Minecraft with it.

I'm not sure if they're actually used in current implementations, but in theory the Tensor cores can accelerate denoising of the ray-traced results, allowing for the appearance of higher-quality ray tracing with fewer rays. Supposedly they can also be used in some capacity to optimise the BVH structure to reduce the work the RT cores have to do, but I'm not 100% on that.
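As a toy illustration of why denoising lets you get away with fewer rays, here's how even a dumb box filter pulls a noisy one-sample-per-pixel estimate back towards the true image. The real RTX denoisers are far smarter spatiotemporal (and potentially tensor-core-accelerated) filters; this is just the principle:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Ground truth" lighting: a smooth horizontal gradient.
truth = np.linspace(0.0, 1.0, 64)[None, :] * np.ones((64, 1))

# One noisy Monte Carlo sample per pixel, i.e. very few rays.
noisy = truth + rng.normal(0.0, 0.3, truth.shape)

def box_filter(img: np.ndarray, radius: int = 2) -> np.ndarray:
    """Naive spatial denoiser: average a (2r+1)^2 neighbourhood (edges wrap)."""
    acc = np.zeros_like(img)
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            acc += np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return acc / (2 * radius + 1) ** 2

denoised = box_filter(noisy)

rms = lambda a: float(np.sqrt(np.mean((a - truth) ** 2)))
print(f"RMS error, raw 1 sample/pixel: {rms(noisy):.3f}")    # ~0.30
print(f"RMS error, after denoising:   {rms(denoised):.3f}")  # much lower
```

Same ray budget, much cleaner result; that's the trade the Tensor-core denoising idea is chasing.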
 
1080p w/ integer scaling - RT just got a lot more viable on big 4K screens compared to at the start of the year. Honestly the main problem for them is the same as it ever was: too few games with RTX, and the implementations come months after, and the games are ultimately meh. That's why getting it in Cyberpunk is such a big deal, that's the perfect type of game for RT & choosing to do GI is also the best choice possible, but I just doubt it will be there at release. In the end, it will slowly take over with next-gen consoles having the capability as well, but it didn't exactly make upgrading to Turing an urgency. In fact, only with Amid Evil getting it within the next month or so will I start itching to try it out. I guess that says everything about the current state of RTX.
How do you enable integer scaling? Does it make a big difference to the PQ?

I need to play RDR2 at 1440p on a 4K screen, so I'm interested in whether integer scaling would help improve the PQ or not.
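For what it's worth, integer scaling lives under the display scaling options in the NVIDIA Control Panel on Turing cards, as far as I know. The catch is that it only gives a clean result when the output is an exact whole-number multiple of the source: each source pixel becomes an N x N block, so edges stay crisp instead of being bilinearly smeared. 1080p to 4K is exactly 2x, so it qualifies; 1440p to 4K is 1.5x, so there's no clean pixel mapping and it won't help there. A quick sketch of the idea in Python (function names are mine, purely for illustration):

```python
import numpy as np

def integer_scale(img: np.ndarray, factor: int) -> np.ndarray:
    """Duplicate every pixel into a factor x factor block (nearest-neighbour)."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def integer_factor(src: tuple[int, int], dst: tuple[int, int]):
    """Return the whole-number scale factor if one exists, else None."""
    (sw, sh), (dw, dh) = src, dst
    if dw % sw == 0 and dh % sh == 0 and dw // sw == dh // sh:
        return dw // sw
    return None

print(integer_factor((1920, 1080), (3840, 2160)))  # 2    -> integer scaling works
print(integer_factor((2560, 1440), (3840, 2160)))  # None -> 1.5x, no clean mapping
```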
 