Discussion in 'Graphics Cards' started by sunshinewelly, Aug 16, 2019.
It looks like DLSS is improving image quality there.
(Wonder if it's DLSS + sharpening)
Well, the sharpening feature is in the driver and only drops performance by about 1% when you use it, so it would make total sense for them to auto-enable sharpening if the user enables DLSS.
These images are all taken at 1080p. Let's see the 1440p and 4K screenshots. There are also three DLSS modes in use: quality, balanced and performance. At 4K they chose to show the performance mode for the fps numbers, which I am guessing is the blurry version. Quality mode most likely incorporates other settings to help it achieve a better image. For those on 1080p/1440p it could be a nice toggle feature. I think an unbiased source needs to do some testing to see what's really going on here.
"On a 2060, you can indeed run raytracing at 60fps with minimal fuss". I see averages but what are the minimums?
1080p w/ integer scaling - RT just got a lot more viable on big 4K screens compared to at the start of the year. Honestly the main problem for them is the same as it ever was: too few games with RTX, and the implementations come months after, and the games are ultimately meh. That's why getting it in Cyberpunk is such a big deal, that's the perfect type of game for RT & choosing to do GI is also the best choice possible, but I just doubt it will be there at release. In the end, it will slowly take over with next-gen consoles having the capability as well, but it didn't exactly make upgrading to Turing an urgency. In fact, only with Amid Evil getting it within the next month or so will I start itching to try it out. I guess that says everything about the current state of RTX.
AMD would get panned by both camps, especially the Nvidia boys, if any feature came out in such fashion. Same applied to Mantle: because it only featured in a few games it was slated. Yet a couple of years on it has pushed out Vulkan and forced M$ to improve with DX12, so it benefited everyone in the end.
Problem is, games that Nvidia advertised with ray tracing 14 months ago still do not have it. Wolfenstein: Youngblood? Assetto Corsa?
Yep. Very poor showing. But they don’t care, they have people’s money now. They knew RTX was not ready. I guess people had a little demo and did a bit of beta testing. Hopefully 2020 titles coming out will get it right on release and the 3000 series actually dedicates more than 5% of the transistors towards RT, not to mention improvements from architectural changes.
Problem is, Youngblood's RT was going to be implemented by Nvidia directly, not the developers, because Vulkan did not support ray tracing at the time...
Wonder what the issue is? Maybe they'll release it alongside the new cards as an extra title to market.
But it was advertised with the RTX 2080 Ti.
Don’t worry about that, 2080Ti owners won’t care, they will just buy a 3080Ti!
actually shouldn't be shocked
Sponsored by Nvidia was as far as I got. Even the fanboys have given up trying to defend this mess.
If Nvidia got rid of the Tensor cores completely and had, say, 4x the RT cores, I'd be very up for buying a 3-series. It's mental, but I really want to play Minecraft with it.
I'm not sure if they are actually used in current implementations, but the Tensor cores can in theory be used to accelerate denoising of the ray-traced results, allowing for the appearance of higher-quality ray tracing with fewer rays cast. Supposedly they can also be used in some capacity to optimise the BVH structure to reduce the work the RT cores have to do, but I'm not 100% on that.
How do you enable integer scaling? Does it make a big difference to the PQ?
I need to play RDR2 at 1440p on a 4K screen, so I'm interested in whether integer scaling would improve the PQ or not.
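For what it's worth, the quick sanity check below (a rough Python sketch, assuming standard 1080p/1440p/4K resolutions) shows why integer scaling works for 1080p on a 4K panel but can't help at 1440p: the panel resolution has to be an exact whole-number multiple of the render resolution, and 2160/1440 is 1.5.

```python
def integer_scale_factor(render, panel):
    """Return the whole-number scale factor if the panel resolution is an
    exact multiple of the render resolution on both axes, else None."""
    w, h = render
    pw, ph = panel
    if pw % w == 0 and ph % h == 0 and pw // w == ph // h:
        return pw // w
    return None  # non-integer ratio: the GPU has to interpolate instead

panel_4k = (3840, 2160)
print(integer_scale_factor((1920, 1080), panel_4k))  # 2 -> clean 2x pixel doubling
print(integer_scale_factor((2560, 1440), panel_4k))  # None -> 1.5x, blurry upscale
```

So for RDR2 at 1440p on a 4K screen, integer scaling won't kick in; it only gives the sharp "pixel-doubled" result at 1080p (or 720p, at 3x).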