I can sense despair & denial...A real gameplay video is not enough for a shill I guess.
Despair and denial. Sounds like something ECH would say lol

Please remember that any mention of competitors, hinting at competitors or offering to provide details of competitors will result in an account suspension. The full rules can be found under the 'Terms and Rules' link in the bottom right corner of your screen. Just don't mention competitors in any way, shape or form and you'll be OK.
People are arguing that RT is better on the 6800 XT, or rather that it will be, it seems.
The way I understand it, Nvidia has the RT hardware separate from the main CUDA cores.
Also, with DLSS, Nvidia again has dedicated hardware for it, but AMD doesn't have dedicated hardware for the equivalent function. So for the comparison to be equal, it would be almost like Nvidia running DLSS on its CUDA cores, because AMD will be using the existing cores for it, adding to their workload.
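To make that "adding to the workload" point concrete, here's a rough frame-time sketch - every number in it is a hypothetical placeholder, not a measurement, and the split between "shared" and "dedicated" is just the assumption from the post above:

```python
# A back-of-the-envelope sketch of the argument above. Every number is a
# made-up placeholder; the only point is that an upscaling pass running on the
# same shader cores adds directly to the frame time, while a pass that is
# mostly handled by separate dedicated units eats less of the shader budget.

def fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

shading_ms = 12.0             # hypothetical shader work per frame at the internal resolution
upscale_on_shaders_ms = 1.5   # hypothetical cost if the upscale runs on the same cores
upscale_offloaded_ms = 0.3    # hypothetical cost to the frame if dedicated units absorb most of it

print(f"upscale on shared cores:    ~{fps(shading_ms + upscale_on_shaders_ms):.0f} fps")
print(f"upscale on dedicated units: ~{fps(shading_ms + upscale_offloaded_ms):.0f} fps")
```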
We are using Adaptive multisampling (AMS) because it's better and faster with Radeons.
Also, AMD doesn't need an equivalent of DLSS, because there are already ways to achieve the same thing - just lower your settings and use Radeon Boost.
Also no need for a new GPU either. Just lower resolution to 720p and all settings to lowest.
Also worth noting is that games use Nvidia's proprietary libraries for RT instead of relying fully on the DirectX API.
I expect Nvidia might lose a bit of performance when using the standard API, and AMD will get better once they tune drivers, get game-specific optimizations, and implement their own version of DLSS. It will still be slower, but not as bad as it is now.
Yep.
Lowering the resolution is exactly what Nvidia does with DLSS, because its RTX performance is abysmal.
No normal person will ever want DLSS with the Radeon RX 6800 XT, because the frame rates are already in the hundreds of FPS, even at 4K.
AMD Radeon Smart Access Memory Review - 22 Games Tested | TechPowerUp
But the point of DLSS is not to lower settings; it renders the game at a lower resolution and upscales, whilst achieving the same visual quality.
If you applied DLSS and then started lowering settings as well, the frame rate would go even higher.
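For what it's worth, the frame-rate gain follows straight from the pixel counts. A quick sketch (the internal resolutions below roughly match DLSS's Quality and Performance scaling at a 4K output; the ratios are raw pixel counts, not measured speedups):

```python
# Why rendering at a lower internal resolution and upscaling raises the frame
# rate: far fewer pixels get shaded each frame. Ratios are pixel counts only.

native = 3840 * 2160  # native 4K output

internal_resolutions = {
    "native 4K":            (3840, 2160),
    "1440p internal -> 4K": (2560, 1440),  # roughly DLSS "Quality" scaling
    "1080p internal -> 4K": (1920, 1080),  # roughly DLSS "Performance" scaling
}

for name, (w, h) in internal_resolutions.items():
    pixels = w * h
    print(f"{name:22s} {pixels / 1e6:5.2f} MP  "
          f"(shading work ~{pixels / native:.0%} of native)")
```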
DLSS pretends it achieves almost the same visual quality.
It can never achieve the native image quality.
"Nvidia claims this technology upscales images with quality similar to that of rendering the image natively in the higher-resolution"
Deep learning super sampling - Wikipedia
I find it strange that people tend to buy super expensive smartphones yet are very conservative about spending on a new PC rig.
PC rigs are cheaper than cars, for example, and yet provide a high level of entertainment.
The RX 6800 XT is probably the first high-end card that should be bought by everyone - even by those on a budget.
All that matters is what the gamer sees; if I enable DLSS and it looks excellent and no different from 4K, then I'm happy.
Console players don't complain about running below 4K on their "4K" consoles.
That's because console players have been used to things being this way for the longest time. However, I agree that if you are not bothered by a DLSS-rendered image, then there is no problem for you. I don't mind DLSS 2.0; I just don't agree with the notion that it's better than native image-quality-wise, which is the narrative Nvidia is pushing. But if I had a DLSS-capable card, a game offered it, and the frame rate was too poor without it, then sure, I would use the feature. It would be silly not to, IMHO, unless the image quality was severely degraded.
Frame rate is only too poor with RT enabled.
To be fair, I find it really hard to tell DLSS and native apart in Death Stranding. There may be a slight degradation in sharpness, but if I kept looking out for it I could never actually play the game; the difference is just that small. Nevertheless, I disabled DLSS in Death Stranding because the frame rate is more than adequate without it.
Aside from this, AMD is going to have its own version of DLSS, which it would have to do in software, on top of the existing workload, whilst Nvidia has dedicated hardware to offload it to. So I struggle to see how the software approach will outperform dedicated hardware.
As far as RT goes, AMD seems to be slower, given Nvidia has a lead of two years or so. RT performance at 4K without anything like DLSS is a bit of a struggle, more so for AMD at the moment. Can they catch up through drivers? How long will that take? And will they be able to outperform Nvidia? It's not easy to see how.
Ray tracing test for Cold War
tl;dr: the 6800 XT is slower than the RTX 3070's minimum frame rate, and that's before you turn on DLSS.
The 3080 is 70% faster than the 6800 XT.
The 3070 is 50% faster than the 6800.
With DLSS on, the 3080 is 125% faster than the 6800 XT.
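Just to put those quoted percentages on one scale, here is a tiny sketch taking the 6800 XT as the 1.0 baseline. The 40 fps figure is an arbitrary, hypothetical baseline chosen only to make the ratios readable; it is not a benchmark result.

```python
# Normalising the relative figures quoted above to a single baseline.
# The baseline fps is hypothetical; only the ratios come from the post.

baseline_fps = 40.0  # hypothetical 6800 XT figure, for illustration only

cards = {
    "6800 XT (baseline)":      1.00,
    "RTX 3080 (+70%)":         1.70,
    "RTX 3080 + DLSS (+125%)": 2.25,
}

for name, ratio in cards.items():
    print(f"{name:26s} {ratio:.2f}x  -> ~{baseline_fps * ratio:.0f} fps")
```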