The thread which sometimes talks about RDNA2

Status
Not open for further replies.
People are arguing that RT is better on the 6800 XT, or rather that it will be, it seems.

What I'm trying to understand is this: Nvidia has the RT hardware separated from the main CUDA cores.

Also, with DLSS Nvidia again has dedicated hardware, but AMD doesn't have dedicated hardware for an equivalent of DLSS. A fair comparison would therefore be almost like Nvidia running DLSS on its CUDA cores, since AMD will be using its existing cores for it, adding to the workload.
 
We are using adaptive multisampling (AMS) because it's better and faster on Radeons.

Also, AMD doesn't need an equivalent of DLSS, because there are already ways to achieve the same thing: just lower your settings and use Radeon Boost.
 
Yep.

Also no need for a new GPU either. Just lower the resolution to 720p and all settings to their lowest.
 
I can sense despair and denial... A real gameplay video is not enough for a shill, I guess.

You would need two videos of the same system: one with the 6800 XT and another with the 3080. Then you would need to graph the fps every second for the length of each video and find the average fps for each card. Otherwise you are just watching someone play and can draw no conclusion about relative performance.
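The comparison method described above (sample the fps once per second for each capture, then compare the averages) can be sketched in a few lines of Python. The per-second readings below are made-up illustrative numbers, not real benchmark data for either card.

```python
def average_fps(samples):
    """Mean of per-second fps samples over the length of a video."""
    return sum(samples) / len(samples)

# Hypothetical per-second readings from two captures of the same scene.
fps_6800xt = [58, 61, 59, 60, 62, 57]
fps_3080 = [66, 68, 64, 67, 69, 65]

avg_a = average_fps(fps_6800xt)
avg_b = average_fps(fps_3080)
print(f"6800 XT avg: {avg_a:.1f} fps, 3080 avg: {avg_b:.1f} fps")
print(f"Relative difference: {100 * (avg_b - avg_a) / avg_a:.1f}%")
```

Only with an averaged series like this, captured on identical hardware and settings, can two videos support a claim about relative performance.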
 
Also worth noting is that games use Nvidia's proprietary libraries for RT instead of depending fully on the DirectX API.
I expect Nvidia might get slightly worse performance when using the standard API, and AMD will get better once they tune drivers, get game optimizations and implement their own version of DLSS. It will still be slower, but not as bad as it is now.

Nvidia uses DXR, which is why all the RT features in Control work on AMD 6800 XT cards. DXR is part of DX12 Ultimate. DLSS is not an RT feature.

Control runs much slower on AMD using DX12U DXR than on Nvidia, even with DLSS off.
 
People are arguing that RT is better on the 6800 XT, or rather that it will be, it seems.

What I'm trying to understand is this: Nvidia has the RT hardware separated from the main CUDA cores.

Also, with DLSS Nvidia again has dedicated hardware, but AMD doesn't have dedicated hardware for an equivalent of DLSS. A fair comparison would therefore be almost like Nvidia running DLSS on its CUDA cores, since AMD will be using its existing cores for it, adding to the workload.

Yes, AMD will have to use DirectML, with compute shaders doing all the processing instead of Tensor cores.
 
We are using adaptive multisampling (AMS) because it's better and faster on Radeons.

Also, AMD doesn't need an equivalent of DLSS, because there are already ways to achieve the same thing: just lower your settings and use Radeon Boost.

But the point of DLSS is not to lower settings; it renders the game at a lower resolution and upscales while achieving the same visual quality.

If you applied DLSS and then started lowering settings as well, the frame rate would go even higher.
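Some back-of-the-envelope arithmetic shows why rendering at a lower internal resolution raises frame rates: the GPU shades far fewer pixels, then upscales to the output resolution. The internal resolutions below follow Nvidia's published DLSS 2.0 mode scaling (Quality renders at 2/3 of each axis, Performance at 1/2); treat the exact figures as illustrative.

```python
def pixels(width, height):
    """Total pixel count of a resolution."""
    return width * height

native_4k = pixels(3840, 2160)
dlss_quality = pixels(2560, 1440)      # 4K Quality mode renders at 1440p
dlss_performance = pixels(1920, 1080)  # 4K Performance mode renders at 1080p

print(f"Quality mode shades {native_4k / dlss_quality:.2f}x fewer pixels")
print(f"Performance mode shades {native_4k / dlss_performance:.2f}x fewer pixels")
```

Shading 2.25x or 4x fewer pixels is where the DLSS frame-rate gain comes from, before any cost of the upscaling pass itself.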
 
DLSS pretends it achieves almost the same visual quality.
It can never achieve native image quality.

"Nvidia claims this technology upscales images with quality similar to that of rendering the image natively in the higher-resolution"
Deep learning super sampling - Wikipedia
 
All that matters is what the gamer sees; if I enable DLSS and it looks excellent and no different from 4K, then I'm happy.

Console players don't complain about running below 4K on their "4K" consoles.
 
I find it strange that people tend to buy super-expensive smartphones yet are very conservative about spending on a new PC rig.
PC rigs are cheaper than cars, for example, and yet provide a high level of entertainment.

The RX 6800 XT is probably the first high-end card that should be bought by everyone, even those on a budget.

You also need the rest of the system, not only the graphics card. Also, most people will buy a console (or both) rather than a graphics card at those prices. :)
 
All that matters is what the gamer sees; if I enable DLSS and it looks excellent and no different from 4K, then I'm happy.

Console players don't complain about running below 4K on their "4K" consoles.

That's because console players have been used to things being this way for the longest time. However, I agree: if you are not bothered by a DLSS-rendered image, then there is no problem for you. I don't mind DLSS 2.0; I just don't agree with the notion that it's better image-quality-wise than native, which is the narrative Nvidia is pushing. But if I had a DLSS-capable card, a game offered it, and the frame rate was too poor without it, then sure, I would use the feature. It would be silly not to, IMHO, unless the image quality was severely degraded.
 
The frame rate is only too poor with RT enabled.

To be fair, I find it really hard to tell DLSS and native apart in Death Stranding. There may be a slight degradation in sharpness, but if I kept a lookout for it I could never play the game; the difference is just that small. Nevertheless, I disabled DLSS in Death Stranding because the frame rate is more than adequate without it.

Aside from this, AMD is going to have its own version of DLSS, which AMD would have to do in software on top of the existing workload, whilst Nvidia has dedicated hardware to offload it to, so I struggle to see how software will outperform dedicated hardware.

As far as RT goes, AMD seems to be slower, given that Nvidia has a lead of two years or so. RT performance without anything like DLSS at 4K is a bit of a struggle, more so for AMD at the moment. Can they catch up through drivers? How long will that take? And will they be able to outperform Nvidia? It's not easy to see how.
 
That's because console players have been used to things being this way for the longest time. However, I agree: if you are not bothered by a DLSS-rendered image, then there is no problem for you. I don't mind DLSS 2.0; I just don't agree with the notion that it's better image-quality-wise than native, which is the narrative Nvidia is pushing. But if I had a DLSS-capable card, a game offered it, and the frame rate was too poor without it, then sure, I would use the feature. It would be silly not to, IMHO, unless the image quality was severely degraded.

I will be interested in an update on this from @TNA, as he has the card to test it now, and we were never convinced by the "better than native" statements people were coming out with.
 
The frame rate is only too poor with RT enabled.

To be fair, I find it really hard to tell DLSS and native apart in Death Stranding. There may be a slight degradation in sharpness, but if I kept a lookout for it I could never play the game; the difference is just that small. Nevertheless, I disabled DLSS in Death Stranding because the frame rate is more than adequate without it.

Aside from this, AMD is going to have its own version of DLSS, which AMD would have to do in software on top of the existing workload, whilst Nvidia has dedicated hardware to offload it to, so I struggle to see how software will outperform dedicated hardware.

As far as RT goes, AMD seems to be slower, given that Nvidia has a lead of two years or so. RT performance without anything like DLSS at 4K is a bit of a struggle, more so for AMD at the moment. Can they catch up through drivers? How long will that take? And will they be able to outperform Nvidia? It's not easy to see how.

AMD doesn't need DLSS, because you can lower the settings, use Radeon Boost and get the same frame-rate increase while keeping the quality virtually indistinguishable.

TechSpot has articles about Crysis 2 and Crysis 3.

Crysis 2, comparison between Extreme, Very High and High settings:

[Screenshots at Extreme, Very High and High settings]

Framerate, Radeon HD 6870 at 2560x1600:
Extreme: 23 FPS
Very High: 34 FPS
High: 41 FPS

Crysis 2 GPU & CPU Performance Test > Benchmarks: High Performance (techspot.com)
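Using the Radeon HD 6870 numbers quoted above (23, 34 and 41 fps at 2560x1600), the frame-rate gain from each settings tier relative to Extreme works out as follows:

```python
# Framerates from the TechSpot Crysis 2 test quoted above.
results = {"Extreme": 23, "Very High": 34, "High": 41}

for setting, fps in results.items():
    gain = (fps / results["Extreme"] - 1) * 100
    print(f"{setting}: {fps} fps ({gain:.0f}% faster than Extreme)")
```

That is roughly a 48% gain at Very High and a 78% gain at High, which is the frame-rate headroom the "just lower your settings" argument is pointing at.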
 
Ray tracing test for Cold War

tl;dr: the 6800 XT is slower than the minimum frame rate of the RTX 3070, and that's before you turn on DLSS.

The 3080 is 70% faster than the 6800 XT.
The 3070 is 50% faster than the 6800.
With DLSS on, the 3080 is 125% faster than the 6800 XT.


After watching videos on YouTube: 50 fps on the 6800 XT is the same performance as an RTX 2070 SUPER/RTX 2080, and the 6800's 42 fps is slightly faster than the RTX 2060 SUPER's 38 fps.
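Figures like "70% faster" in the post above are computed as a relative speedup from raw fps numbers. The fps values below are hypothetical placeholders, not the actual Cold War results:

```python
def percent_faster(fps_fast, fps_slow):
    """Relative speedup of one card over another, in percent."""
    return (fps_fast / fps_slow - 1) * 100

# e.g. 85 fps vs 50 fps is a 70% speedup
print(f"{percent_faster(85, 50):.0f}% faster")
```

Note the asymmetry: a card that is 70% faster than another does not make the other card "70% slower"; the inverse comparison uses the faster card's fps as the denominator.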
 