
The Medium supports RT for both AMD and Nvidia at launch

Played it today on my Sapphire 6800 XT Nitro+ OC. Was enjoying it, but it crashed on me 3 times, especially when using the out-of-body thing. I tried RT but quickly disabled it as it tanks performance, so it's not usable @ 4K high settings. I just wish they'd release a game that's more stable than a two-legged chair.

I had the same on my Sapphire 6900 XT OC+, to the point that at one section I had to switch to my system with a 3090 because it kept crashing. I still get those major stuttering moments where it just about hangs in there, mainly when loading into a cutscene or splitting into two realities.

Even on Nvidia it is poorly optimised. Glad it's on Game Pass and I didn't stump up £40.
 
Of course Nvidia have invested more silicon in RT; this is why they can offer ~400% better performance over RDNA2. Both cards aren't doing 1 sample/pixel RT, as AMD will be using quarter-resolution RT or quarter the FPS.
Here is the fanboy in you talking nonsense again. Both cards are doing the same spp in every game, and usually the number is 1. As the number of samples increases, the image becomes less noisy (and performance drops a lot), but for a good image you would need something like 5,000 or 10,000 samples per pixel. More samples per pixel means less denoising, but also lower framerates. This shows that you will never get good RT performance no matter how many RT cores you put inside your chip.
The games right now mostly use 1spp for performance reasons and then clean the image through denoising, which is done on the tensor cores on Nvidia and probably on the shader cores on AMD cards. Denoising (especially Nvidia's denoising) is similar to upscaling; you could call it AI cleaning. So look, AMD can do AI cleaning without having dedicated hardware.
The problem is that when Nvidia is involved in a game and puts their denoising code inside it, it will work better with their tensor cores but not so great with anything else. Quake II, Minecraft and that ray tracing feature benchmark are probably the best examples of this.
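
To illustrate what "1spp" actually buys, here is a minimal toy sketch (plain Python; trace() is a hypothetical stand-in, not code from any of these games) of Monte Carlo sampling per pixel: the pixel value is the average of N jittered ray evaluations, and the frame-to-frame noise shrinks only with the square root of N while the cost grows linearly.

```python
import random

def trace(px, py, rng):
    # Hypothetical stand-in for shooting one ray through pixel (px, py);
    # a real tracer would intersect geometry and sample lights/BRDFs here.
    true_radiance = 0.5                            # assumed ground truth
    return true_radiance + rng.uniform(-0.5, 0.5)  # per-sample noise

def render_pixel(px, py, spp, rng):
    # Monte Carlo estimate: average of spp independent ray samples.
    return sum(trace(px, py, rng) for _ in range(spp)) / spp

rng = random.Random(42)
for spp in (1, 4, 100, 10_000):
    # Re-render the same pixel many times to measure how noisy it is.
    estimates = [render_pixel(0, 0, spp, rng) for _ in range(500)]
    mean = sum(estimates) / len(estimates)
    std = (sum((e - mean) ** 2 for e in estimates) / len(estimates)) ** 0.5
    print(f"{spp:>6} spp -> noise ~ {std:.4f}, cost ~ {spp}x")
```

Noise falls as 1/sqrt(spp) while cost rises linearly with spp, which is exactly why games stop at 1spp and hand the rest of the job to a denoiser.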
Again in The Medium, a Microsoft Xbox title where Nvidia have added DLSS and some extra RT features.
Read above: when Nvidia is involved, it means they are also adding at least the denoising code. It is laughable to call it a Microsoft game when Nvidia was involved in its PC launch.
 
Again with the fanboy BS. I've already said I don't care who makes the best card. Now if you want to back up your argument with some data, fine; otherwise you just come across as the angry fanboy.


Again, AMD have only themselves to blame for their poor RT performance.
 
To prove what? That games are mostly using 1spp on both cards? That is very easy to prove. That most of the RT image is produced through denoising, and that you need thousands of samples per pixel for a clean image? That the denoising in Quake II or Minecraft has a huge impact on AMD cards? That is also easy to prove.
Watch this from your beloved Digital Foundry:


What do you want me to prove? Can you prove that "Both cards aren't doing 1 sample/pixel RT as AMD will be using quarter resolution RT or quarter the FPS"? You can't, because that is complete nonsense. :D
 
Again, AMD have only themselves to blame for their poor RT performance.
Good thing you posted the same video I posted, after I posted it. But if you had watched the video before posting it, you wouldn't have written that stupid thing about AMD and 1spp. :D
Of course they can blame themselves, but Nvidia isn't doing too great either. And it is not helping the competition, that is for sure. :D
Here is a shot from that video where the SPP is increased from 1 to 4. Huge performance impact, not only on AMD but also on Nvidia.

They also talk about the denoising in Quake II at the end.
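
As a rough back-of-the-envelope model (my illustrative numbers, not figures taken from the video): if ray traversal and shading dominate, the RT portion of the frame scales close to linearly with the sample count,

$$t_{\text{frame}} \approx t_{\text{raster}} + N_{\text{spp}} \cdot t_{\text{1spp}}$$

so if 1spp of RT costs, say, 8 ms on top of an 8 ms raster pass (16 ms, ~60 fps), then 4spp pushes the same frame to roughly 8 + 4×8 = 40 ms (~25 fps), on either vendor's card.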
 
Well I have been pointing out that AMD's attempt at ray tracing is basically worthless, a bad purchase.

You argue it's not and it's due to Nvidia that AMD are doing so bad.

I show you examples where Nvidia aren't involved and AMD are still performing poorly.

Now you claim it's due to Nvidia's denoiser. So where is AMD's solution? Where are their RT demos? Why didn't Hitman 3 release with RT support? Why can't AMD use RT within CP2077?

If both cards were doing 1 sample per pixel then they wouldn't require denoising. Denoising is required because not every pixel is being sampled, and not every pixel that is sampled uses only one sample. Given that AMD doesn't have Nvidia's RT performance, doesn't have a DLSS competitor, and doesn't have dedicated hardware to run a DLSS competitor, it makes sense that AMD can't perform as well as Nvidia. That leaves AMD in the position of either having to block RT, as in CP2077, or providing lower-quality RT in any title where they allow RT beyond shadows.
 
I posted the same video as an edit, as I had to go looking for it after I posted.

Do you understand what 1 sample per pixel actually means?

Your screenshot proves my point: AMD is ~50% slower than Nvidia. Add on DLSS and you find AMD is another ~50% slower again.

As far as Quake II goes, there is nothing to stop AMD releasing their own RT demos. Where are they?
 
And where did I say that AMD is not bad at ray tracing? You said that AMD is a budget console chip, and I told you that Nvidia is also a budget console chip and that their much better performance is due to investments in upscaling (and denoising), not in RT. I am not sure you can even get much better performance by investing in RT, because just like upscaling vs native, denoising is also much cheaper than using more samples per pixel.
And I told you that AMD is also not working well with Nvidia's denoiser, since that was built for their dedicated hardware. If most RT games are made with Nvidia's denoising code, AMD will lose several milliseconds on every frame they need to render.

What are you asking me? Why Hitman 3, a game sponsored by Intel, didn't release with RT support? Or why CP2077, a game sponsored by Nvidia, doesn't have AMD support? Ask them; why should I answer?
Do you understand what 1 sample per pixel actually means?
Now you are making a fool of yourself. What do you think it means? :D
If both cards were doing 1 sample per pixel then they wouldn't require denoising. Denoising is required because not every pixel is being sampled, and not every pixel that is sampled uses only one sample.

That is a lot of crap right there. What do you think 100 samples per pixel means, then? Do you still need denoising if you use 100 samples per pixel? :)
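
For what it's worth, the standard Monte Carlo result (a textbook fact, not something established in this thread) is that noise falls with the square root of the sample count:

$$\sigma_N = \frac{\sigma_1}{\sqrt{N}}$$

So 100spp only cuts the noise to a tenth of the 1spp level, and you would need 10,000spp to reach a hundredth, which is why even offline renderers running hundreds of spp still finish with a denoise pass.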
 
Those minimum FPS though....

https://i.imgur.com/nOff7Pr.png

Although drivers could improve performance further (for both camps)

But it kind of confirms my suspicions: like what happened with tessellation back in the day, Nvidia in their sponsored titles are more than likely overdoing ray tracing effects to show up AMD's weaknesses.

That is the section where you fly over the landscape. The following is when your character is on the ground:
https://i.imgur.com/aBXliqa.png

 

I remember reading well before launch that Hitman 3 was AMD partnered. My mistake, I was surprised to see it isn't connected to AMD :o

I've been playing with my own voxel-based ray tracing engine and so arguing with myself a lot. This was one of those times I should have gone to bed rather than visit the forums :D
 
Seen this game on YouTube, what a crazy game. It reminds me of the Crysis remaster: runs utter cack even with the 3000 series. Wouldn't waste my money on this; had a chuckle when the frames tanked.

Yeah, the remastered version suffers from the same bad design as the original, where it mainly uses one CPU core.
 
FYI: 1spp means one ray is shot for each pixel. Normally, in non-real-time work, you shoot thousands of rays per pixel and end up with a very clean image that needs very little denoising, because the rays through each pixel go in different directions and together they tell you what that pixel should look like.
For real time that is not an option, so the big performer in RT games is really the denoiser. Even 4spp will kill any video card we have right now, and the future doesn't look too bright. Right now they use 1spp in every game; compare that with thousands of spp and you will know you have to thank the denoiser for your RT games. :)
Both cards are doing the same thing in the same game. How many spp you shoot is written inside the game code, and in games like Control you can even play with this setting and see how much impact increasing the spp has on performance. Nvidia does the job faster, but both cards are budget console chips if we talk about how much RT performance they can offer.
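
To make the denoiser point concrete, here is a minimal sketch (toy Python over a flat 8x8 "scene" of my own invention; real-time denoisers such as SVGF or Nvidia's AI denoiser are far more sophisticated, being temporally accumulated and edge-aware): a denoiser essentially shares samples between neighbouring pixels, so a 1spp image borrows its neighbours' rays instead of shooting more of its own.

```python
import random

W, H = 8, 8
rng = random.Random(0)
TRUTH = 0.5  # assumed flat ground-truth radiance for this toy image

# A "1spp frame": every pixel is a single noisy sample of the true value.
noisy = [[TRUTH + rng.uniform(-0.5, 0.5) for _ in range(W)] for _ in range(H)]

def box_denoise(img, radius=1):
    # Crude stand-in for a denoiser: average each pixel with its neighbours.
    # Sharing a 3x3 neighbourhood is roughly like having 9 samples per pixel;
    # it only works this cleanly because the toy scene has no edges to keep.
    out = [[0.0] * W for _ in range(H)]
    for y in range(H):
        for x in range(W):
            vals = [img[j][i]
                    for j in range(max(0, y - radius), min(H, y + radius + 1))
                    for i in range(max(0, x - radius), min(W, x + radius + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def rms_error(img):
    return (sum((img[y][x] - TRUTH) ** 2
                for y in range(H) for x in range(W)) / (W * H)) ** 0.5

print("1spp RMS error:    ", round(rms_error(noisy), 4))
print("denoised RMS error:", round(rms_error(box_denoise(noisy)), 4))
```

The denoised error comes out close to what ~9spp would give, paid for with blur; that is essentially the trade every current RT game makes.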
 
Since we seem to be on the subject of AMD RT again, due to the Nvidia worshippers constantly bringing up the subject, here is an interesting video which shows the RT performance of the consoles in Control.

The commentary from 6:27 onwards states that the RT performance achieved by the PS5 is roughly on par with a 2070S. This was expected, as we knew from the specs before launch that the PS5 GPU would be at around 2070S/2080 level.

The most interesting thing they mention is that the 6800 XT is underperforming compared to the consoles by quite a lot. The 36CU PS5 GPU is actually getting similar performance to the 72CU 6800 XT at the same settings. They think there are probably some driver or optimisation issues on the PC GPUs which, once fixed, could unlock more RT performance.
 
Part of the performance difference could be explained by the consoles using a different denoiser. At least on PS5 they won't be using Nvidia's, because the PS5 doesn't use DirectX. Radeon cards are forced to run Nvidia's code in Nvidia-sponsored games, and that can be a problem; most likely the code is not optimal for their hardware.
And part of the performance difference can be explained by the consoles using even lower settings.
 
Or maybe it's unrelated to drivers and is an architecture bottleneck. I remember from 2080 Ti testing that ray tracing likes memory bandwidth, and what's the one thing Big Navi is lacking? Is it clock speed? No. Is it TDP? No. Is it the colour red? No. Oh that's right, it's running with mid-range memory bandwidth.
 
It is not bandwidth limited, otherwise you would see it at 8K. I thought it was bandwidth limited too, given the lower 4K performance, but it looks like it can do even 8K as well as the 3090, so bandwidth is not the problem. Somehow that cache is doing magic.
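
For reference, a rough effective-bandwidth sketch (the raw figures are published specs; the ~58% hit rate and ~2 TB/s cache bandwidth are AMD's own marketing numbers for 4K, so treat them as assumptions):

```python
# Raw memory bandwidth from published specs (GB/s)
bw_6800xt = 256 / 8 * 16        # 256-bit bus x 16 Gbps GDDR6    = 512 GB/s
bw_3090   = 384 / 8 * 19.5      # 384-bit bus x 19.5 Gbps GDDR6X = 936 GB/s

# Assumed Infinity Cache figures (AMD marketing claims, approximate):
hit_rate = 0.58                 # quoted hit rate at 4K for the 128 MB cache
bw_cache = 1990.0               # quoted peak cache bandwidth, ~2 TB/s

# Simple weighted model: hits served from cache, misses from GDDR6.
effective = hit_rate * bw_cache + (1 - hit_rate) * bw_6800xt
print(f"6800 XT raw: {bw_6800xt:.0f} GB/s vs 3090 raw: {bw_3090:.0f} GB/s")
print(f"6800 XT effective with cache: ~{effective:.0f} GB/s")
```

On that crude model the 6800 XT's effective bandwidth lands well above the 3090's raw figure, which fits the "cache doing magic" observation; the caveat is that RT's BVH traversal is fairly cache-unfriendly, so the real hit rate under RT is likely lower than the marketing number.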
 