Can you explain what you mean here? You are super vague.
None of these results are a surprise btw. The 6800xt is 2080 ti level of RT performance.
Very dishonest video, since Minecraft RTX is made by Nvidia and the guy who made the video knows it. But he is on a mission to prove that the 6000 series, and especially the 6800xt, is useless.
There are plugins with software RT for Minecraft, and I think you can get better results running software RT on an AMD 6000 series card than running the hardware-accelerated Nvidia build. Of course the card will have trouble running Nvidia-optimized code, and no one really knows how big the performance loss is versus an AMD-optimized version.
I don't think that AMD RT performance is better than Nvidia's, but I think it is way better than these tech gurus want us to believe. Right now they are promoting Nvidia.
I thought the 6800xt was as well, but this looks a bit worse. And of course, no DLSS for AMD.
Minecraft RTX is just using DXR for RT. This is just an example of fully path traced rendering.
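What "just using DXR" means in practice: a minimal sketch, assuming a Windows machine with the D3D12 SDK headers (link against d3d12.lib), of how a title can detect hardware ray tracing. The capability query itself is vendor-neutral; whether a game then turns the feature on per vendor is a separate, application-level decision.

```cpp
// Minimal DXR capability check: ask the installed driver whether it exposes
// hardware ray tracing. No vendor ID is consulted anywhere.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main()
{
    // Create a D3D12 device on the default adapter (whatever GPU is installed).
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    // OPTIONS5 carries the ray tracing tier. TIER_1_0 or better means the
    // driver accepts DXR work (acceleration structure builds, DispatchRays),
    // regardless of whether the GPU is from Nvidia, AMD or anyone else.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                &opts5, sizeof(opts5));
    std::printf("Hardware DXR supported: %s\n",
                opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0 ? "yes" : "no");

    device->Release();
    return 0;
}
```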
It would be nice if people started to think for themselves instead of repeating "it is just DXR" over and over.
Any software can be optimized. Let's say I have a 4-core CPU and you have a 3-core CPU.
I can make a program that uses all 4 cores every cycle, so I can calculate a frame every 5 milliseconds.
You will not be only 25% slower, because you will need 2 cycles for each frame: you will be 50% slower, at 10 milliseconds per frame.
But if we write the same program to use only 3 cores, then your performance will be almost equal to mine.
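A toy model of that arithmetic, with made-up numbers (20 core-milliseconds of work per frame) and a hypothetical frame_time_ms helper, just to make the 25% vs 50% point concrete:

```cpp
// Toy model of the 4-core vs 3-core argument. WORK = 20 core-milliseconds per
// frame is a made-up figure; only the ratios matter.
#include <cstdio>

double frame_time_ms(double total_work_ms, int tasks_per_frame, int cores)
{
    double task_ms = total_work_ms / tasks_per_frame;      // length of one task
    int passes = (tasks_per_frame + cores - 1) / cores;    // ceil(tasks / cores) scheduling rounds
    return passes * task_ms;
}

int main()
{
    const double WORK = 20.0;  // assumed core-ms of work per frame

    // Program written to split each frame across 4 tasks:
    std::printf("4-wide on 4 cores: %4.1f ms\n", frame_time_ms(WORK, 4, 4)); //  5.0 ms
    std::printf("4-wide on 3 cores: %4.1f ms\n", frame_time_ms(WORK, 4, 3)); // 10.0 ms -> half the fps

    // The same work re-split into 3 tasks ("optimized" for the 3-core CPU):
    std::printf("3-wide on 3 cores: %4.1f ms\n", frame_time_ms(WORK, 3, 3)); // ~6.7 ms
    std::printf("3-wide on 4 cores: %4.1f ms\n", frame_time_ms(WORK, 3, 4)); // ~6.7 ms (one core idles)
    return 0;
}
```

The 10 ms case is the "50% slower" scenario above; re-splitting the same work into 3 tasks is what optimizing for the narrower machine means here.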
We see games where the 3090 is behind the 6800xt even at 4K (Valhalla, for example), and we see programs where Intel is still ahead of AMD. And these are old technologies, where a faster CPU/GPU can cut the cost of bad optimization through brute force.
Ray tracing is a new technology, and what we see in games like Minecraft or Control is optimization for Nvidia versus brute force for AMD. Of course the performance on AMD will look bad. But if you optimize the games for RDNA2, then the performance will be much better. I don't think it will beat Nvidia in this generation, but it will be much better.
Look at Cyberpunk. How can they give Nvidia ray tracing and not give ray tracing to AMD if it is just DXR? Not to mention that the PS5 and Xbox consoles are also left behind; they will get ray tracing later next year. The simple fact that Cyberpunk's developers have announced that Nvidia will get ray tracing now and AMD will get it later tells you that it is not just DXR.
For anyone thinking that RDNA2 is going to get better with RT: the only way they are going to do better when the gap is so big is by lowering image quality.
Really? Hard to tell, since they don't run a 2080 Ti in that video.
oh no.. alllll those games with DLSS... oh wait..
The only reason those cards get any meaningful performance in Minecraft is DLSS. If that's a game you heavily play, then Nvidia seems like a good purchase. I don't play it, but my daughter does. How many adults are really playing Minecraft to warrant buying a $500+ card for their kid?
You do realise AMD is attempting some form of DLSS without dedicated hardware?
First, thanks for the slight on my ability to comprehend. I'm sure you're such a magnificent brain surrounded by a limited body, who knows how far you could have gone in life. It's not Minecraft that is important here, but the ability to render a path traced scene at playable frame rates. Don't worry too much, people also have a problem understanding what Quake 2 RTX is doing.
people also have a problem understanding what Quake 2 RTX is doing.
DLSS has always been sold as 'it goes faster/more perf'. Hell no. It doesn't give you more performance, it gives you LESS. But that's what a good marketing department does: it skews the truth.
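A back-of-the-envelope sketch of where the DLSS frame-rate gain comes from; the render scales (roughly 67% of output resolution per axis for Quality, 50% for Performance) are the commonly published DLSS 2 figures, assumed here rather than taken from this thread:

```cpp
// Back-of-the-envelope: pixels actually shaded per frame when DLSS upscales
// to a 3840x2160 output, versus rendering natively. Scales are assumptions.
#include <cstdio>

int main()
{
    const int out_w = 3840, out_h = 2160;            // 4K output
    const double native = double(out_w) * out_h;     // pixels shaded at native 4K

    struct Mode { const char* name; double scale; } modes[] = {
        { "Native",           1.0   },
        { "DLSS Quality",     0.667 },  // ~67% of output resolution per axis
        { "DLSS Performance", 0.5   },  //  50% of output resolution per axis
    };

    for (const Mode& m : modes)
    {
        double pixels = (out_w * m.scale) * (out_h * m.scale);
        std::printf("%-17s %10.0f pixels (%3.0f%% of native)\n",
                    m.name, pixels, 100.0 * pixels / native);
    }
    return 0;
}
```

In other words, the fps gain comes from shading roughly a quarter to a half as many pixels and reconstructing the rest, which is the trade the post above is objecting to.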
First, thanks for the slight on my ability to comprehend. I'm sure you're such a magnificent brain surrounded by a limited body, who knows how far you could have gone in life.
Second, yes, it is Minecraft that is important here. You decided to link a video that doesn't even use cards that we know are on the same level of RT performance. If people are buying an AMD card, they aren't buying it for RT, and if they are, they are making a mistake.
Can you explain what you mean here? You are super vague.
None of these results are a surprise btw. The 6800xt is 2080 ti level of RT performance.
The 6800XT is worse than the 2080 Ti for RT; it is just faster than the 2080 Ti in rasterization, which hides some of the performance drop.
In something like Minecraft, with a heavy RT load and easy rasterization, we can see the true RT differences. The 6800XT is more like a 2070.
There are hundreds of thousands of Minecraft videos posted on YouTube, if not more. There are very few that compare next-gen cards' (Ampere/RDNA2) ability to render via path tracing. The only other title that I know of doing this is Quake 2 RTX, which I have yet to see an RDNA2 card running. The guy himself states that it was the most requested comparison. You may want to keep your head in the sand, but many buying or even just interested in next gen tech are interested in this type of video. Remember this is the graphics card section within a mostly hardware related forum. A place where people come to make sure they are not making a bad purchasing decision.
The guy himself states that it was the most requested comparison. You may want to keep your head in the sand, but many buying or even just interested in next gen tech are interested in this type of video. Remember this is the graphics card section within a mostly hardware related forum. A place where people come to make sure they are not making a bad purchasing decision.
My head isn't in the sand. I don't particularly care about RT right now. I think Nvidia taking on RT with their own implementation is impressive, but they haven't yet managed to get enough developers to adopt it. So far, it's mainly a gimmick. The simple fact that getting heavy RT playable requires upscaling to be enabled should already tell you enough.
IMO there is coming a time when either we have to go full RT, or RT features become an integral part of the GPU. Sustaining two entirely different sections of the GPU is not feasible long term.