
"A vs video without an FPS counter, how pointless. And where did he get an RX 6000 card to benchmark?"

He didn't, double check the title.
If anything, it's Nvidia who should be performing far better at RT than they actually are, given how much additional experience they have had. They are now into their second generation and have managed only a 20% or so increase in real-world performance according to the benchmarks? That's hardly impressive.
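For clarity on what a "20% or so increase" means here, this is just the ratio of benchmark frame rates. A minimal sketch, using made-up frame rates purely for illustration (not real benchmark data):

```python
def generational_uplift(old_fps: float, new_fps: float) -> float:
    """Relative performance increase of a new card over an old one."""
    return new_fps / old_fps - 1.0

# Hypothetical numbers for illustration only: a card averaging 72 FPS
# where its predecessor managed 60 FPS shows a 20% real-world uplift.
print(f"{generational_uplift(60.0, 72.0):.0%}")  # 20%
```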
Not cheating, no... they are using DLSS as a crutch to support ray tracing, but it's extremely clever of them to do so, and I have zero issues with it. It does show that hardware is still not really at the point where it can use RT without workarounds to make it truly playable, though. Again, I will point out that AMD have seemingly beaten Nvidia's first attempt at RT in Turing with their method in RDNA2; I feel this deserves more applause than it is getting, and people seem to be overlooking it.
Yes, they don't match Nvidia's second go at RT, but they beat Nvidia's first go. This can only bode well for the future, no? Especially if they bring out a DLSS competitor, it will aid the uptake of RT as a whole. Sure, many will still turn it off (myself being one, unless I want to take screenshots).
Yup. Granted, it's a sneakily titled video, as there isn't really any comparison of AMD vs Nvidia in RT, but the underlying message is that games like Legion use DXR, which is universally compatible and usable to its full extent on any RT hardware it is played on, as long as that hardware supports DX12 Ultimate. You should expect the exact same ray-tracing visuals regardless of which team's GFX card you are running.
"All leaks so far show RT performance better than the 2080 Ti but worse than the 3080, which is not bad at all for anyone that cares about RT."

It's not hard to find benchmarks showing Quake 2 RTX at 1440p running at greater than 60 FPS on a 3080. That's a full-scene, denoised path-tracing example, so pretending that the hardware is not ready makes no sense.
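For context on what "greater than 60 FPS" means in practice, here is a minimal sketch of the frame-time budget arithmetic behind such a claim (the numbers are generic, not taken from any benchmark):

```python
def frame_time_ms(fps: float) -> float:
    """Per-frame time budget in milliseconds at a given frame rate."""
    return 1000.0 / fps

# At 60 FPS the GPU has roughly 16.7 ms to trace, denoise and shade
# each frame; going over that budget drops the frame rate.
print(f"{frame_time_ms(60):.1f} ms")  # 16.7 ms
```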
I can't find any demos of AMD's RT performance, but as it uses shared hardware, it may not even be meeting the disappointing 2080 Ti levels.
"We are discussing AMD's implementation?"

No worries, clearly I misread or misunderstood your post.
"How is it that two companies spend years building their own architectures, independently of each other, and end up at pretty much the same speed?"

Laws of physics, I guess. They both have experienced engineers working within the same limitations of their field.
"Did AMD have enough time to check the speed of Nvidia's cards, and add just enough compute units to equal them? Or is something else going on?"

Internal leaks: the two companies know a shocking amount about each other and what goes on internally. That's what the rumour mill says, anyway.
"When will we see reviews, at release date?"

Nobody knows, although Jay said he's been notified to expect deliveries soon, and also that he does not have much time "at all" (his words) to do testing, run benchmarks, and prepare a video. So I imagine we'll see reviews in a week or so, tbh.
Yup.
I do wonder how this is going to play out going forward. I get the impression that AMD's implementation of DXR is more directly applicable than Nvidia's homebrewed implementation layered on the DXR API. I.e. will Nvidia cards need extra tuning to take full advantage, while AMD's just works more out of the box?
We will have to wait and see.
Knowing Nvidia and past events with driver updates and optimisations, I imagine they have been riding the RTX wave all this time while working on DXR optimisations behind the scenes. When the time comes to switch, because game devs will naturally use DXR just to keep everything consistent between next-gen consoles and PC and to make their own lives easier and less costly, Nvidia will release a new driver that ramps up performance, just as they have done many times in the past.
I know we haven't even had RDNA2 yet, but RDNA3 vs the 40x0 series is going to be awesome, because I doubt Nvidia will want to get Intel'ed by AMD, so they will probably be quite aggressive with their pricing and performance.
Nvidia has its own devkit (100% DXR-compatible and mostly open source), which may have some algorithmic optimisations best suited to its architecture.
AMD has its work cut out.