What are 'tensor cores'? Sure, it's some fancy name for dedicated parts of the GPU that perform RT and ML tasks, but why does AMD or anyone else NEED to have them? Are you saying RT cannot be done using the stream processors in AMD GPUs?
You do understand that RT is just a lot of mathematical calculations which can be done by any processing unit, right? There is RT in Crysis Remastered that is done in software, and that seems fine. Here's a demo of CryEngine software RT:
On Tensor cores:
https://www.youtube.com/watch?v=yyR0ZoCeBO8
You need to have them because real-time ray tracing is so taxing that general-purpose hardware can't do it in real time; it's way too slow. You need dedicated hardware that can perform ray tracing "operations" much faster, the trade-off being that it can only do that one type of operation; it can't be used for general-purpose calculations in the same way. (Strictly speaking, Nvidia's RT cores handle the ray intersection work, while the tensor cores accelerate the ML denoising and DLSS upscaling that make the results usable.) Doing ray tracing operations on general-purpose hardware isn't the problem; you can even do it on a regular CPU, and there have been demos of this over the years. The issue is that general-purpose hardware is so monumentally slow at it that it's essentially useless for real-time gaming. That basically comes down to the fact that ray tracing is far more demanding than rasterization, to a degree few people really appreciate.
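To underline the "it's just math" point, here's a minimal sketch in C of the fundamental ray tracing operation, a ray/sphere intersection test via the quadratic formula (compile with something like gcc ray.c -lm). It's purely illustrative and not taken from any of the demos linked above; any processor can run it, just not billions of times per second:

```c
/* Minimal sketch: the core "RT op" is just arithmetic any
 * processor can do. Standard ray/sphere intersection via the
 * quadratic formula; illustrative only. */
#include <math.h>
#include <stdio.h>

typedef struct { double x, y, z; } Vec3;

static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3 sub(Vec3 a, Vec3 b)   { return (Vec3){a.x-b.x, a.y-b.y, a.z-b.z}; }

/* Returns distance along the ray to the nearest hit, or -1 on a miss. */
static double ray_sphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
    Vec3 oc = sub(origin, center);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b*b - 4*a*c;              /* discriminant */
    if (disc < 0) return -1.0;              /* ray misses the sphere */
    return (-b - sqrt(disc)) / (2.0 * a);   /* nearest intersection */
}

int main(void) {
    Vec3 eye = {0, 0, 0}, dir = {0, 0, 1};  /* ray pointing down +z */
    Vec3 sphere = {0, 0, 5};
    double t = ray_sphere(eye, dir, sphere, 1.0);
    printf("hit at t = %f\n", t);           /* prints: hit at t = 4.000000 */
    return 0;
}
```

A real renderer repeats tests like this millions of times per frame, plus shading and bounce rays, which is exactly where general-purpose hardware falls over.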
With hybrid rasterization and ray tracing, the load you put on your GPU obviously depends on how much ray tracing you're doing. You can cast literally just a few rays to do something like calculate the bounces of an audio source for better directional audio, and the cost of a few rays is tiny. Or you can do real-time full-scene ray tracing, which obliterates even modern GPUs: even the RTX 3000 range, with all its dedicated ray tracing hardware, can only manage full-scene ray tracing at something like 720p@30fps, and only with DLSS upscaling.
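A rough back-of-envelope sketch of that gap; the rays-per-pixel and audio-ray figures are assumptions for illustration, not measurements from any real game or engine:

```c
/* Back-of-envelope sketch: rays per second needed for full-scene
 * ray tracing at 720p@30fps vs. a handful of audio rays.
 * All figures are illustrative assumptions, not measured numbers. */
#include <stdio.h>

int main(void) {
    long long w = 1280, h = 720;      /* 720p */
    long long fps = 30;
    long long rays_per_pixel = 4;     /* assumed: primary ray + a few bounces */
    long long full_scene = w * h * fps * rays_per_pixel;
    long long audio_rays = 64 * fps;  /* assumed: a few dozen rays per frame */
    printf("Full-scene RT : %lld rays/sec\n", full_scene);  /* 110,592,000 */
    printf("Audio RT      : %lld rays/sec\n", audio_rays);  /*       1,920 */
    return 0;
}
```

Roughly 110 million rays per second versus about two thousand; that's the difference between effectively free positional audio and a GPU on its knees.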
Some people still think ray tracing is an Nvidia-only thing, so it doesn't surprise me. ^
Literally never said that, but thanks for putting those words in my mouth. It's not an "Nvidia-only thing"; in fact, software ray tracing has been around for an extremely long time. Whether or not you can render a frame with ray tracing is NOT the issue; it's whether you can render at >=1080p @ 60fps in real time. And on general-purpose hardware the answer is no. Not even slightly close.
In the hardware world, when you use transistors to do logic/math, general-purpose calculation is inefficient. Certain more complex operations can be done much faster using the same number of transistors, but the trade-off is that those dedicated transistors can only do those specific operations.

This happens all the time; a good example is AES encryption. For a long time it was expensive to do with general-purpose calculations. If you wanted FDE (Full Disk Encryption) on your SSD/HDD with an AES-based cipher, so everything written to or read from disk was encrypted/decrypted, it had an insane hit on the CPU, because the CPU was inefficient at AES via general-purpose calculations. Then Intel (first) put a bunch of transistors reserved for AES operations onto their chips so it could be "done in hardware", and suddenly AES became so cheap on the CPU that you could just encrypt all your disks and it was essentially free, which is what I do across all my disks. But the trade-off is that those transistors are taken away from the pool used for general-purpose calculations.
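To make the "done in hardware" point concrete, here's a minimal sketch, assuming an x86 CPU with AES-NI and GCC or Clang (compile with -maes). It runs a single AES round as one dedicated instruction; the key is a dummy placeholder, not a real AES key schedule:

```c
/* Minimal sketch: one AES encryption round via AES-NI.
 * Requires an x86 CPU with AES-NI; compile with: gcc -maes aesni_demo.c */
#include <stdio.h>
#include <wmmintrin.h>  /* AES-NI intrinsics */

int main(void) {
    /* Detect AES-NI at runtime (GCC/Clang builtin). */
    if (!__builtin_cpu_supports("aes")) {
        puts("No AES-NI: would have to fall back to slow software AES.");
        return 1;
    }
    __m128i block    = _mm_set1_epi8(0x42);  /* 128-bit plaintext block */
    __m128i roundkey = _mm_set1_epi8(0x13);  /* dummy round key, not a real schedule */
    /* One full AES round (SubBytes+ShiftRows+MixColumns+AddRoundKey)
     * executes as a single instruction, instead of the dozens of
     * table lookups and XORs a pure-software round needs. */
    block = _mm_aesenc_si128(block, roundkey);
    puts("AES round executed in hardware.");
    return 0;
}
```

The same round done with general-purpose instructions burns many times the cycles, which is exactly why FDE went from "insane CPU hit" to essentially free once those dedicated transistors showed up.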
Ray tracing is like that, only the performance ramifications are off the chart by comparison. If you want to do RT ops in any kind of sensible, real-world way, you need an equivalent of RT/tensor cores in your silicon. If you don't, and you just go full general purpose (in the case of GPUs, "general purpose" really means rasterization more than anything else), then you won't get real-time performance. This is why, when AMD boast support for RT via DX, that's lovely; all you need is the driver paths to support that. But when they talk about doing things like positional audio (which requires very few rays) instead of visual effects, that's a big red flag that they're not investing in dedicated ray tracing transistors in the console GPUs. I can't know that for sure; that's just my bet, based on the available information. If we see hybrid RT effects in the next-gen consoles similar to those we see in, say, BFV with RTX on, I'd be extremely surprised.