No, I didn't say ray-tracing, did I? I mentioned the Tensor cores, which are the AI-accelerating cores that nV has tried to re-purpose in a consumer gaming card by inventing DLSS.
Ray-tracing is entirely separate. You don't need Tensor cores for it. As you yourself say, RT has been done in software for decades. Heck, I remember RT demos in the early 90s. You just had to wait several hours/days for a single scene to render.
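For a sense of how little magic is involved, here's a minimal sketch (a hypothetical toy, nothing to do with any real renderer or with nV's hardware): a pure-CPU ray tracer in plain C that fires one ray per pixel at a single sphere and prints the hits as ASCII. It's the same brute-force arithmetic those 90s demos were doing, just on a scene small enough to finish instantly.

/* Minimal software ray tracing sketch: one sphere, one ray per pixel,
   printed as ASCII. No Tensor or RT cores needed - just arithmetic. */
#include <stdio.h>
#include <math.h>

typedef struct { double x, y, z; } Vec3;

static Vec3   vsub(Vec3 a, Vec3 b) { return (Vec3){ a.x - b.x, a.y - b.y, a.z - b.z }; }
static double vdot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Distance along the ray to the sphere, or -1.0 if the ray misses. */
static double hit_sphere(Vec3 centre, double radius, Vec3 origin, Vec3 dir)
{
    Vec3 oc = vsub(origin, centre);
    double a = vdot(dir, dir);
    double b = 2.0 * vdot(oc, dir);
    double c = vdot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;
    return disc < 0.0 ? -1.0 : (-b - sqrt(disc)) / (2.0 * a);
}

int main(void)
{
    const int W = 64, H = 32;            /* tiny ASCII "framebuffer" */
    Vec3 sphere = { 0.0, 0.0, -3.0 };    /* one sphere in front of the camera */

    for (int j = 0; j < H; j++) {
        for (int i = 0; i < W; i++) {
            /* Ray from a camera at the origin through pixel (i, j). */
            Vec3 origin = { 0.0, 0.0, 0.0 };
            Vec3 dir = { (i - W / 2) / (double)H, (H / 2 - j) / (double)H, -1.0 };
            putchar(hit_sphere(sphere, 1.0, origin, dir) > 0.0 ? '#' : '.');
        }
        putchar('\n');
    }
    return 0;
}

Build it with something like cc raytrace.c -lm and it spits out an ASCII sphere. Scale that loop up to millions of rays, multiple bounces, shadows and reflections, and you get the hours-per-frame render times of the era - dedicated RT cores only accelerate the intersection maths, they were never a prerequisite for it.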
The point is the Tensor cores are there because nV's focus is not just gaming now. They are big in the AI space, and they wanted to sell/re-purpose compute-heavy cards for the consumer/gaming market.
Don't forget there are Tensor cores and RT cores.
Three different types of cores. CUDA + Tensor + RT.
Agreed - even the addition of RT cores wasn't done entirely for gaming, it was done to get into the VFX market. People forget that at the Turing launch, Nvidia also launched a full line of commercial cards with the fully enabled GPUs, and started talking about a $200 billion VFX market - even investment sites were talking mostly about VFX and not gaming. Most of the initial talk at the Turing reveal was about VFX and not so much about pure gaming.
This explains why at launch you hardly saw any games with RT, why it took months for some of the games which had it to get it optimised, and why DLSS was so hit and miss. Even developers had poor access to cards until launch - if these were all features meant only for gaming, most of them would have had prototype cards to try before launch and would have hit the ground running. The fact that the software stack was so poorly developed is telling - I suspect Turing was pulled forward because AMD was meant to launch Navi at the start of last year according to rumours, but didn't.
You also need to consider this - Nvidia would have had to run three lines: one for gaming, one for RT use in the VFX market and one for DP computing. They basically combined the first two, which saves on R&D, which is a significant expenditure. Otherwise it makes little sense to sell such huge GPUs for gaming use - traditionally, when we had large gaming GPUs it was primarily for cards which pulled dual duty, i.e. gaming and commercial usage.
It's also a typically pompous statement to claim that other people 'don't understand'. Let's face it, if AMD had come out with it first they would have been trolled to the rafters as another over-hyped feature that doesn't quite deliver yet. Just because Nvidia released it doesn't make it immune to criticism - they basically rushed it so marketing could stamp it everywhere for advertisers to adopt.
People do understand, yet a lot of dorks (us included) on hardware enthusiast forums don't understand that graphics are only one part of a game. 75% of computer gaming revenue is from consoles and phones, which don't push graphics, and a large percentage of the remaining 25% is from MMOs and twitch shooters which don't push graphics either, and many have cartoonish art styles that can scale down to slower systems. Even on Steam, look at how many people have anything better than a GTX 1070. Oh, wait, that's because most PC gamers are still constrained by price. On this forum most people and their friends will have better than average hardware and upgrade more often. It's like going on a watch collecting forum... how many will be talking about £20 Casio watches on there?
If graphics were the single most important thing, why didn't Crysis sell more copies, etc? Graphics are important, but it's quite telling that most of the computer gaming market isn't driven forward purely by graphics. It's driven forward by games which don't even look the best with current rasterised techniques but are apparently fun or involving in some way. Even a game such as The Witcher 3 sold well because it is a good game, and Cyberpunk 2077 will live or die on how the gameplay and story hold up. Nice graphics are the icing on the cake, but they don't make the game. Vast numbers of people who ran Crysis or The Witcher 3 probably never ran them at max settings either.
Most of us on forums like this are hardware enthusiasts and dorks who like talking about hardware and the technical aspects of games tech, but most gamers don't care. If they cared so much about the technically best platforms, with the nicest looking games running the best, phones and consoles would not be making so much money from computer games. They are inferior to a decent gaming PC.
People also don't understand that there is a huge mass of graphics cards which can't do raytracing very well, and developers, with the increasing dumbing down of games, want to expand their player base and sell more games. Are they going to just ignore the fact that so many cards are poor at raytracing?
No, they are not. So for the immediate future it is going to be another "max graphics option" and not essential to most games - and how many times have we had these features integrated at the behest of AMD or Nvidia, who provide support in some way? AMD and Nvidia need to keep finding ways to sell new graphics cards. Raytracing will become a standard part of PC graphics when a critical mass of graphics cards can do it OK, and that will mean even £150 ones. If the new consoles can do it OK, that will be a step towards this too. It's been pretty much the same for any of the new features we have seen introduced in the last 20 years.