Ideally you don't want dedicated hardware inside your chip for anything. My bet is that within three generations we won't even have dedicated hardware for RT. It is a handicap to devote part of the chip to something that isn't used all the time, instead of adding more compute units that are working all the time.
That does not mean the tensor cores in Ampere are a disadvantage. Since the performance is equal to Big Navi's, they are an advantage. But an Ampere with more CUs instead of tensor cores would have been an even better option.
I said before that AMD is giving us less hardware for almost the same amount of money, and it is true. I am not saying that Big Navi cards are bad or that they are "console chips", as our friend calls them. They are a big leap over the previous AMD generation and can handle almost every game you throw at them, as long as it is not sponsored by Nvidia. There is nothing new under the sun.
The problem with RT, and with every technology Nvidia has been promoting heavily, is that if you focus too much on graphics, you can miss the other things a game needs in order to be great. Then you have Days Gone, which was not the most celebrated game on PS4 by any means, coming to PC with better ratings than any RT game made so far. That should tell us something about the quality of PC games.
Ideally you have as little fixed-function hardware as possible, but RT doesn't lend itself well to general-purpose compute: having dedicated hardware for it is always going to trump doing it on the shaders by quite a margin, unless there is some radical new hardware design.
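To make that trade-off concrete, here is a minimal CPU-side sketch (purely illustrative, in Python; the function name and parameters are my own, not any vendor API) of the Moller-Trumbore ray-triangle intersection test. This is the kind of arithmetic every ray has to pay for on general-purpose shader ALUs, while RT cores run an equivalent test (plus BVH traversal) in fixed-function silicon:

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore test: return hit distance t, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:                 # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(origin, v0)
    u = dot(s, p) * inv                # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(direction, q) * inv        # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv               # distance along the ray
    return t if t > eps else None
```

Note this is only the per-triangle test; a real ray tracer also walks an acceleration structure to decide which triangles to test, and that divergent, pointer-chasing traversal is exactly what maps poorly onto wide SIMD shader hardware.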