"It's physics. Do you like to have realistic gravity in games?"

I don't think that argument works. Games tweak their physics to get the gameplay right as a priority, not to achieve the most realistic physics (unless it's a sim game). One game that comes to mind is Rocket League: its in-game gravity is 6.5 m/s², whereas for reference Earth's gravity is 9.81 m/s².

People talk about "liking" raytracing far too much.
It's a graphics setting which can, if well implemented, look good. It comes with a godawful performance hit which is the primary dealbreaker.
What's there to like or not like beyond the bleeding obvious?
Incidentally, this extends to graphics as well. The goal should be for games to look "good"; that doesn't mean looking realistic. I still think the main benefit of RT will be reducing the work artists need to do to get their games looking "good".
Regarding the OP's questions: as two previous posters alluded to, RT cores have been amazing for offline render engines, and they are ready to go in that industry. For games, they are still underpowered and hamper what artists can achieve, as they need to compromise because of the limited number of rays that can be cast per frame. I think we need a huge increase in RT core performance (at least 3× the 3090's) before it becomes a must-have feature for the majority of gamers. Currently the trade-off between performance loss and visual gain is skewed too heavily towards performance loss.
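To give a feel for the gap, here's a rough back-of-envelope sketch of per-frame ray budgets. All the sample counts, bounce depths, resolutions and frame rates below are assumed illustrative figures, not measured specs of any GPU or engine:

```python
# Back-of-envelope comparison of a real-time ray budget vs. an
# offline-render-style budget. Figures are illustrative assumptions.

def rays_per_second(width, height, spp, bounces, fps):
    """Rays traced per second, assuming each sample per pixel
    traces roughly (1 + bounces) rays."""
    return width * height * spp * (1 + bounces) * fps

# Game-style: 4K at 60 fps, 1 sample per pixel, 2 bounces,
# relying heavily on denoising.
game = rays_per_second(3840, 2160, 1, 2, 60)

# Offline-style: 4K at 24 fps equivalent, 512 samples per pixel,
# 4 bounces (still modest by film standards).
offline = rays_per_second(3840, 2160, 512, 4, 24)

print(f"game:    {game / 1e9:.1f} Gigarays/s")
print(f"offline: {offline / 1e9:.0f} Gigarays/s")
print(f"gap:     {offline / game:.0f}x")
```

Under these assumed numbers the offline budget is hundreds of times the real-time one, which is why artists have to compromise so heavily on rays per frame, and why even a 3× jump in RT throughput only closes part of the gap.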