At the Eurogamer expo last November I had the opportunity to very briefly try out CP2077 with RT on, on an all-singing, all-dancing PC.
The thing that left the strongest impression on me was how much noise RT introduced to the scene when not using DLSS. I do remember flicking between RT on and RT off, but I don't remember it blowing me away. However, it was a brief spin on a showroom floor.
I played through Ratchet &amp; Clank: Rift Apart and noticed the exact same noise issue as I saw in CP2077. I gave RT a good go and kept it on for a good chunk of my playthrough, and I put up with the 60fps. I disabled RT on a certain level because the framerate nosedived, but I didn't feel like I was missing out by not having RT enabled.
It's a nice-to-have, but when push comes to shove I disable it.
I'm pretty certain that no coverage of RT in video games, that I have seen, has ever touched on this noise issue. It seems like the devs do not bother to denoise the output; I'm guessing it is because they simply do not have the performance budget to do so. I do not fault people who think that DLSS/FSR looks better than native, since those technologies do seem to act as a denoising step. I wonder if those people are noticing the noise and its clean-up when enabling DLSS/FSR but don't realise that that is what is happening.
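To give a feel for why the raw RT output is grainy, here is a toy Monte Carlo sketch in Python. This is not how any engine's pipeline actually works, and all the numbers and names are made up; it just shows that a per-pixel estimate built from a handful of random samples has visible error, and that the error shrinks slowly as the sample count grows, which is why real-time budgets leave you at the noisy end and a separate denoise/upscale pass has to hide the grain.

```python
import random
import statistics

random.seed(42)

def pixel_estimate(n_samples, true_radiance=0.5):
    # Monte Carlo estimate: the average of noisy samples scattered
    # around the true value. Each "sample" stands in for one traced ray.
    return statistics.fmean(
        random.uniform(0.0, 2.0 * true_radiance) for _ in range(n_samples)
    )

def frame_error(n_samples, n_pixels=1000, true_radiance=0.5):
    # Mean absolute error over a row of pixels: the visible "grain".
    return statistics.fmean(
        abs(pixel_estimate(n_samples, true_radiance) - true_radiance)
        for _ in range(n_pixels)
    )

# Error shrinks roughly like 1/sqrt(samples), so halving the noise
# costs four times the rays. Real-time RT can afford only a few
# samples per pixel, hence the grain without a denoiser.
for n in (1, 4, 16, 64):
    print(n, round(frame_error(n), 3))
```

Running it prints the average per-pixel error falling as the sample count rises, which is exactly the trade-off a denoiser (or DLSS/FSR acting as one) is papering over.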
For anyone who is wondering what I mean when I say noise: in the link below there is an image at the top of the page that shows a noisy image and a denoised version. There are different levels of noise that can be introduced to an image, and that image shows a particularly harsh example.