Probably because they need to blend the RT and rasterised effects more carefully to get the "look" the developers want?
I think once RT performance starts doubling per tier, things will start to get better, IMHO. So either next generation or the generation after. The biggest issue is RT performance on entry-level and mainstream tier dGPUs.
For example if you look at Cyberpunk 2077 with RT on:
https://tpucdn.com/review/msi-geforce-rtx-3050-gaming-x/images/cyberpunk-2077-rt-1920-1080.png
The RTX 3060 Ti is 54% faster at 1080p than the RTX 2060 Super, its prior-generation upper mainstream/entry-level enthusiast Nvidia counterpart. The RTX 3060 is only 34% faster than an RTX 2060!
However, the RTX 3080 is 72% faster than the RTX 2080 at 1080p, and 81% faster at 1440p. The RTX 3090 is 63% faster than an RTX 2080 Ti at 1440p. This is part of the issue - many here own the faster dGPUs, which have shown the greatest generation-on-generation RT performance increases, especially at higher resolutions.
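As a side note, the "X% faster" figures quoted above are just relative average-FPS ratios from the linked chart. A quick sketch shows how they're derived (the FPS values here are made-up placeholders for illustration, not TPU's actual measurements):

```python
def percent_faster(new_fps: float, old_fps: float) -> float:
    # Relative speedup of the newer card over the older one, in percent:
    # e.g. 60 fps vs 39 fps -> (60/39 - 1) * 100 ~= 54% faster
    return (new_fps / old_fps - 1.0) * 100.0

# Hypothetical numbers only, chosen to mirror the ~54% gap discussed above
print(round(percent_faster(60.0, 39.0)))  # prints 54
```

Note this is not the same as the percentage drop going the other way: the older card would be about 35% slower, which is why "faster than" and "slower than" percentages never match up.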