The last game I've seen with full RT and performance good enough to justify it, with none of the usual accompanying issues (lagging GI, noise all over the place, a very blurry image caused by TAA on most effects, etc.), was Metro LL - that IS the benchmark for me of what's possible. More modern games just turn on RT/PT and that's it. Are they technically better? No, in most cases it's a regression in my eyes - there's no better physics, no better AI, no more realistic RT effects, there are just much higher system requirements. Did RT suddenly become more computationally expensive, or is it a case of lazy devs using an off-the-shelf solution without actually doing any work themselves?

From NVIDIA's article about said Metro: "Ordinarily, this level of detail would require gigabytes of system memory and GPU VRAM, but thanks to a highly efficient streaming system Last Light's world uses less than 4GB of memory, and less than 2GB of VRAM, even at 2560x1440 with every setting enabled and maxed out." I don't think I need to add more here. We're not moving forward with graphical fidelity, we're moving forward with dev laziness, with people like DF excusing it instead of pushing for better development, and blaming gamers for having hardware that's too weak on top of that. The audacity of that, when we already know current hardware is fast enough to handle it - it's just horribly utilised. Then the 5000 series arrives and what will we get? Even fewer FPS in the next games, with the same overall fidelity, I can bet. Because 30 FPS is more cinematic?
Can you imagine a modern game that looks like Metro LL running on 2GB of VRAM?
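For anyone wondering what a "streaming system" with a hard VRAM budget even means in practice, here's a rough, purely illustrative sketch - this is NOT 4A's actual streamer, and the class, texture names and sizes are all made up - of the basic idea: keep resident textures under a fixed cap and evict the least-recently-used ones when something new needs to come in.

```cpp
// Hypothetical sketch of a VRAM-budgeted texture streamer with LRU eviction.
// Purely illustrative; real engines stream per-mip, asynchronously, etc.
#include <cstdint>
#include <cstdio>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

class TextureStreamer {
public:
    explicit TextureStreamer(std::uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Request a texture; evict LRU entries until the new one fits the budget.
    void request(const std::string& name, std::uint64_t sizeBytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {                  // already resident: mark as most recently used
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        while (used_ + sizeBytes > budget_ && !lru_.empty()) {
            const auto& victim = lru_.back();      // evict least recently used
            used_ -= victim.second;
            index_.erase(victim.first);
            lru_.pop_back();
        }
        lru_.emplace_front(name, sizeBytes);       // "upload" the new texture
        index_[name] = lru_.begin();
        used_ += sizeBytes;                        // (sketch ignores the case of a single oversized asset)
    }

    std::uint64_t usedBytes() const { return used_; }

private:
    using Entry = std::pair<std::string, std::uint64_t>;
    std::uint64_t budget_;
    std::uint64_t used_ = 0;
    std::list<Entry> lru_;                         // front = most recently used
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
};

int main() {
    TextureStreamer vram(2ull << 30);              // hypothetical 2 GB VRAM budget
    vram.request("station_albedo", 512ull << 20);  // 512 MB
    vram.request("tunnel_normal",  768ull << 20);  // 768 MB
    vram.request("mutant_albedo",  900ull << 20);  // 900 MB -> forces an eviction
    std::printf("resident: %llu MB\n", (unsigned long long)(vram.usedBytes() >> 20));
}
```

Point being: staying under 2GB isn't magic, it's just work the devs have to actually do.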