Depends on what you're doing. Suddenly having a 6/12, 8/16 or even a 12/24 CPU at a good price, the same as previous 4/8 parts, can be seen as a pretty big leap in apps that know what to do with the extra cores. But if we're sticking strictly to gaming, then it should also be mentioned that CPU upgrades are needed far less often than GPU ones - so big steps happen naturally as you finally decide to go from a very old part to a new and shiny one, when it's finally required.
Yup, I'm aware of your thoughts on the matter. While I am excited about what RT can do, I'm far more reserved about how it's used today - most likely due to a lack of hardware power as well. But that DLSS 2.0 for sure looks great!
I think it's more of a pick-your-battles scenario, depending on the type of game you're making.
The Vanishing of Ethan Carter Redux looks, to me, significantly better than
Metro Exodus in the Taiga level (or whatever it's called), where the scenery is roughly the same. Of course,
The Vanishing of Ethan Carter Redux also performs far better than Metro, even when Metro runs without RT.
Then, talking about multiple lights: back when the whole low-level API thing was a hot topic, one of the touted benefits was that it would allow a greater number of shadow-casting lights per scene than the 4-5 usually used in DX11 before performance breaks down. That's another, cheaper way to improve image quality, but it never quite took off, even though DX12 is far more widely supported by hardware than RT is.
People are saying that the UE5 demo is just a demo and lacks all the other parts of a full game, but Tomb Raider games have plenty of scenes just like that - walking around, "doing nothing" - so it is part of a game. nVIDIA's demo, not so much. And also about picking your battles: having better assets, such as the ones from the Heretic, Blacksmith or The Book of the Dead demos, can in my opinion contribute more to photorealism than allocating those hardware resources to RT.
Anyway, it will be interesting to see what the future holds, and hopefully nVIDIA will manage to convince developers to use RT to a far greater and better extent than they did GPU PhysX.