Considering raw raster performance, if the TechPowerUp article earlier is anything to go by, it's up to 59 fps in Cyberpunk with DLSS turned off, and then there's the increased latency to factor in too. Reading that, it doesn't seem like a big jump in raw raster performance over my 3080 Ti, let alone over a 3090 Ti, which would be a bit higher still.
Is that difference worth the cost of a 4090, plus a PSU upgrade? That can only be answered on an individual basis, I suppose. To me it doesn't seem cost-effective. DLSS is the future, but the DLSS 2 core components will continue to be developed since they form a subset of DLSS 3 anyway, so both will continue to improve. I think I would rather have the higher fps and lower latency when those options are laid on the table.
In my example above I can feel the latency difference between no DLSS, DLSS Quality, and DLSS Performance. You should be able to as well, as those latency figures are quite big jumps. Granted, I do not have a Reflex-compatible mouse, so I cannot speak to how that component of the technology changes the experience.
Once 40XX prices are slashed this view might change, though, as cost versus relative performance is a big consideration for many, as already mentioned in the thread; regardless of affordability, it's more about the principle.
I do however want to see the same test from a 4080 16GB...