On the topic of DLSS, it feels like Nvidia has this great compute/AI silicon that they want to sell into the desktop GPU space, and they're shoehorning in something that gives a mild benefit. My gut feeling is that those extra (multi-billion) transistors would be more effectively spent on compute units, ROPs, etc.
The whole AI thing is so hot right now, though, so you've got to get it into the product because buzzwords... It certainly wasn't the case when I used neural networks in my thesis circa 2008, a flippin' pain in the ass for a mild benefit!