I should add that the 5700 XT went from roughly RTX 2070 equivalent to 2070 Super–2080 territory in many games, so IIRC it was in the overtaking lane against its current-gen rivals too, not just the 1080 Ti.
Arguments about power draw come down to how much you play and what you're playing. Something like the 7900 XT is powerful enough that, with some games on my 4K TV locked at its max refresh (60 Hz), it draws only ~160 W. My old 5700 XT couldn't lock that FPS, and would sit at its max of 225 W trying to.
The main examples of power efficiency making a real difference in cost come from NVidia's own graphs: at 50p per kWh, playing 30 hours per week of games that max your card out at 100%, you'd supposedly be better off by something like £40 per year on NVidia versus AMD (I don't recall the exact details and can't find the slide). I wish I could play AAA games that much, but sadly I can't.
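For what it's worth, the back-of-envelope maths is easy to sanity-check yourself. A quick sketch using the rough figures from this post (50p/kWh, 30 hours/week at load, and the ~160 W capped vs 225 W uncapped draws mentioned above; all illustrative numbers, not measurements):

```python
# Back-of-envelope annual electricity cost for a GPU while gaming.
# Figures are the rough ones from this post, not benchmarks.
PRICE_PER_KWH = 0.50   # GBP, the 50p/kWh assumption
HOURS_PER_WEEK = 30
WEEKS_PER_YEAR = 52

def annual_cost(watts: float) -> float:
    """Yearly electricity cost in GBP for a card drawing `watts` at load."""
    kwh_per_year = watts / 1000 * HOURS_PER_WEEK * WEEKS_PER_YEAR
    return kwh_per_year * PRICE_PER_KWH

# 7900 XT frame-capped at 60 Hz (~160 W) vs a maxed-out 5700 XT (225 W)
capped, uncapped = annual_cost(160), annual_cost(225)
print(f"160 W: £{capped:.2f}/yr, 225 W: £{uncapped:.2f}/yr, "
      f"difference: £{uncapped - capped:.2f}/yr")
```

At those (admittedly heavy) 30 hours a week, a 65 W gap works out to roughly £50 a year, which is in the same ballpark as the ~£40 figure from the slide.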
I think the 7900 XT is an easy win against the competition in pure rasterization, plus the juicy VRAM; the sticking point is FSR2 not being as good as DLSS, and the giant question mark over FSR3. It's not the even playing field of the 5700 XT vs the 1080 Ti, where neither card had a tech advantage. NVidia has an ace up its sleeve in DLSS that AMD have yet to match.

Something similar could be said about ray tracing, but that's a headache to pick apart, especially with so much DLSS3 polluting the ray tracing stats and NVidia's focus on what I assumed were obsolete or soon-to-be-obsolete resolutions. Plus there's a giant gulf between how RDNA3 handles lighter ray tracing workloads (RE4, Forza Horizon 5, Shadow of the Tomb Raider) and something like Cyberpunk, which seems designed to crush everything except the 4090 (or limp through with DLSS3 on the other 4000-series cards).
Then there's the whole UE5 situation coming up.