I can accept that. But generally, technology advancements meant a new card was either cheaper, consumed less power, or gave you something more (as above, a 5080 might be close to a 4090, but you're excluding the shortfall of 8GB of VRAM). In good generations you got all three.
This is where a true Ti card would fill the gap. It won't be the launch 5080. They will probably utilise 3GB memory modules by then and offer something more than the baseline 16GB they will get away with at launch, which is a shame if you're charging >£1000 in the first place.
Despite the panic over VRAM, I'm not actually sure that there is a lot to panic about. Would I like to see 24GB on a 5080? Sure! Do I expect massive problems on 16GB over the next 4 years? Not really.
I bought my 10GB 3080 on release about 4 years ago, and I've not noticed a problem with it since. Could it run Indiana Jones at full textures with 10GB? No... but it's a 4-year-old card, and that seems about standard for PC refresh times. Could it run the game? Sure.
I _do_ worry that if the lower-end 50 series cards release with 8GB, that will be a problem in some games. But with Battlemage released and the 8000 series coming, I expect there'll be alternatives at the 5060-ish price points, and I hope they are much better.
I can see a gradual inflation of the VRAM needed for games, probably to about 16GB in 5 years' time. But it seems unlikely that any but the most thick-headed of devs will require that just to run a game, as it would alienate the majority of gamers, who use mid-range cards.
I don't expect the 50 series to represent a great generation of cards, but I think the days of that are over until Nvidia get some proper competition at the high end. That said, if the performance I'd want in a new PC build costs, say, £2000 today and £1500 in January, at the cost of some VRAM, I'm probably OK with that, given one supplier's dominance over the market.