I think what's happened is that a new tier of 4K-capable graphics cards has been created (the RTX 4080 and 4090, plus the 4070 Ti in some games, and the Navi 31 GPUs). This tier is only going to be expanded in the future, with cards like the RTX 4090 Ti and possible Navi 31 + V-Cache GPUs. It looks like the 4090 can just about hold 60 FPS minimums in demanding games like Hogwarts Legacy (depending on which reviews you look at).
A 4K-capable tier wasn't really something that existed before; even the RTX 3090 and 3090 Ti struggled at 4K in demanding games. This is why both companies think customers will pay: 4K is still very much a premium/luxury option, and you pay through the nose for high-memory-bandwidth cards. Remember the high production cost of the Radeon VII and Vega 64 GPUs? AMD struggled to make money on those.
The other (relatively) new thing is RT hardware. Since it was introduced, costs have definitely increased (compare to the GTX 1000 series) - you definitely don't get it for free, and it might be a good thing for consumers if high-end cards without the RT cores were introduced (not bloody likely). Maybe avoid cards with lots of RT cores?
I also think the market for 4K cards has grown a lot in the last couple of generations; there are more people willing to buy at that level (almost entirely because of Nvidia).
So, 1440p-capable cards are where we will see the (somewhat) more affordable options being released - which was the standard for older generations like the GTX 1000 series and RDNA gen 1. That said, to say things have stayed the same over the last 5-6 years wouldn't be accurate: games have become significantly more GPU-intensive (as graphical detail has increased) at 1080p, 1440p and higher.