It would seem it's bold of you to assume some people don't play pixels (illuminated by rays) instead of actual games.
A lot of my mates who are less interested in the tech side of things play these games, so I don't have a choice. But they are fun!
It's worse than that - Hollywood in many cases has stagnated with CGI too, on its huge render farms, with a lot of films looking just... bad, often because of cost cutting. And yet people seem to imagine games can get better than that - they simply can't. They're games, not movies - designed to be played interactively, not watched like a film at 24 FPS. Something has to give.
There is much to be said for decent practical effects. Blade Runner 2049 looked amazing on IMAX screens and used a lot of miniatures from Weta Workshop. But the original Blade Runner still holds up.
Nvidia's CEO likes to claim it's not them, it's physics: they can't make GPUs cheaper or faster anymore - only AI counts now.
Yes, he's pushing a marketing narrative, but he's also right in a way: it has become very expensive to design and produce these GPUs, as all the low-hanging fruit has already been picked and there's no easy path forward for more performance anymore. AMD at least has a workable and interesting new design coming (not this gen, but sometime in the future), which should greatly speed up RT processing. But it's a few years away at least and likely won't be cheap, with expensive L1 cache added to each small cluster of RT cores just to achieve the required throughput of hundreds of TB/s - plain GDDR7 is orders of magnitude too slow for that, currently.
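To put rough numbers on that bandwidth gap - a back-of-the-envelope sketch, where the per-pin rate, bus width, and the 300 TB/s target are all my own assumptions for illustration, not anything AMD has published:

```python
# Back-of-the-envelope bandwidth comparison. All figures below are rough
# assumptions for illustration, not vendor specs.

gddr7_gbps_per_pin = 32        # assumed GDDR7 per-pin data rate (Gbit/s)
bus_width_bits = 256           # assumed mainstream-card memory bus width

# Card-level GDDR7 bandwidth in TB/s: pins * per-pin rate, bits -> bytes, GB -> TB
gddr7_bw_tbs = gddr7_gbps_per_pin * bus_width_bits / 8 / 1000   # ~1.0 TB/s

rt_cache_target_tbs = 300      # hypothetical "hundreds of TB/s" RT target

print(f"GDDR7 card bandwidth: ~{gddr7_bw_tbs:.1f} TB/s")
print(f"Gap vs. RT cache target: ~{rt_cache_target_tbs / gddr7_bw_tbs:.0f}x")
```

That's roughly two orders of magnitude short, which is why that kind of throughput would have to come from on-die cache rather than the memory bus.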
I am not against smarter rendering techniques TBH, and the tech is interesting - how it works out is another question. But what I do have an issue with is the repeated shrinkflation in mainstream hardware (now including consoles, it seems), which swallows up any potential improvements so they can sell less for more. For the typical enthusiast who spends huge amounts on cards, it doesn't matter as much.
What finally did it for me was when Nvidia literally rebranded the RTX 4060/RTX 4060 Ti as the RTX 4070, pulled the RTX 4080 12GB stunt, and AMD followed suit by rebranding the RX 7700 XT as the RX 7800 XT and the RX 7800 as the RX 7900 XT.
The resulting mainstream cards were frankly POS products. Yet these very cards are probably the biggest sellers for both companies.
It's not about whether they use RT or not; the point is, can they run well on mainstream xx60 cards without too much quality degradation, stutter, artefacts, a horribly blurry image, etc.? If yes, that's fine. Plenty of games have RT effects that run very well even on a 3060, because they're well optimised. But there are also plenty of badly optimised games that just don't run well on mainstream hardware, or look so bad that nobody wants to run them on it.
But even in those cases, the effects are relatively minimal, because the cards still need to render everything else. Many here want balls-to-the-wall RT/PT effects with RT permanently on. It will happen eventually, but I think we would be there already if the mainstream hardware wasn't so rubbish.
That might have happened if the last two mainstream generations had each delivered a 40% improvement, which would have compounded to roughly a 2x gain in performance, and probably 3x by the end of this year. Then, with upscaling, frame generation, etc. added on top, devs could push forward quicker.
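For what it's worth, those 2x/3x figures are just the 40% compounding - a quick sketch, where the 40% per generation is the hypothetical number from above, not a measured one:

```python
# Compounding a hypothetical 40% gain per GPU generation
# (the 40% figure is purely illustrative).

per_gen_gain = 1.40

print(f"After 2 generations: {per_gen_gain ** 2:.2f}x")  # ~1.96x -> "2x"
print(f"After 3 generations: {per_gen_gain ** 3:.2f}x")  # ~2.74x -> "3x"
```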
These companies always try to spin the cost of hardware, but strangely their margins seem to be going up as they shrinkflate more and more.
But it seems Nvidia/AMD are more worried about what their shareholders are saying, so gamers can go and do one. It's pretty much the same with some of these AAA gaming companies, but it seems gamers are wising up.