I don't for a minute think that fully ray-traced games are going to happen on this generation of hardware, but I have a hard time taking seriously the dismissiveness people have towards it. The implementation Nvidia is doing is basically going to be the way it's done underneath, and as an early form of it, the implementation in Quake 2 is very, very impressive once you look beyond the geometry limits inherent to the Quake 2 engine (which don't meaningfully benefit RTX performance). Static screenshots don't really do it justice; seen in real time, the little lighting details as light interacts with the scene in a way traditional rendering just can't match take it to another level.
Though there are some approximations, cheats and optimisations in the Quake 2 RTX implementation, it is a fully featured ray tracer that uses path tracing for all global illumination. There are no traditional rasterised lighting techniques at all: every light, reflection, refraction, shadow, etc. is done through ray tracing. This isn't just slapping some ray-traced reflections into an old engine and calling it job done.
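To make "path tracing for all GI" concrete, here's a minimal sketch of the idea: every bit of light reaching the camera is estimated by bouncing rays around the scene rather than by any rasterised trick. This is a toy, not Quake 2 RTX's actual code; the scene (a diffuse floor under a uniform sky), the constants, and all function names are my own illustrative assumptions.

```python
import math
import random

SKY = 1.0          # uniform sky radiance (hypothetical scene value)
ALBEDO = 0.5       # diffuse reflectance of the floor (assumed)
MAX_DEPTH = 4      # bounce limit, like a real path tracer's depth cutoff

def hit_floor(origin, direction):
    """Intersect a ray with the plane y = 0; return the hit point or None."""
    oy, dy = origin[1], direction[1]
    if dy >= -1e-9:                 # parallel to or heading away from the floor
        return None
    t = -oy / dy
    return (origin[0] + t * direction[0], 0.0, origin[2] + t * direction[2])

def cosine_sample_hemisphere():
    """Random cosine-weighted direction about the +y normal."""
    r1, r2 = random.random(), random.random()
    r, phi = math.sqrt(r1), 2.0 * math.pi * r2
    return (r * math.cos(phi), math.sqrt(max(0.0, 1.0 - r1)), r * math.sin(phi))

def radiance(origin, direction, depth=0):
    """One Monte Carlo path: all lighting comes from traced rays, nothing else."""
    if depth >= MAX_DEPTH:
        return 0.0
    hit = hit_floor(origin, direction)
    if hit is None:
        return SKY                  # ray escaped: the sky is the only emitter
    bounce = cosine_sample_hemisphere()   # importance-sample the diffuse BRDF
    # Cosine-weighted sampling cancels the cos/pi terms, leaving just albedo.
    lifted = (hit[0], 1e-4, hit[2])       # offset to avoid self-intersection
    return ALBEDO * radiance(lifted, bounce, depth + 1)

def pixel(samples=1000):
    """Average many paths through one pixel; more samples = less noise."""
    cam, d = (0.0, 1.0, 0.0), (0.0, -0.70710678, 0.70710678)
    return sum(radiance(cam, d) for _ in range(samples)) / samples
```

In this toy scene the camera ray hits the floor once, the bounce escapes to the sky, and the estimate converges to albedo times sky radiance; a real path tracer does the same averaging per pixel, just with far more geometry, materials, and denoising on top.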
People are dismissive for one of several reasons: the price of RTX cards, the performance of RTX cards, lacklustre implementations in new games, or all three combined.
There were loads of people saying they couldn't see the difference 4K made back in 2014 when I got my first 4K monitor, and only in the past year or two have people changed their tune as better hardware capable of running it has come out. Once hardware can drive 4K at high FPS you'll start seeing more and more people upgrade their monitors to 4K, and they'll magically see the difference then.
The difference for me is that I've always been happy to sacrifice FPS for image quality, and playing under 60fps hasn't bothered me much in most games; only a few that I play, like fighting or racing games, actually need it. I don't play online much either. That meant I could live with the low FPS all the way back then, enjoy the image quality boost, and not need AA methods that are like Vaseline being smeared all over the screen.
As I recall the only other early adopter here of 4K was Kaapstad
To me the sweet spot in most games is lowering settings one notch from Ultra to High and turning off AA, motion blur, depth of field, etc. That gets you much better image quality than 1440p could ever provide on Ultra settings. In my experience only a handful of games show an easily visible difference between High and Ultra while actually playing; usually you need to compare screenshots to spot the minor differences. It's not true of every game, though: in The Outer Worlds I did see a clear difference when I recently played it, so I left the settings on Ultra for my playthrough.
Went on a bit of a tangent there.
My stance on RT is that I really like it. I'm just happy to wait for the 3000-series cards, when we'll have better hardware and more than a handful of games that implement it well. If Nvidia had a lot more RT games out there, specifically ones I wanted to play (none so far), then I might have caved.