It does seem strange to me. AMD are usually the ones pushing new stuff, while Nvidia tend to wait until a feature is actually starting to be used and then pile in with more transistors and power dedicated to it. It works for them, but if, for instance, AMD hadn't shipped a hardware tessellation unit a couple of gens early, devs wouldn't have started preliminary work on it.
Either way, it strikes me as odd that Nvidia would add these units to consumer GPUs without devs ready to use them. Also, with AMD now supplying both PlayStation and Xbox, most AAA games need to work primarily on AMD-based architectures, so a large switch towards ray tracing in major games seems unlikely — unless, of course, Navi/PS5 is aiming for ray tracing next year and Nvidia are getting ready for it too.
For professional use it makes perfect sense, but for consumers it makes none until the software is ready.
Also, the RTX off vs RTX on comparison image is hilarious: the RTX off image looks like it has the lighting settings and quality of a game made a decade ago, which makes ray tracing look like a huge improvement. In reality, this has always been the issue with ray tracing being hyped as brilliant. When ray tracing was 'the' next big thing being talked about in the 90s, it was night and day compared to the rasterisation methods used then. Compared to the best global illumination and lighting done today, it's a step forward, but maybe not worth the compute cost until/unless you find a trick to make it worthwhile. If they have, great, but the image they produced with RTX on is pretty close to what we can do already, certainly on such a simple scene.