True, we can do 1 ray per pixel at 4K 60fps; even a Vega 64 or GTX 1080 Ti manages that, since it's only about 0.5 gigarays/s (8.3M pixels x 60 fps). But not 200 rays per pixel, which works out to roughly 100 gigarays/s before denoising, and closer to 1000 gigarays/s for the full image once secondary rays are counted. At the moment you'd need on the order of 100 RTX 2080 Tis to reach that kind of throughput.
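A quick back-of-the-envelope sketch of that arithmetic (the resolution and the x10 secondary-ray factor are my assumptions, not anything from a vendor spec):

```python
# Ray budget estimate, assuming 3840x2160 at 60 fps.
PIXELS = 3840 * 2160   # ~8.3 million pixels
FPS = 60

def gigarays_per_second(rays_per_pixel: int, bounce_factor: int = 1) -> float:
    """Rays per second, in billions, for a given per-pixel sample count.

    bounce_factor is a rough multiplier for secondary rays (shadows,
    reflections, GI bounces); 1 means primary rays only.
    """
    return PIXELS * FPS * rays_per_pixel * bounce_factor / 1e9

print(gigarays_per_second(1))         # primary rays only: ~0.5 GR/s
print(gigarays_per_second(200))       # 200 samples/pixel: ~100 GR/s
print(gigarays_per_second(200, 10))   # plus ~10x secondary rays: ~1000 GR/s
```

So even granting a generous per-card figure, a single GPU is orders of magnitude short of a full path-traced 4K frame without denoising.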
That's a completely different metric, and not comparable to the gigarays figure Nvidia was quoting, on which the 1080 Ti manages just over 1.2 GR/s. Gigarays isn't a great metric anyway, since it doesn't really tell you anything about performance outside of a specific benchmark.
I'd honestly suggest people hold off on being too negative or positive about this until we actually see some titles that properly implement hybrid RT; some of the comments, before we've even seen real examples, are getting a bit silly.