Please link to a render farm (or anything) doing full raytracing at 60 FPS, 1080p.
I'll wait
The guy already admitted the render farm analogy wasn't great, but that isn't what Turing is, so banging on about anything that isn't realtime is an equally bad reference. The preview window in Blender isn't realtime, and it's not a game. Turing allows a level of realtime raytracing previously not available in games. That's it. If people don't want to recognise that as an improvement, they are more than welcome to sit in the past.

Ultimately rasterisation is reaching its limits, and raytracing is the only way we know how to improve on it. Every minor improvement in shadows and lighting over the last 10 years has been a method for faking realism, at a large cost in resources that leads to all sorts of compromises, with most people actually turning those effects off or down to get their FPS back. Now we are talking about offloading those faked effects to separate hardware and using raytracing to model them accurately, in realtime at playable frame rates.
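To make the hybrid idea concrete, here is a toy sketch (purely illustrative, nothing to do with Turing's actual API; the scene setup and function names are my own): shade a surface point with an ordinary Lambert term, which is the part rasterisation already handles, then fire a single shadow ray toward the light to decide occlusion exactly instead of approximating it with shadow maps.

```python
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Shadow-ray test: solve |o + t*d - c|^2 = r^2 for t > 0.
    # direction is assumed normalised, so the quadratic's a term is 1.
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return False          # ray misses the occluder entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 1e-4           # small epsilon avoids self-intersection

def shade(point, normal, light_pos, occluder_center, occluder_radius):
    # "Raster" part: plain Lambert diffuse term toward the light.
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    l_dir = [v / dist for v in to_light]
    lambert = max(0.0, sum(n * l for n, l in zip(normal, l_dir)))
    # "RT" part: one ray toward the light; if blocked, the point is
    # in exact shadow rather than a shadow-map approximation.
    if ray_hits_sphere(point, l_dir, occluder_center, occluder_radius):
        return 0.0
    return lambert
```

A point directly under a blocking sphere gets a hard, geometrically exact shadow; a point off to the side just gets its Lambert term. Real hybrid renderers do the same split at scale, with the shadow/reflection rays handed off to dedicated hardware while rasterisation produces the primary visibility.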
Something that wasn't hybrid and took upwards of a second per frame for a single model isn't anywhere in the same ballpark as what they are talking about with Turing and going forward. I'm baffled as to how you think Blender on a 560 Ti is in any way comparable.
The first implementation isn't going to be render-farm quality, of course not, but focusing on hyperbole completely misses the point of where GPU graphics is heading, and you have to take that first step. If people don't want to be part of the first generation, that's fine, but claiming a 970 or 560 Ti can do the same thing as Turing, or better, misses the point entirely and isn't at all accurate.
Equally, a lot of people and even companies staked their futures on the claim that hardware T&L wasn't needed because it could be done in software...