Even if it works like you suggest, there's still quite a bit to be done just in the scenario of seamlessly moving between something that's being cloud computed and something that needs to be done locally. There's a lot of head scratching involved, and that's probably why people can feel so dismissive.
As I said a while back, the way I see it, as soon as you interact you have to pull that computation back locally, and when you talk about gaming there's a fair bit of variable interaction, in both frequency and depth, so it's vital to get that scenario tied up...
ps3ud0
True, but I guess the opportunity is to pre-render/pre-calculate a bunch of stuff and, in the case of AI, let the cloud compute worry about what everything is doing out of your sight, leaving the local hardware to worry only about what you're interacting with or close to interacting with (i.e. in line of sight). I wonder if it could be used for a PhysX-type application, to pre-calculate real-world physics effects and behaviour, for example.
Now whether I care that there are 100 virtual monsters, soldiers and other AI outside my line of sight doing stuff intelligently without the local CPU having to worry about them is another matter.
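Purely to make that concrete, here's a very hand-wavy toy sketch of the split I mean. Every name in it (Entity, tick, INTERACT_RADIUS, the radii, the tick callbacks) is made up for illustration and none of it is a real console or cloud API; the point is just that entities outside line of sight / interaction range get simulated "in the cloud", and ownership gets pulled back to the local sim the moment you can see or touch them, which is exactly the handoff ps3ud0 is worried about.

```python
# Toy sketch only (assumed names throughout, no real console/cloud API):
# entities outside the player's line of sight / interaction range are ticked
# by a notional cloud service; ownership is pulled back to the local
# simulation as soon as the player can see or interact with them.

import math
from dataclasses import dataclass, field

INTERACT_RADIUS = 20.0   # assumed: anything this close must be simulated locally
LOS_RADIUS = 60.0        # assumed: crude stand-in for a proper visibility test

@dataclass
class Entity:
    name: str
    x: float
    y: float
    owner: str = "cloud"               # "cloud" or "local"
    state: dict = field(default_factory=dict)

def distance(px, py, e):
    return math.hypot(e.x - px, e.y - py)

def needs_local(px, py, e):
    # Latency-sensitive cases: the player can interact with it or see it.
    return distance(px, py, e) <= max(INTERACT_RADIUS, LOS_RADIUS)

def tick(entities, px, py, cloud_tick, local_tick):
    """One frame: migrate ownership where needed, then simulate each side."""
    for e in entities:
        wants_local = needs_local(px, py, e)
        if wants_local and e.owner == "cloud":
            # Pull it back: the cloud's last known state seeds the local sim,
            # so the handoff looks seamless to the player.
            e.owner = "local"
        elif not wants_local and e.owner == "local":
            # Safe to push it back out once it's out of sight and out of reach.
            e.owner = "cloud"

    local_tick([e for e in entities if e.owner == "local"])
    cloud_tick([e for e in entities if e.owner == "cloud"])
```

In practice the local tick would have to run every frame, while the cloud tick could lag by a frame or more (or run asynchronously), since by definition nothing it owns is latency-sensitive; that's where the "free" compute would come from.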
It'll take some working out and I don't think it will be a short-term thing, but I do know that cloud compute for general computing is increasingly applicable and is used for everything from transcoding to massive number crunching. To write it off as "a pipe dream for gullible viewers" seems a little naïve, particularly as local in-box hardware hits its limits and developers have a few years to wrap their brains around how they can use what is, in effect, free compute power.
I love that this brings something different to console gaming beyond the same old incremental stuff that just gives us the same variations on the games we have now, but "amazing at 1080p" this time. It's like buying Star Wars on VHS, then DVD, then re-mastered, then Blu-ray, then 3D, etc... I'll bet a pound to a penny people are hard-pushed to notice much difference between the latest 360/PS3 titles and early One/PS4 titles (perhaps with the exception of being crisper at 1080p).
Cloud compute would have piqued my interest whether it was MS, Sony, Nintendo or Steam; I think underestimating it would be a mistake. It's not something that would change the mind of someone adamant they're buying a PS4 and MS are the devil, though.
Still, as I say, it could prove to be significant in the future (or be disregarded completely, dunno).