Hmm, low persistence. I'm not sure he said it explicitly, but would it reduce ghosting? If so I'm for it, particularly if it meant the end of ruddy overdrive features. Then again, would low persistence end up reintroducing flicker? That's basically why he's saying it's only useful at 90Hz+, and the same held for me with CRTs. My 120Hz Iiyama (whatever the hell it was) CRT was awesome, so much less tiring on the eyes than a lower refresh rate plus flicker.
I'd far prefer they move towards 120Hz and less ghosting with better screens than deciding low FPS is okay because the tearing is gone.
I think low persistence might be the first thing since a CRT that I'd want to try in person, in a shop, before knowing whether it was worth it. If it induces a flicker-style effect but reduces ghosting... I can see it taking a while to get the balance right: slight flicker plus a decent reduction in ghosting, rather than all ghosting gone but loads of flicker. It may not induce any flicker at all, but surely if they want the light on for less time between frames, it would have to effectively induce flicker?
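Just to put rough numbers on that "light on for less time" point, here's a quick back-of-envelope sketch. The 2ms strobe length is an assumed example value, not a real panel spec, but it shows why a low-persistence strobe at 60Hz leaves a long dark gap (hence flicker) while higher refresh rates shrink it:

```python
# Rough back-of-envelope numbers for a strobed ("low persistence") backlight.
# The 2 ms strobe length is an assumed example value, not a real spec.

def strobe_stats(refresh_hz, strobe_ms=2.0):
    frame_ms = 1000.0 / refresh_hz   # time between refreshes
    dark_ms = frame_ms - strobe_ms   # backlight-off gap per frame
    duty = strobe_ms / frame_ms      # fraction of time the light is on
    return frame_ms, dark_ms, duty

for hz in (60, 90, 120):
    frame_ms, dark_ms, duty = strobe_stats(hz)
    print(f"{hz:3d} Hz: frame {frame_ms:5.2f} ms, dark gap {dark_ms:5.2f} ms, "
          f"duty cycle {duty:.0%}")
```

At 60Hz the light would be off for roughly 14-15ms out of every 16.7ms frame under these assumptions, which is exactly the sort of gap your eyes notice as flicker; at 120Hz the gap is well under half that.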
G-Sync I can see taking off pretty big, with most monitors jumping on board and AMD getting the ability too; monitors can't realistically all add the feature while cutting AMD and Intel out of the market. It's smart but not groundbreaking tech, insofar as it's just an extra piece of data and having a different source tell the screen when to update. I think it will be less good than they're saying, but still pretty good. It's going to make low FPS LESS bad, but low FPS will still suck. It should never be worse than v-sync, at times significantly better, and will probably average somewhere in the middle. Though if you're at 120fps it won't make a difference.
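To make the v-sync comparison concrete, here's a toy sketch (the render times are invented example numbers, and the adaptive case is idealised) of how long a finished frame sits around before it actually hits the screen:

```python
# Toy comparison of when a finished frame actually reaches the screen:
# v-sync at a fixed 60 Hz vs an adaptive-refresh ("G-Sync style") display.
# Frame-ready times below are made-up examples (~40 fps), not measurements.
import math

REFRESH_MS = 1000.0 / 60.0  # fixed v-sync refresh interval

def vsync_display_time(ready_ms):
    # with v-sync, a finished frame waits for the next fixed refresh tick
    return math.ceil(ready_ms / REFRESH_MS) * REFRESH_MS

def adaptive_display_time(ready_ms):
    # idealised adaptive refresh: the screen updates as soon as the frame is ready
    return ready_ms

ready_times = [20.0, 45.0, 70.0, 95.0]  # frames finishing every ~25 ms

for ready in ready_times:
    wait_vsync = vsync_display_time(ready) - ready
    wait_gsync = adaptive_display_time(ready) - ready
    print(f"frame ready {ready:5.1f} ms: v-sync waits {wait_vsync:5.2f} ms, "
          f"adaptive waits {wait_gsync:.2f} ms")
```

The v-sync wait bounces between almost nothing and almost a full refresh interval (that's the judder), while the adaptive case shows every frame the moment it's done. It also shows why it stops mattering at 120fps: when frames arrive faster than the refresh interval, there's barely any wait to remove.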