The real question is how G-Sync is smoothing between pacing, because people around the world are noticing a difference. You, on the other hand, were trying to depict how it's doing it within a millisecond timeframe and how it's holding it between delayed frames, but nobody knows exactly how. If they did, I'm sure certain people would implement something similar. (That's your cue to tell me how AMD could have implemented it years ago and how we don't need it, as you were saying earlier.)
You're once again clutching at straws and slowly going full circle to say you're now pro G-Sync lol.
Carry on
It's funny, because your argument before said nothing of the sort; in fact you specifically said we couldn't notice those kinds of differences.... Backing away from that and pretending you never said it, are we, because it was ridiculous nonsense?
"The real question is how G-Sync is smoothing between pacing"
What, is this supposed to be gibberish pretending to be knowledgeable? Pacing IS smoothing. How are they smoothing between the smoothing? I don't know, the real question is surely how they're smoothing between the smoothing BETWEEN the smoothing though, isn't it?
Frame pacing is a pretty well explained field, from both sides, and roughly speaking we know how both do it, because there is really only one rough way to do it... by, you know, smoothing.... the frames......
I wonder if you can be honest. If you had a bunch of frames that were, I dunno, 25ms, 45ms, 30ms, 16ms, 55ms, 25ms, 30ms.... how would you "smooth" these so you got a smooth set of frames rather than a constantly changing rate? Would you attempt to smooth the difference by keeping track of a rough average and holding frames a couple of ms here and dropping a frame there?
I would, because what else would you do, add in 10 duplicate frames at one point and drop 30 frames elsewhere? There is only one goal in frame pacing... pacing the frames. Nvidia/AMD are on record everywhere explaining the goal; how they achieve it doesn't really matter, the goal is what matters. I was explaining the goal, not the method (though due to the Nvidia patents I have a very, very good idea of Nvidia's general method, just not the exact algorithm or parameters they use).
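Just to make the rolling-average idea concrete, here's a minimal sketch in C run over the frame times from my example above. To be clear, this is NOT Nvidia's or AMD's actual algorithm; the window size and the few-ms clamp are my own assumptions. It's just the rough shape of "track an average, hold a frame a couple of ms here, release one a little early there":

```c
#include <stdio.h>

#define WINDOW 4        /* rolling-average window; the size is an assumption */
#define MAX_ADJUST_MS 3 /* clamp each correction to a couple of ms, as described above */

/* Smooth a series of raw frame times by nudging each presented frame
 * toward a rolling average. Not any vendor's real algorithm, just the
 * rough idea of frame pacing. */
int main(void) {
    double raw[] = {25, 45, 30, 16, 55, 25, 30}; /* frame times from the post, in ms */
    int n = sizeof raw / sizeof raw[0];

    double history[WINDOW];
    for (int i = 0; i < WINDOW; i++) history[i] = raw[0];

    for (int i = 0; i < n; i++) {
        /* rolling average of the last WINDOW paced frame times */
        double avg = 0;
        for (int j = 0; j < WINDOW; j++) avg += history[j];
        avg /= WINDOW;

        /* move toward the average, but only by a few ms per frame */
        double diff = avg - raw[i];
        if (diff > MAX_ADJUST_MS) diff = MAX_ADJUST_MS;
        if (diff < -MAX_ADJUST_MS) diff = -MAX_ADJUST_MS;
        double paced = raw[i] + diff;

        history[i % WINDOW] = paced; /* feed the smoothed time back into the window */
        printf("raw %5.1f ms -> paced %5.1f ms\n", raw[i], paced);
    }
    return 0;
}
```

Run on that list, it nudges the 16ms and 55ms outliers a few ms toward the average instead of duplicating or dropping whole frames, which is the whole point of pacing.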
I'm not pro G-Sync, and I'm not clutching at straws. You claimed I was debunking it and now you're claiming I'm pro G-Sync; you change your argument ENTIRELY every other post to suit yourself, and it's all been gibberish.
My stance hasn't changed since I first saw it. It's got potential to do a lot from 30-60fps, but the effect will be extremely lessened on a 120-144Hz screen (every single review agreed on this point); it will do relatively little above 60fps and almost nothing above 90fps.
It's better to have than to have not; depending on your hardware and screen you may see almost no benefit, or loads.
I don't much care about FreeSync or G-Sync in use, I just like the technology; is that very much okay with you? I have 2x 290s and a 120Hz screen, so FreeSync will, outside of the odd insanely demanding game or something that won't run CrossFire, do very, very little for me... well, nothing, considering the screen I have almost certainly doesn't support it. The technology simply interests me, is that allowed?