So I'm curious about the statement that shifts from, say, 60 to 30 FPS will be done incrementally. Is there anything documented that says this? Because I find myself puzzled by that scenario.
No documentation but I am a self proclaimed genius
Seriously though, they haven't talked (and maybe won't) about how they do it, but essentially the smoothness can only come from frame smoothing. Remember that Nvidia has been talking about frame pacing, and adding more and more hardware to monitor and control it, for what, 3-4 years now. With frame pacing, AMD and Nvidia hold frames and drop frames when doing so means a smoother experience. The converse was AMD's CrossFire pushing eleventy billion frames, plenty of which weren't actually being seen, and the image was less smooth. Now we see fewer frames, drop quite a few, but we get a much smoother game out of it.
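To make the hold/drop idea concrete, here's a minimal sketch; this is my own illustration, not AMD's or Nvidia's actual pacing logic, and the smoothing factor is an arbitrary assumption. The idea is simply to hold a frame that finished too quickly until a smoothed interval has elapsed, trading a tiny bit of latency for even spacing:

```python
# Sketch of the frame-pacing idea described above: instead of presenting
# each frame the instant it finishes rendering, hold it until a smoothed
# interval has elapsed. Names and the alpha value are illustrative only.

def pace(render_done_ms, alpha=0.2):
    """Given timestamps (ms) when frames finish rendering, return paced
    presentation timestamps using an exponentially smoothed interval."""
    presented = [render_done_ms[0]]
    interval = None
    for i in range(1, len(render_done_ms)):
        raw = render_done_ms[i] - render_done_ms[i - 1]
        interval = raw if interval is None else (1 - alpha) * interval + alpha * raw
        # hold the frame until the smoothed interval has passed
        presented.append(max(render_done_ms[i], presented[-1] + interval))
    return presented

# Uneven render completions (10 ms / 30 ms alternating) come out more
# evenly spaced, at the cost of delaying the "too early" frames.
print(pace([0, 10, 40, 50, 80]))
```

Dropping a frame would just be the other side of the same trade: if a frame arrives so late that holding it would bunch it against the next one, you skip it instead.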
This comes from Nvidia patents that talk quite specifically about monitoring the rate of change of the frame rate to determine what refresh rate to use, and from a logical conclusion: if the frame rate were jumping around from 60-30-60 FPS (just about the worst-case scenario for G-Sync, and "normal" for v-sync), then without frame smoothing you would expect both to look identical, because both would be updating frames at exactly the same times (with one extra refresh in the middle that has no effect, in the case of v-sync). This is where the stutter comes from with v-sync.
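To see why v-sync behaves this way, here's a toy model (my numbers and function names, not anything from the patents) of displayed frame time on a 60 Hz panel: under v-sync a frame stays up for a whole number of refreshes, so the displayed time snaps between 16.67 ms and 33.33 ms the instant render time crosses the refresh interval, while a variable-refresh display just shows the frame when it's ready:

```python
# Illustrative sketch: v-sync rounds displayed frame time up to whole
# refresh intervals; a variable-refresh panel tracks render time directly.
import math

REFRESH_MS = 1000.0 / 60.0   # 16.67 ms per refresh at 60 Hz

def vsync_display_time(render_ms):
    """Displayed frame time under v-sync: rounded up to whole refreshes."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def gsync_display_time(render_ms):
    """Under variable refresh the panel updates when the frame is ready."""
    return render_ms

for r in (15.0, 16.0, 17.0, 18.0):
    print(f"render {r:5.1f} ms -> v-sync {vsync_display_time(r):5.2f} ms,"
          f" variable refresh {gsync_display_time(r):5.2f} ms")
```

Note the jump: a 1 ms change in render time (16 ms to 17 ms) produces a full 16.67 ms change in displayed frame time under v-sync, which is exactly the stutter being described.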
If your game is running at 60 FPS (or any number, for that matter) and then has a sudden drop to 30 FPS, what you're implying is that G-Sync will reduce the refresh rate slowly (to maintain smoothness). But if content is being rendered at 30 FPS, how does one display it at 59 Hz, and so on?
My obvious question then is, wouldn't this introduce input lag which is the very thing G-Sync aims to eliminate?
Every company likes to make bold claims, and most companies don't like explaining things in super detail to average users. But claiming it eliminates latency is a bit daft; it makes latency MUCH smaller than triple-buffered v-sync, for sure. There will be some latency, but less than basically most/all other methods, which makes it a non-issue. G-Sync really isn't there to eliminate latency, to be honest; it's 98% about the smoothness in situations where smoothness is usually an issue.
Without being good at making pretty graphs/tables it's really hard to describe, but essentially: dropping frames.
Realistically, the demo they showed used a very slow frame rate change, but the reality is that the 16.67ms frame time change that induces stutter with v-sync is noticeable, while the probably 0.2-0.5ms frame time changes in the pendulum demo weren't. There will be a sweet spot in there that allows for a faster frame rate change with 99% of the smoothness. I.e. 0.2ms is awesome, 2ms only a fraction worse, 4ms noticeably worse but still decent, 8ms is pretty meh, 10ms is crappy and 16.67ms is woeful.
So if the frame rate went from 60 to 30 FPS, I'd expect a quicker change than in the demo, but around some sweet spot where you still feel 95% of the smoothness; the quicker change reduces the number of dropped/delayed frames and gets to the new frame rate sooner.
I mean, 60 to 30 FPS in 0.2ms frame-time steps is roughly 84 frames, or about 2 seconds. If you could barely tell the difference at 4ms steps, then you'd be adjusted in about 5 frames, well under a third of a second.
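If you want to check the arithmetic, here's a little sketch; the step sizes are just the illustrative values from above, not anything Nvidia has published:

```python
# Count how many frames (and how long) a gradual frame-time ramp from
# 60 FPS to 30 FPS takes for a given per-frame step size.

def ramp_cost(start_fps=60.0, end_fps=30.0, step_ms=0.2):
    """Return (frame_count, total_seconds) to ramp frame time from
    1000/start_fps ms to 1000/end_fps ms in step_ms increments."""
    ft = 1000.0 / start_fps          # current frame time, ms
    target = 1000.0 / end_fps
    frames, elapsed_ms = 0, 0.0
    while ft < target:
        ft = min(ft + step_ms, target)
        frames += 1
        elapsed_ms += ft
    return frames, elapsed_ms / 1000.0

for step in (0.2, 2.0, 4.0):
    n, secs = ramp_cost(step_ms=step)
    print(f"{step:>4} ms steps: {n:3d} frames, {secs:.2f} s")
```

The total frame-time gap to cover is 16.67 ms, so 0.2 ms steps need ~84 frames (about 2 seconds at the intermediate frame rates), while 4 ms steps get there in 5 frames.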
If it does indeed work this way then I am puzzled. G-Sync does not have to smooth out the frame rate transition for it to be smooth (which is basically no input lag and no tearing/stutter); couldn't they simply drop to 30 FPS and display at 30 Hz?
As above, people really need to understand where the stutter from v-sync comes from. The stutter they're eliminating is the change in frame times; it's what they've been fighting with frame pacing for years, and it is THE key to smoothness. V-sync already drops to 30 FPS instantly; that is the fundamental problem with it.
Oh, and I know they showed the pendulum demo with FPS dropping unrealistically slowly, but it was my impression that scenario was there to demo the tear-free animation and nothing more. (In fact, if I remember correctly, that's exactly what they were pointing out as the FPS began to fall, but feel free to correct me if otherwise.)
G-Sync would be tear-free, full stop, no matter when it updates. It is by design able to prevent the screen buffer from updating midway through the screen refreshing, which is how you get tearing. The slow change in frame rate didn't have any effect on tearing; a significantly faster change in frame rate would not tear under G-Sync, and would (probably) tear even worse without v-sync.
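A quick sketch of the mechanism, with illustrative numbers of my own: on a fixed-refresh panel the scanout sweeps top to bottom over one refresh interval, and a buffer swap mid-sweep means the rows below the current scan position come from the new frame; the visible seam is the tear line. Variable refresh avoids this by only swapping between sweeps.

```python
# Where a tear line lands if the frame buffer is swapped mid-scanout
# on a 60 Hz, 1080-row panel (illustrative model, not driver code).

REFRESH_MS = 1000.0 / 60.0   # one top-to-bottom scanout takes 16.67 ms
ROWS = 1080

def tear_row(swap_ms):
    """Row at which a buffer swap at swap_ms into the refresh shows up."""
    t = swap_ms % REFRESH_MS          # position within the current scanout
    return int(ROWS * t / REFRESH_MS)

print(tear_row(8.33))        # swap halfway through the sweep: mid-screen tear
print(tear_row(REFRESH_MS))  # swap exactly on the boundary: no visible seam
```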
The key to the demo was the smooth way the frame rate changed.