Just pointing out again: DM says you can't patent the idea of a GPU dynamically updating and telling a monitor what the refresh rate should be;
USPTO says otherwise
Actually, that doesn't say otherwise; it agrees with what I said, insofar as the patent you are linking to is NOT about dynamically changing refresh rate in particular. It's about a system and, I don't know how to put it because it's such a vague son of a *****, the reasons/circumstances in which it would want to drop the refresh rate. It also mostly references, in fact pretty much exclusively mentions, reducing the refresh rate and skipping frames in suitable situations for the purpose of SAVING POWER. It's most likely a patent associated with their mobile devices, saving as much power as possible by slowing the display (which is pretty much the main power draw in a mobile device).
It also references the patent PGI linked to, which covers what I believe is basically the frame metering hardware, as ONE of the methods by which they might determine that a lower refresh rate was warranted.
The other patent that PGI highlighted is also NOT about dynamic refresh rates in particular; it's about how they determine an appropriate refresh rate. I'd be pretty much certain this patent is basically saying how they use frame metering and/or the on-die hardware to determine the best refresh rate at any given time. It's still not certainly about g-sync, and could well be focused on mobile devices and determining when the user wouldn't notice a refresh drop; important to that is knowing when to raise it back up. It is a pretty interesting (less vague and much more involved) patent as well.
This patent:
http://www.google.com/patents/US8120621
It seemingly determines how much change there is between frames and uses that to pick the best refresh rate. It would also result in dropping frames if the rate of change is too fast (which isn't a terrible thing in this case, and is fairly fundamental to how it works).
I specifically mentioned about the pendulum demo that they showed an exceptionally uniform rate of change in framerate, and that it would obviously look MUCH worse if the framerate were jumping around all over the place. i.e. if you look back at the demo, think in frame times: 60-60-59-58-57fps means frame times of 16.67-16.67-16.95-17.24-17.54ms. While the frame time is changing, how close consecutive frame times are to each other is where the smoothness comes from.
If you didn't smooth out the framerate, you would have jumps from 60 to 30 to 50 to 30 to 75fps, which means frame times of, roughly speaking, 16.7-33.3-20-33.3-13.3ms. That is where you wouldn't have smoothness.
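Just to make the frame-time arithmetic concrete, here's a trivial Python sketch (nothing from any patent, just the numbers from the two paragraphs above):

# Frame time in milliseconds for a given framerate: 1000 / fps.
def frame_time_ms(fps):
    return 1000.0 / fps

smooth = [60, 60, 59, 58, 57]  # pendulum-demo style gentle ramp
jumpy = [60, 30, 50, 30, 75]   # unsmoothed, jumping all over the place

for label, rates in (("smooth", smooth), ("jumpy", jumpy)):
    times = [frame_time_ms(f) for f in rates]
    jumps = [abs(b - a) for a, b in zip(times, times[1:])]
    print(label, ["%.2f" % t for t in times], "biggest jump: %.2fms" % max(jumps))

The smooth sequence never moves more than about 0.3ms between refreshes; the jumpy one moves by up to 20ms, and that's the difference you'd see.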
Either way, the algorithm or hardware it's describing determines how much is changing in each frame, and by how much that change is itself changing. i.e. in one frame 5% of the image has changed, in the next frame 5% of the image has changed again; the rate of change (5%) remains steady. If the rate of change is increasing significantly, it will increase the refresh rate; if the image is changing too slowly, it will slow the refresh rate to match. There would be, I assume, an optimum amount of change per refresh. I don't know what that is, but for instance if more than 50% of the frame is changing, because the GPU is changing it faster, it will raise the refresh rate so the difference per refresh is lower; if there is very little change, say below 15%, it will decrease the refresh rate until it has that optimum number, whatever it is.
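As a rough sketch of that idea, something like this (pure guesswork: the 50%/15% thresholds are just the illustrative numbers above, and the rate limits and step size are made up, none of it from the patent):

# Hypothetical change-rate-driven refresh selection; the thresholds are the
# illustrative 50%/15% figures above, not anything from the patent.
HIGH_CHANGE = 0.50   # too much of the frame changing per refresh -> speed up
LOW_CHANGE = 0.15    # very little changing per refresh -> slow down, save power
MIN_HZ, MAX_HZ, STEP_HZ = 30, 144, 5  # made-up limits and step

def fraction_changed(prev_pixels, pixels):
    # Fraction of pixels that differ between two frames (flat pixel lists).
    changed = sum(1 for a, b in zip(prev_pixels, pixels) if a != b)
    return changed / len(pixels)

def next_refresh_hz(current_hz, change):
    if change > HIGH_CHANGE:
        return min(MAX_HZ, current_hz + STEP_HZ)  # fast-moving content: refresh more often
    if change < LOW_CHANGE:
        return max(MIN_HZ, current_hz - STEP_HZ)  # near-static content: drop the rate
    return current_hz                             # in the sweet spot: hold steady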
So think about it like this. You have one frame that takes 16.67ms to produce, effectively 60fps, and the next frame takes 32ms. Rather than wait and not display a frame for 32ms, it will likely (I'm guessing here, but this is the basic idea of the patent) refresh at, say, 18ms. It will then have that next frame ready a further 14ms later (the rest of the 32ms), but it knows the game has slowed down (from 60 to 30fps), so rather than show that frame 14ms after the last refresh (a large jump from an 18ms gap), it will probably go with 20ms, thinking the game might stay at 30fps for a while and it wants to get to 32ms gaps smoothly... there is your smoothness. Then say the next frame jumps back up to 60fps, i.e. it would be ready 16.67ms after the second frame was done, NOT 16.67ms after the third refresh. If it displayed that frame right away, it would be only 10ms after the last refresh, again a large jump, so it will delay that frame as well; but since it senses the framerate has increased (from the actual time taken to draw the frame), it realises it wants to move smoothly back toward 16.67ms, so it will likely display it at, say, 18ms again, and so on and so on.
Etc.
It would be way, way easier to show this with an image, but I'm not even a novice in Paint, let alone at making something good.
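So instead, here's a rough Python sketch of the smoothing I'm describing (again, pure guesswork on my part: the 2ms-per-refresh ramp is a made-up number, not from the patent):

# Hypothetical refresh-interval smoothing: instead of jumping straight to the
# new frame time, ramp the refresh interval toward it by at most RAMP_MS per refresh.
RAMP_MS = 2.0  # made-up maximum change in refresh interval per refresh

def next_interval(current_ms, target_ms, ramp=RAMP_MS):
    # Move the refresh interval toward the latest frame time, gradually.
    if target_ms > current_ms:
        return min(target_ms, current_ms + ramp)
    return max(target_ms, current_ms - ramp)

# Frame times from the GPU: 60fps, a couple of 30fps frames, back to 60fps.
frame_times = [16.67, 32.0, 32.0, 16.67, 16.67]

interval = 16.67
for t in frame_times:
    interval = next_interval(interval, t)
    print("frame took %.2fms -> refresh after %.2fms" % (t, interval))

With those numbers the refresh gaps go 16.67 -> 18.67 -> 20.67 -> 18.67 -> 16.67ms, which is basically the 18ms/20ms ramp from the walkthrough above.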
Anyway, to sum up: that patent is about using, seemingly, frame metering to determine the best refresh rate. As above, it could easily be used for g-sync and "very" variable framerates, but it could just as easily be used to check whether very little, if anything, on the screen is changing, to drop the refresh rate to save power, and to keep an eye on things so it can bring the refresh rate back up.
Ultimately Nvidia know (as I pointed out with the first demo at the time) that the variable refresh rate is fairly key, but making it smooth is pretty important to that. This is how I'd be implementing it, so I wouldn't be surprised if this patent makes up part of g-sync and draws their frame metering hardware into it. Considering I pointed out this would be required, and that a fast-changing variable framerate would itself cause problems, minutes after seeing the pendulum demo, let's just say it's not a very complex or difficult idea. Nvidia can patent their silicon version of it, and their algorithm (maybe), but it's another patent that isn't in any way about how the refresh rate actually gets changed; it's about determining how and when they would want to change it, and what the best refresh rate would be.
I'm 100% sure none of the patents linked can prevent AMD from doing free-sync (we'll call it that till we know the final name, because it's awesome, okay). Nvidia implemented frame metering hardware on die and did frame pacing as a result... and has seemingly patented that... and AMD has had no problem implementing their own frame pacing. Nvidia do not and will not gain the ability to patent and prevent monitors from varying their refresh rate. Monitors already can; you can flick between 60/120/anything else to your heart's content, they just never bothered putting in "all" the numbers and making it easier to change on the fly.
EDIT: I couldn't remember what it was called, but I wouldn't be surprised if the first patent has something to do with Prism: changing backlight intensity, and changing the rate at which the backlight pulses (essentially the refresh rate). It specifically talks about persistence as well, which is a combination of backlight intensity and refresh rate, and it specifically mentions battery life, with power saving as the only reason given for dynamic refresh rate change.
Both patents talk specifically about refresh rate change (the first) and dynamic refresh rate change (the second), not variable refresh rates (which is what a patent specifically describing a screen/GPU combination offering synced refresh would almost certainly mention). You can dynamically change between whatever default modes the screen has, and have been able to forever; neither is a new concept. Neither patent mentions with any specificity a new idea to use more or different refresh rates, gaming in particular, or g-sync at all.