
A Week With NVIDIA's G-SYNC Monitor

Charlie has been posting anti-Nvidia things for months - hardly surprising his site no longer does reviews of their stuff either, yet he's positively evangelical when it comes to AMD... just like a certain someone else around here, hmmm
 
http://www.google.com/patents/US8120621

It's from 2007 and I can't get far through reading it without my phone crashing, but it is an Nvidia patent for dynamically adjusting frame/refresh rates.

Feel free to pick at it as I said I can't read much of it atm :(

http://patft.uspto.gov/netacgi/nph-...a&OS=refresh+AND+nvidia&RS=refresh+AND+nvidia

and this one is from 2013... in fact, searching for Nvidia and refresh finds lots of patents. I can't be bothered to read through all of them, but this was the most appropriate one I could find in the first 10 minutes.

Lots of people are saying "you can't patent a simple idea", but this idea seems pretty simple - using a command source to tell a monitor to change its refresh rate when a buffer changes. That sounds a lot like what G-Sync does, and this patent was granted even though it references all the prior art on refresh rates at the top.

Now, I'm not a patent lawyer, so there may be a hole in this that AMD can exploit to do something similar, but on the face of it they will have to be very careful to avoid running into patent/licensing issues.
 
http://patft.uspto.gov/netacgi/nph-...h&OS=nvidia+AND+refresh&RS=nvidia+AND+refresh

Lots of people are saying "you can't patent a simple idea", but this idea seems pretty simple - using a command source to tell a monitor to change its refresh rate when a buffer changes. That sounds a lot like what G-Sync does, and this patent was granted even though it references all the prior art on refresh rates at the top.

An idea CAN be simple and be patentable. It's whether that idea would have been easily 'stumbled across' by any other person/endeavour that dictates its legitimacy as a patent.

The fact that we have been living with the exact same GPU - monitor - refresh rate relationship for decades tells me that, even though G-SYNC is simple, it's worthy of a patent, seeing as nobody else has done it in all that time.

Don't get me wrong - I am not a huge fan of patents where patents should not exist, especially in the tech world. However, with a business head on, patenting/licensing would be a savvy way to protect the idea and allow revenue generation from other GPU vendors.

Right now I think it's far too early, however. AMD/others might be interested, but until this thing is well established in the wild and the pros/cons are well known, I doubt they are looking into having hardware that can support it.
 
Serious question time again...

If it's as simple as just an Nvidia-branded controller chip that anyone could make should they want to, why will it only work with Kepler and above cards? Driver or architecture restriction?

there is a lot of collaboration between the GPU and the G-Sync module. It took us, you know, a while to get this to work perfectly and I don’t expect any other graphics card would do that naturally anytime soon. There is hardware inside Kepler and in future GPUs that makes this happen elegantly.

http://www.dsogaming.com/news/nvidi...-g-sync-to-work-with-other-gpus-anytime-soon/
 
Just pointing out again - DM says you can't patent the idea of a GPU dynamically updating and telling a monitor what the refresh rate should be;

USPTO says otherwise

Actually, that doesn't say otherwise - it agrees with what I said, in so much as the patent you are linking to is NOT about dynamically changing refresh rate in particular. It's about a system and - I don't know how to put it, because it's such a vague son of a ***** - the reasons/circumstances in which it would want to drop the refresh rate. It also mostly references, in fact pretty much exclusively mentions, reducing the refresh rate and skipping frames in suitable situations for the purpose of SAVING POWER. It's most likely a patent associated with their mobile devices, saving as much power as possible by slowing the display (which is pretty much the main power draw in mobile devices).

It also basically references the patent PGI linked to - talking about what I believe is basically the frame metering hardware - as ONE of the methods by which they might determine a lower refresh rate was warranted.

The other patent that PGI highlighted is also NOT about dynamic refresh rates in particular; it's about how they determine an appropriate refresh rate, to some degree. I'd be pretty much certain this patent is basically saying how they use frame metering and/or the on-die hardware to determine the best refresh rate at any given time. It's still not certainly about G-Sync and could well be focused on mobile devices - determining when the user wouldn't notice a refresh drop, though knowing when to raise it back up is just as important to that. It is a pretty interesting (less vague and much more involved) patent as well.



This patent

http://www.google.com/patents/US8120621

It's seemingly about determining how much change there is between frames and using that as a way to pick the best refresh rate; it would also result in dropping frames if the rate of change is too fast (this isn't a terrible thing in this case and is fairly fundamental to it working).

I specifically mentioned after the pendulum demo that they showed an exceptionally uniform rate of change of framerate, and knew this would obviously look MUCH worse if the framerate were jumping around all over the place. I.e. look back at the demo and think in frame times: 60-60-59-58-57fps means frame times of 16.67-16.67-16.95-17.24-17.54ms. While the frame time is changing, how close consecutive frame times are is where the smoothness comes from.

If you didn't smooth out the frame rate then you would have jumps from 60 to 30 to 50 to 30 to 75fps, and that means frame times of roughly 16.7-33.3-20-33.3-13.3ms. That is where you wouldn't have smoothness.
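For reference, those frame-time figures are just 1000 divided by the fps. A quick sketch of my own (nothing to do with the patent itself) to reproduce the numbers:

```python
# Frame time in ms is just 1000 / fps; these are the figures quoted above.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

smooth = [60, 60, 59, 58, 57]   # pendulum-demo style: gentle ramp in framerate
jumpy = [60, 30, 50, 30, 75]    # unsmoothed: framerate swinging all over the place

print([round(frame_time_ms(f), 2) for f in smooth])  # [16.67, 16.67, 16.95, 17.24, 17.54]
print([round(frame_time_ms(f), 2) for f in jumpy])   # [16.67, 33.33, 20.0, 33.33, 13.33]
```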

Either way, the algorithm or hardware it's describing determines how much is changing in each frame, and by how much that change is itself changing. I.e. in one frame 5% of the image has changed, and in the next frame 5% of the image changed again - the rate of change (5%) remains steady. If the rate of change is increasing significantly it will increase the refresh rate; if it's changing too slowly, it will lower the refresh rate to match. There would, I assume, be an optimum amount of change per refresh - I don't know what that is - but for instance if more than 50% of the frame is changing, because the GPU is changing it faster, it will raise the refresh rate so the difference per refresh is lower; if there is very little change, say below 15%, it will reduce the refresh rate until it hits that optimum number, whatever it is.
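A minimal sketch of that sort of heuristic, purely as I imagine it - the 15%/50% thresholds, the step size and the limits are my made-up numbers, not anything from the patent:

```python
# Hypothetical "rate of change -> refresh rate" heuristic, for illustration only.
# This is NOT Nvidia's algorithm; the thresholds, step and limits are invented.

def fraction_changed(prev_frame, next_frame):
    """Fraction of pixels that differ between two frames (flat sequences of pixel values)."""
    changed = sum(1 for a, b in zip(prev_frame, next_frame) if a != b)
    return changed / len(prev_frame)

def adjust_refresh(current_hz, change, low=0.15, high=0.50,
                   step=5.0, min_hz=30.0, max_hz=144.0):
    """Raise the refresh rate when a lot changes per refresh, lower it when little does."""
    if change > high:
        return min(current_hz + step, max_hz)
    if change < low:
        return max(current_hz - step, min_hz)
    return current_hz

# e.g. 60% of pixels changed since the last refresh -> nudge 60Hz up to 65Hz
print(adjust_refresh(60.0, 0.60))  # 65.0
```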

So think about it like this: you have one frame that takes 16.67ms to produce (effectively 60fps), and the next frame takes 32ms. Rather than wait and not display a frame for 32ms, it will likely (I'm guessing here, but this is the basic idea of the patent) refresh at say 18ms; it will then have that next frame ready a further 14ms later (the rest of the 32ms), but it knows the game has slowed down (from 60 to 30fps), so rather than show that frame 14ms later (which would be a large jump from 18ms) it will probably go with 20ms again, thinking it might stay at 30fps for a while and wanting to get to 32ms gaps smoothly... there is your smoothness. Then say the next frame jumps back up to 60fps, i.e. it would be ready 16.67ms later - but that time is measured from when the second frame was done, NOT from the third refresh. If it displayed that right away it would only be 10ms after the last refresh, again a large jump, so it will delay that frame as well; but as it senses the frame rate has increased (from the actual time taken to draw the frame) it will realise it wants to move smoothly from 18ms back to 16.67ms, so it will likely display it at say 18ms again, and so on and so on.


Etc.

It would be way way easier to show this with an image, but I'm not even a novice in paint let alone making something good :p
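In lieu of an image, here's a very rough sketch in code of the idea - entirely my own guess at the smoothing behaviour, not the actual patent or G-Sync algorithm:

```python
# My guess at the smoothing described above: instead of jumping straight to each new frame
# time, the refresh interval eases towards it a couple of ms per refresh. Not Nvidia's code.

def smooth_intervals(frame_times_ms, max_step=2.0):
    """Given actual frame production times (ms), return the refresh intervals actually shown."""
    shown = []
    interval = frame_times_ms[0]
    for target in frame_times_ms:
        # Move at most max_step ms per refresh towards the real frame time.
        if target > interval:
            interval = min(interval + max_step, target)
        else:
            interval = max(interval - max_step, target)
        shown.append(round(interval, 2))
    return shown

# Frame times swing 60fps -> 30fps -> 60fps, but the displayed interval walks
# 16.67 -> 18.67 -> 20.67 -> 18.67 -> 16.67 rather than jumping between 16.67 and 33.33.
print(smooth_intervals([16.67, 33.33, 33.33, 16.67, 16.67]))
```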


Anyway, to sum up: that patent is about using, seemingly, frame metering to determine the best refresh rate. As above, it could easily be used for G-Sync and "very" variable framerates, but it could just as easily be used to check whether very little (if anything) on the screen is changing, drop the refresh rate to save power, and keep an eye on things to bring the refresh rate back up.

Ultimately Nvidia know (as I pointed out with the first demo at the time) that the variable refresh rate is fairly key, but making it smooth is pretty important to that. This is how I'd be implementing it, so I wouldn't be surprised if this patent makes up part of G-Sync and draws their frame metering hardware together for it. Considering I pointed out this would be required - and that the fast-changing variable frame rate would itself cause problems - minutes after seeing the pendulum demo, let's just say it's not a very complex or difficult idea. Nvidia can patent their silicon version of it, and maybe their algorithm, but it's another patent that isn't in any way about how it actually changes the refresh rate; it's about determining how and when they would want to, and what the best refresh rate would be.

I'm 100% sure none of the patents linked can prevent AMD from doing free-sync (we'll call it that till we know the final name, because it's awesome, okay). Nvidia implemented frame metering hardware on die and did frame pacing as a result... and have seemingly patented that... and AMD have had no problem implementing their own frame pacing. Nvidia do not and will not gain the ability to patent and prevent monitors from varying their refresh rate - they already can; you can flick between 60/120/anything else to your heart's content, they just never bothered putting in "all" the numbers and making it easy to change on the fly.


EDIT:- I couldn't remember what it was called, but I wouldn't be surprised if the first patent has something to do with Prism - changing backlight intensity, and changing the rate at which the backlight pulses (essentially the refresh rate). It specifically talks about persistence as well, which is a combination of backlight intensity and refresh rate, and it mentions battery life and power saving as the only reasons for dynamic refresh rate change.

Both patents talk specifically about refresh rate change (the first one) and dynamic refresh rate change (the second), not variable refresh rates - which is what a patent specifically describing a screen/GPU combination offering synced refresh would almost certainly mention. You can, and have been able to, dynamically change between whatever default modes a screen has forever; neither is a new concept. Neither patent mentions with any specificity a new idea to use more or different refresh rates, gaming in particular, or G-Sync at all.
 

Yup, I'm almost certain they are using the on-die frame metering for their G-Sync solution. However, they've used that for frame pacing for a while, and AMD have massively improved frame pacing (they are somewhat tackling one situation at a time, but it's monumentally improved).

Nvidia has specific hardware on die; AMD might by now, I don't know. But it's relatively simple stuff to do via drivers anyway, as shown by AMD seemingly doing it through drivers in a very effective way.

As I also mentioned, this was something I pointed out minutes after the pendulum demo - it's fairly obvious - and I think the most important part of the quote you made is that it's the on-die bit they have that makes it an "elegant" solution.

A variable frame rate jumping from 60 to 30 to 60 wouldn't be that smooth, and I would precisely describe that as an inelegant solution. Quick frame rate changes would still be smoother than v-sync at low frame rates, and a better overall solution, and it would be less inelegant at higher frame rates: flicking between 30 and 60fps means flicking between 33.3 and 16.7ms frame times, but flicking between 90 and 120fps means flicking between 11.1 and 8.3ms - roughly a five to six times smaller difference. However, if I can figure out how to make it elegant, and Nvidia's patents pretty much spell out how to make it elegant, I don't think it will take AMD particularly long (nor Intel for that matter) to implement a similar solution.

Nvidia's solution is specific, dedicated on-die silicon (I presume, rather than just using shaders for compute, which would be a very inefficient way to do it), and they can absolutely patent that. But Nvidia's own patent goes as far as saying there are many, many ways to compare the image output and determine the amount of change...

No, I very much doubt anyone else will implement "g-sync", which is the combination of their specific chip used in monitors, their specific drivers and their specific on-die frame metering implementation... they can patent that solution as theirs.

But they can't stop monitor makers adding variable refresh rates, and they can't stop AMD using their own frame pacing driver software, with really fairly minor adjustments, to filter the frame rate changes. They can only prevent them using Nvidia's specific solution, not any solution - of which there would be many.
 
I've missed a lot of news on this front it seems, so it was great to read through this thread and the articles posted up here on G-Sync. All in all it looks very interesting and I shall certainly consider buying a monitor with this feature some time down the line when IPS 1440p monitors with G-Sync are released instead of jumping the gun and getting a TN monitor with poorer image quality.

However, it is likely that in around six months' time I shall get another 780 for my rig (or look at what Maxwell has to offer around then) - so the question is whether it is worth getting a G-Sync monitor for an SLI system. Surely the inherent SLI stutter will defeat the purpose of this? Or is SLI stutter much less noticeable these days with the new R331 drivers and such?

Whatever the case, there are going to be a lot of options open to us PC gamers (and for professional use too, of course) in the near future with regards to graphical solutions - with 4K monitors slowly dropping in price, G-Sync on the horizon and both AMD and nVidia flexing their muscles in the desktop GPU market, the future is looking very interesting indeed.

Within a couple of years' time I expect to see 4K, G-Sync, 144Hz monitors starting to emerge :) although I guess that's when the data transfer capabilities of our current cable technology may start to become tested...
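Rough numbers on that: uncompressed 4K at 144Hz already needs more bandwidth than DisplayPort 1.2 can carry, even before blanking overhead. A quick back-of-the-envelope check (the script is just mine; the DP 1.2 figure is its standard 17.28 Gbit/s effective data rate):

```python
# Back-of-the-envelope bandwidth for 4K @ 144Hz, 24-bit colour, ignoring blanking overhead.
width, height, bpp, hz = 3840, 2160, 24, 144
required_gbps = width * height * bpp * hz / 1e9
dp12_gbps = 17.28  # DisplayPort 1.2 effective data rate (HBR2, 4 lanes, after 8b/10b)
print(f"Needed: {required_gbps:.1f} Gbit/s vs DisplayPort 1.2: {dp12_gbps} Gbit/s")
# Needed: 28.7 Gbit/s vs DisplayPort 1.2: 17.28 Gbit/s
```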
 
Hmmm, Andy appears to have gone quiet in this thread after attempting to call me out several times with incorrect information......... or are you still looking for another patent?

Don't worry, I can explain that one to you as well if you want.
 
http://www.pcper.com/news/Displays/Reader-Results-NVIDIA-G-Sync-Upgrade-and-First-Impressions

Reader Results: NVIDIA G-Sync Upgrade and First Impressions

This was a fairly serious product mod, actually more than I thought it was going to be. Overall, the installation took more than an hour, so not exactly trivial for me. I suppose it's possible to get it done in 30 minutes if you were really focused and knew what you were doing. I put the LVDS connector on wrong the first time (connectors had to be rotated 180 degrees) so I had to retrace my steps for a bit to get it fixed after I realized it was put on incorrectly and the metal plate was on the wrong side. The manual does actually point this out in a couple steps but it was a little confusing to think of that rotation change. Also, during installation I opted to remove the somewhat useless monitor speakers (that nobody probably uses anyway). It's definitely something a PC hobbyist can do, but count on spending some time carefully removing a lot of small cables inside the monitor and doing it right. Part of my slow approach was caution at damaging any components; I've never been inside an LCD display until now.

If I had to pick a couple words to describe the G-Sync change it would be responsiveness and smoothness. As far as the VG248QE, even without the G-Sync module the display looked pretty amazing just using Adaptive V-Sync prior to this. Overall G-Sync is a somewhat subtle upgrade from adaptive, definitely a more obvious benefit to the middle ground FPS areas rather than at the extreme high end. I think some of the reaction to it is even on a sub-conscious level perhaps; something about it is just more pleasing to the eye and observing the super fast response time when moving around in a game gives a better feeling of immersion. G-Sync might actually ruin my plans of thinking about an AMD card, so hopefully NVIDIA also decides to lean towards adding the Mantle API some time in the future.

Pcper did a comp for those yanks to win a G-Sync module, and this guy has given fairly decent feedback IMO.
 
Hmmm, Andy appears to have gone quiet in this thread after attempting to call me out several times with incorrect information......... or are you still looking for another patent?

Don't worry, I can explain that one to you as well if you want.

That wasn't the patent I linked to - nice wall of text, but totally irrelevant to the info I posted.
 
It's gonna be on the Asus 27" IPS in a few months - now that will be a nice monitor indeed, even better if the price is £499-£599.

I've still not seen G-Sync in the flesh but for about £50-£100 more than the regular 27" IPS that does not seem bad, as £550ish is acceptable when it's a top of the range IPS 2560x1440 panel. :)

Going to ask Asus if they can get us an early sample to borrow, not gonna hold my breath though.

Early pricing on the BenQ models looks like £350 for the 24" 1920x1080 TN and £400ish on the 27" 1920x1080 TN. March/April time on BenQ, roughly.
 
It's gonna be on the Asus 27" IPS in a few months - now that will be a nice monitor indeed, even better if the price is £499-£599.

I've still not seen G-Sync in the flesh but for about £50-£100 more than the regular 27" IPS that does not seem bad, as £550ish is acceptable when it's a top of the range IPS 2560x1440 panel. :)

Going to ask Asus if they can get us an early sample to borrow, not gonna hold my breath though.

Early pricing on the BenQ models looks like £350 for the 24" 1920x1080 TN and £400ish on the 27" 1920x1080 TN. March/April time on BenQ, roughly.

I doubt anyone with some common sense will look at a TN panel. For me personally, I want a 2560x1440 monitor that doesn't tear or have input lag like my Dell U2713HM does. If G-Sync can do that, I am sold.
 