Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync

It was said from the off that each G-Sync module needs tuning to the monitor it is going in. It isn't a standard plug-and-play module. I don't profess to be proficient in monitor knowledge at all, but I wouldn't make such silly statements like Humbug :o
 
AMD made a massive deal about "possible ranges for freesync monitors" going as low as 9fps. They generated interest in freesync supporting wider ranges than gsync. Nvidia pointed out that gsync simply supports whatever panels are capable of and everyone laughed, saying that freesync monitors would be coming and they would support wider ranges than gsync.

Now that freesync is here and every freesync monitor actually has a narrower band than the equivalent gsync monitor, it isn't at all surprising to see people asking why gsync monitors can go down to 30Hz while the equivalent freesync monitors can only do 40Hz.
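To make the "narrower band" point concrete, here is a minimal sketch (the 30Hz and 40Hz minimums are just the illustrative figures from above, not specs of any particular monitor):

```python
def in_vrr_window(fps, vrr_min, vrr_max):
    """True if the panel can match this frame rate 1:1 with variable refresh."""
    return vrr_min <= fps <= vrr_max

GSYNC_WINDOW = (30, 144)     # e.g. a 30-144Hz G-Sync monitor
FREESYNC_WINDOW = (40, 144)  # e.g. an equivalent 40-144Hz FreeSync monitor

for fps in (25, 35, 60):
    print(fps, in_vrr_window(fps, *GSYNC_WINDOW), in_vrr_window(fps, *FREESYNC_WINDOW))
# 35fps sits inside the G-Sync window but below the FreeSync minimum, so the
# FreeSync monitor falls back to fixed refresh (tearing or v-sync judder).
```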

AMD put it out there that freesync would be better than gsync and are now reaping what they have sown.

This and the ghosting issue show the advantage of having a dedicated module that can be tuned to the panel, instead of just hacking variable refresh onto a generic controller designed for fixed refresh.

It can be overcome, but that will need a completely new ASIC design that incorporates controls for variable refresh rates.

No one is blaming AMD or attacking them, but there are currently advantages to nvidia's variable refresh implementation that the current adaptive sync monitors lack.

100% this.

In the many Freesync threads, the really low ranges were brought up repeatedly as something that would be epic, and it was even claimed that G-Sync couldn't go that low (which was debunked but ignored). 9Hz was quoted again and again, with no common sense applied to the actual logic of gaming at 9Hz.

Now that Freesync has had its big, late release, all the hype Huddy was bigging up just lets people down. This just makes nVidia's G-Sync price look even better, as I would rather pay more for something that makes my experience better than put up with a horrible ghosting experience or some pathetic 27Hz window in which the tech actually works.
 
Sorry, too much happening today and mind not on it :D

Thanks Triss, and that was my point. 48Hz to 75Hz is where Freesync works on that monitor, and that is a very small window. I applaud the tech, and it is something I would detest being without now, but AMD seem too keen to pronounce this and that, rubbish this and that from nVidia, and then release a driver "after" the monitor has been released and blame everyone except themselves. AMD seem too quick to do that; everything they do seems to be everyone else's fault but their own.

Man up ffs. If you let people down, don't keep pointing the finger.
 
Well this is becoming a very interesting thread indeed. I've read through the entirety of it with a great amount of interest. I would like to add a few additional thoughts of my own on the whole 'ghosting' thing.

The advantage that G-SYNC has is that the module itself replaces the 'scaler' and other assistive electronics that accompany the panel in a monitor. This module is responsible for the pixel overdrive algorithm, whereas on FreeSync models it's up to the manufacturer's own electronics to handle that. Nvidia is able to tune the pixel overdrive intensity 'on the fly' as the refresh rate changes, whereas the manufacturers' solutions have only really been tuned with specific fixed refresh rates in mind (such as 144Hz).
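As a rough illustration of what tuning overdrive 'on the fly' could look like (a hypothetical sketch only; Nvidia's actual algorithm and values are proprietary), the module might hold per-refresh-rate drive strengths and interpolate between them as the refresh rate varies:

```python
# Hypothetical overdrive lookup: refresh rate (Hz) -> drive strength.
# These numbers are invented for illustration, not taken from any module.
OVERDRIVE_TABLE = [(40, 0.2), (60, 0.4), (100, 0.7), (144, 1.0)]

def overdrive_strength(refresh_hz):
    """Linearly interpolate a drive strength for the current refresh rate."""
    if refresh_hz <= OVERDRIVE_TABLE[0][0]:
        return OVERDRIVE_TABLE[0][1]
    if refresh_hz >= OVERDRIVE_TABLE[-1][0]:
        return OVERDRIVE_TABLE[-1][1]
    for (hz0, s0), (hz1, s1) in zip(OVERDRIVE_TABLE, OVERDRIVE_TABLE[1:]):
        if hz0 <= refresh_hz <= hz1:
            t = (refresh_hz - hz0) / (hz1 - hz0)
            return s0 + t * (s1 - s0)

# A controller tuned only for fixed refresh effectively applies one entry
# (say, the 144Hz strength) at every refresh rate, which over-drives the
# pixels at 48Hz and produces the kind of inverse ghosting described above.
print(overdrive_strength(48), overdrive_strength(144))
```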

With the FreeSync models shown in the PCPer video, you have two extremes which would be different in their pixel responsiveness regardless of FreeSync being used. The LG model uses an IPS panel - and its pixel responsiveness is slower. The BenQ uses very aggressive pixel overdrive and can be expected to suffer from a degree of inverse ghosting. I would expect some other FreeSync models (like the upcoming MG279Q) to offer a better balance with the pixel responsiveness, even if it is just tuned with a single refresh rate in mind. It may not be necessary to tune it to perfection for a huge range of refresh rates if it is already tightly tuned to offer rapid acceleration without noticeable overshoot at 144Hz.

What is crucially important in all of this is that these artificial tests (including capturing static frames with a high speed camera) are not a good representation of what the eye sees when observing motion on a monitor. It is the movement of your eyes rather than the pixel response behaviour of a modern 'sample and hold' monitor that is the most significant contributor to motion blur. When you're talking about variable refresh rates, the degree to which the eye moves changes alongside refresh rate. You could have an absolutely perfectly tuned pixel overdrive algorithm for a given refresh rate - it doesn't change the fact that there is more motion blur (a greater degree of eye movement) at lower refresh rates. By the same token, this motion blur can quite easily mask a lot of the pixel response behaviour of the monitor.
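To put a rough number on that (a simplified back-of-envelope model, assuming the eye tracks a moving object perfectly across a sample-and-hold display), the perceived smear is roughly the object's on-screen speed multiplied by the frame time:

```python
def blur_extent_px(speed_px_per_s, refresh_hz):
    """Approximate perceived smear, in pixels, on a sample-and-hold panel."""
    frame_time_s = 1.0 / refresh_hz  # each frame persists for the full refresh period
    return speed_px_per_s * frame_time_s

# An object scrolling at 960 pixels per second:
for hz in (48, 75, 144):
    print(f"{hz:3d}Hz: ~{blur_extent_px(960, hz):.0f}px of smear")
# ~20px at 48Hz vs ~7px at 144Hz: far more blur at low refresh rates,
# no matter how well the pixel overdrive is tuned.
```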

So whilst I do feel that Nvidia has an advantage in their ability to tightly control the acceleration of the monitor to reflect changing refresh rates, this isn't really as important in the real world as marketing or misrepresentation in videos would lead you to believe. :)

Nice one bud and I was hoping you could clear some things up :)
 
Great watch Gerard, and it clears up a lot. It is clear from watching it that G-Sync has a real advantage in storing the last frame in the memory buffer for when the frame rate drops below the monitor's minimum refresh rate, hence no stuttering or tearing. When a Freesync panel drops below its minimum, it is basically on its own: the monitor keeps refreshing at, say, 35Hz while only 30fps is being delivered, and this in turn incurs judder and tearing.
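For anyone wondering what 'storing the last frame in the memory buffer' actually buys you, here is a minimal sketch of the frame-repetition idea (the threshold and multiplier logic are illustrative guesses, not Nvidia's firmware):

```python
# Illustrative panel limits for a 30-144Hz G-Sync monitor.
PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 144

def effective_refresh(fps):
    """Repeat each buffered frame enough times to stay inside the VRR window."""
    if fps >= PANEL_MIN_HZ:
        return fps, 1                  # 1:1 frame-to-refresh, no repetition needed
    repeats = 2
    while fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return fps * repeats, repeats      # e.g. 25fps -> 50Hz, each frame shown twice

print(effective_refresh(25))           # (50, 2): still smooth, no tearing
# Without this trick, a panel below its minimum falls back to a fixed
# refresh rate that no longer matches the frames being delivered, and
# the mismatch shows up as judder or tearing.
```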

I love science :D
 
I am looking forward to seeing PCM2's results/review, and I trust his judgement over many others'. I enjoyed the way PCPer did that article as well; it really shows the difference between G-Sync and Freesync and how each deals with low frame rates (and let's be honest, we have all been there at one time or another).
 

Right, so this is a very raw and very rubbish recording, but it shows what happens when G-Sync is running at around 25fps... I had to run 5K with DSR to get under 30fps, and to me this is completely playable. Very smooth, and it easily feels like 50+ fps.
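For context on why 5K via DSR pushes the frame rate under 30 (quick back-of-envelope maths, assuming a 2560x1440 native panel, which the post doesn't actually state):

```python
native = (2560, 1440)
dsr_factor = 4.0  # DSR factors scale total pixel count; 4x doubles each axis
scale = dsr_factor ** 0.5
rendered = (int(native[0] * scale), int(native[1] * scale))
print(rendered)   # (5120, 2880): the GPU renders 4x the pixels before
                  # downsampling, hence the frame rate dropping below 30fps.
```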

I had my Mrs hold my phone while I just flew about a bit :D

DM has no idea what he is talking about!
 
I implore anyone who has a G-Sync monitor to max out the DSR and max out the settings and see what you think. I am blown away, if I am honest, and I never assumed it would look and feel that smooth.
 