
Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync

The latest PC Perspective podcast basically explains the differences between FreeSync and G-Sync.

G-Sync gets around tearing at low refresh rates by duplicating frames, so stuttering and tearing aren't introduced. Basically, G-Sync doesn't really have a window at the low end where it stops working.
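As a rough illustration of the frame-duplication idea (a hypothetical sketch of the behaviour described above, not Nvidia's actual algorithm; the panel limits here are made up):

```python
def effective_refresh(fps, panel_min=30, panel_max=144):
    """Sketch of G-Sync-style frame doubling below the panel's minimum refresh.

    If the source frame rate falls below panel_min, the module redisplays
    each frame an integer number of times so the panel is still driven
    inside its supported range. All numbers are illustrative.
    Returns (panel refresh in Hz, times each frame is shown).
    """
    if fps > panel_max:
        return panel_max, 1          # capped at the panel maximum
    multiplier = 1
    while fps * multiplier < panel_min:
        multiplier += 1              # show each frame one more time
    return fps * multiplier, multiplier

# e.g. a 20 fps game on a 30-144Hz panel: each frame shown twice, panel at 40Hz
```

The point of the sketch is that there is always some multiple of the frame rate inside the panel's range, which is why there's effectively no lower cut-off.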


They talk about it around the 45 minute mark.
Yeah, whatever. I'd still wait for reviews from dedicated reviewers rather than jack-of-all-trades PC hardware reviewers when it comes to speakers/Hi-Fi and monitor reviews :p

Still waiting for reviews from PCM2 and TFTCentral etc :)
 
The range should be about right though? 60 FPS at 60Hz is something both G-Sync and FreeSync screens can do?

Recording on the PC captures directly from the GPU, but the ghosting happens on the monitor, so recording the GPU output won't capture something that occurs outboard of the GPU. It just physically can't happen.

Nope, the intention is to get a 390X if they ever surface lol. This is why I asked whether this is tied into a FreeSync issue, because running the monitor as-is at 144Hz and matching frame rates, I don't get any ghosting. Just trying to give info on my config and findings.

The potential issue is that overdrive tuned for a fixed refresh rate (144Hz in your case) might not be best for a variable refresh rate, hence the ghosting. What does seem true is that some people notice it and some don't, so you might still be OK.
 
Well this is becoming a very interesting thread indeed. I've read through the entirety of it with a great amount of interest. I would like to add a few additional thoughts of my own on the whole 'ghosting' thing.

The advantage that G-SYNC has is that the module itself replaces the 'scaler' and other assistive electronics that accompany the panel in a monitor. This module is responsible for the pixel overdrive algorithm, whereas on FreeSync models it's up to the manufacturer's own electronics to handle that. Nvidia is able to tune the pixel overdrive intensity 'on the fly' as the refresh rate changes, whereas the manufacturers' solutions have only really been tuned with specific fixed refresh rates in mind (such as 144Hz).
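To illustrate the difference (a hypothetical sketch; the table values and the interpolation scheme are invented for illustration, not taken from any real monitor firmware): a fixed-refresh tune applies one overdrive strength everywhere, whereas a refresh-aware controller could adjust it as the refresh rate changes.

```python
OVERDRIVE_TABLE = {   # refresh rate (Hz) -> overdrive strength (0-100), made-up values
    40: 35,
    60: 50,
    100: 70,
    144: 85,
}

def overdrive_strength(refresh_hz):
    """Linearly interpolate an overdrive strength for the current refresh rate.

    A fixed 144Hz tune would apply strength 85 at every refresh rate,
    overdriving far too hard at, say, 48Hz; a table like this backs off
    as the variable refresh rate drops.
    """
    points = sorted(OVERDRIVE_TABLE.items())
    if refresh_hz <= points[0][0]:
        return points[0][1]
    if refresh_hz >= points[-1][0]:
        return points[-1][1]
    for (h0, s0), (h1, s1) in zip(points, points[1:]):
        if h0 <= refresh_hz <= h1:
            return s0 + (s1 - s0) * (refresh_hz - h0) / (h1 - h0)
```

Whether the G-Sync module does anything like interpolation from a table is an assumption on my part; the point is just that the acceleration can follow the refresh rate rather than being fixed.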

With the FreeSync models shown in the PCPer video, you have two extremes which would be different in their pixel responsiveness regardless of FreeSync being used. The LG model uses an IPS panel - and its pixel responsiveness is slower. The BenQ uses very aggressive pixel overdrive and can be expected to suffer from a degree of inverse ghosting. I would expect some other FreeSync models (like the upcoming MG279Q) to offer a better balance with the pixel responsiveness, even if it is just tuned with a single refresh rate in mind. It may not be necessary to tune it to perfection for a huge range of refresh rates if it is already tightly tuned to offer rapid acceleration without noticeable overshoot at 144Hz.

What is crucially important in all of this is that these artificial tests (including capturing static frames with a high speed camera) are not a good representation of what the eye sees when observing motion on a monitor. It is the movement of your eyes rather than the pixel response behaviour of a modern 'sample and hold' monitor that is the most significant contributor to motion blur. When you're talking about variable refresh rates, the degree to which the eye moves changes alongside refresh rate. You could have an absolutely perfectly tuned pixel overdrive algorithm for a given refresh rate - it doesn't change the fact that there is more motion blur (a greater degree of eye movement) at lower refresh rates. By the same token, this motion blur can quite easily mask a lot of the pixel response behaviour of the monitor.
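To put rough numbers on this (a back-of-the-envelope illustration, not a proper vision model): on a sample and hold display, perceived blur while the eye smoothly tracks motion scales with how far the eye travels during one refresh.

```python
def perceived_blur_px(speed_px_per_s, refresh_hz):
    """Rough perceived motion blur on a sample and hold display.

    While the eye smoothly tracks a moving object, each frame is held
    static for 1/refresh_hz seconds, so the image smears across the
    retina by roughly speed * frame_time. Illustrative only; it ignores
    pixel response times entirely, which is exactly the point: the blur
    exists even with instant pixel transitions.
    """
    return speed_px_per_s / refresh_hz

# The same 960 px/s motion smears ~16 px at 60Hz but under 7 px at 144Hz,
# regardless of how fast the panel's pixel transitions are.
```

This is why lower refresh rates always mean more motion blur on such a display, and why that blur can mask a lot of the pixel response behaviour.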

So whilst I do feel that Nvidia has an advantage in their ability to tightly control the acceleration of the monitor to reflect changing refresh rates, this isn't really as important in the real world as marketing or misrepresentation in videos would lead you to believe. :)

Just want to clear something up: are you saying that our eyes introduce motion blur? I kind of hope that's not what you're saying in the part I've bolded.
 
Just want to clear something up: are you saying that our eyes introduce motion blur? I kind of hope that's not what you're saying in the part I've bolded.

Yes, that is exactly what I am saying. Why do you hope I'm not saying that? It is the basis on which these 'strobe backlights' reduce motion blur, and how higher refresh rates can deliver lower levels of motion blur on a given monitor even if the pixel overdrive behaviour is similar to that at 60Hz.
 
Yes, that is exactly what I am saying. Why do you hope I'm not saying that? It is the basis on which these 'strobe backlights' reduce motion blur, and how higher refresh rates can deliver lower levels of motion blur on a given monitor even if the pixel overdrive behaviour is similar to that at 60Hz.

Nothing like that, but saccadic masking eliminates everything blurry that our human eyes pick up and removes it completely. While our eyes may actually create blur, we will never see blur caused by our own eye movement, so our eyes do not contribute at all to any form of blur.
 
What I have been wondering for some time now is what happens when you overclock a FreeSync-enabled monitor. Say you manage to get the LG 34UM67 from 75Hz to 85Hz while FreeSync is enabled. Does that extend the variable refresh rate range from 48-75 to 48-85? Does it leave it at 48-75 and go into fixed 85Hz mode the second you go over 75 fps? Or does it totally invalidate the FreeSync part of the monitor and leave it at 85Hz even when enabled through Catalyst?
 
Nothing like that, but saccadic masking eliminates everything blurry that our human eyes pick up and removes it completely. While our eyes may actually create blur, we will never see blur caused by our own eye movement, so our eyes do not contribute at all to any form of blur.

Your understanding is flawed I'm afraid, and not at all in line with the modern scientific understanding of the topic. It is quite easy to demonstrate that eye movement is a significant contributor to motion blur when viewing moving content on a monitor. There is a wealth of scientific literature on this topic; I'd advise a simple Google Scholar search if you would like to read some of it. There is also some literature (this, for example) that specifically mentions the effects of saccadic suppression and how it fits into this specific context. I also work with individuals in the Institute of Neuroscience who specialise in human vision, some of whom have worked on this sort of thing for many years. It is second nature to them, and I'm sure some could explain the role of saccadic suppression 'in context' far better than I could.

I would also advise you to read an article on my website entitled "Factors Affecting PC Monitor Responsiveness", paying particular attention to the "Sampling Method" and subsequent sections. There are some simple demonstrations explained and linked to there which run through the importance of eye movement when it comes to perceived blur on a monitor, as well as explaining why some monitor technologies (including CRTs and LCDs with strobe backlights) can reduce it. A simple demonstration for anybody of how your eye movement changes what you perceive on a screen: http://www.testufo.com/#test=eyetracking. Try taking a picture of that and note that the picture looks similar to what you see when your eyes are static, but not to what you see when they are tracking the motion. It is evident that saccadic masking (or suppression) does not counteract such things completely. Suppression is not elimination.
 
So instead of the monitors/panels, it's now PEBKAC to blame for FreeSync monitors ghosting more than G-Sync ones? There's an awful lot of PEBKAC with AMD technologies. :p

Even if it's a contributing factor in this instance (and it sounds like you're way overthinking it to me), it's a failure on AMD's part if their new technology isn't as compatible with the majority of their customer base's eyes/visual cortices.
 
So instead of the monitors/panels, it's now PEBKAC to blame for FreeSync monitors ghosting more than G-Sync ones? There's an awful lot of PEBKAC with AMD technologies. :p

Even if it's a contributing factor in this instance (and it sounds like you're way overthinking it to me), it's a failure on AMD's part if their new technology isn't as compatible with the majority of their customer base's eyes/visual cortices.

The point being made here is simply that the human eye does not perceive motion on a monitor in the same way as a camera. Demonstrating that there is an issue with pixel responsiveness with static captures of a screen is therefore not a realistic or useful representation of what the eye actually sees. It's why TFT Central are (rightfully) pushing for 'pursuit photography' to supplement or perhaps replace the usual static PixPerAn analysis and I am doing the same. It is a much more accurate reflection of what the eye sees when viewing motion on a monitor. And on a sample and hold display that means significant blur (especially at lower refresh rates like 60Hz) regardless of what your monitor's pixels are doing.
 
Your understanding is flawed I'm afraid, and not at all in line with the modern scientific understanding of the topic. It is quite easy to demonstrate that eye movement is a significant contributor to motion blur when viewing moving content on a monitor. There is a wealth of scientific literature on this topic; I'd advise a simple Google Scholar search if you would like to read some of it. There is also some literature (this, for example) that specifically mentions the effects of saccadic suppression and how it fits into this specific context. I also work with individuals in the Institute of Neuroscience who specialise in human vision, some of whom have worked on this sort of thing for many years. It is second nature to them, and I'm sure some could explain the role of saccadic suppression 'in context' far better than I could.

I would also advise you to read an article on my website entitled "Factors Affecting PC Monitor Responsiveness", paying particular attention to the "Sampling Method" and subsequent sections. There are some simple demonstrations explained and linked to there which run through the importance of eye movement when it comes to perceived blur on a monitor, as well as explaining why some monitor technologies (including CRTs and LCDs with strobe backlights) can reduce it. A simple demonstration for anybody of how your eye movement changes what you perceive on a screen: http://www.testufo.com/#test=eyetracking. It is evident that saccadic masking (or suppression) does not counteract such things completely. Suppression is not elimination.

In your opinion it is flawed. My understanding is that every time your eyes move they are saccading, and during the saccade all the images and movements processed by the brain are very blurry. In order for us to eliminate this blur (and we have to, or we would be walking around in a blur 24/7), saccadic masking occurs, which removes the blur, and in turn it also removes motion blur.

The TestUFO test is flawed due to optokinetic tracking: we simply can't track that fast. As for the image itself, I cut the exact same image and moved it across the screen at the same speed, and blur was there due to the speed of my monitor; slow it down and it's a moving 'static' image.
 
In your opinion it is flawed. My understanding is that every time your eyes move they are saccading, and during the saccade all the images and movements processed by the brain are very blurry. In order for us to eliminate this blur (and we have to, or we would be walking around in a blur 24/7), saccadic masking occurs, which removes the blur, and in turn it also removes motion blur.

The TestUFO test is flawed due to optokinetic tracking: we simply can't track that fast. As for the image itself, I cut the exact same image and moved it across the screen at the same speed, and blur was there due to the speed of my monitor; slow it down and it's a moving 'static' image.

You're way out of your depth. Your understanding of why saccadic suppression exists is correct, but your understanding of its implications when it comes to perceived motion blur on a monitor (i.e. in context) is completely wrong. :)

Perhaps you'd like to explain to everyone why, if you set a 120Hz monitor to 60Hz, there is a significant increase in motion blur even though the pixel response behaviour remains largely the same? Or why a 60Hz (sample and hold) OLED screen with pixel responses of a fraction of a millisecond still appears to blur? And why a 120Hz/144Hz model with very rapid pixel response times still suffers from significantly more blur than if a strobe backlight is employed on that same monitor? And whilst you're at it, why not discredit the hard work of the researchers whose work I mentioned or linked to? :p

And for the TestUFO page I linked to, why is it that you can quite clearly see a difference between your eyes tracking the motion vs. not? You do realise that if you stop tracking the motion, the pixel responses don't magically stop on the monitor, right? As I said, take a photo of this and you'll see a clear difference between what you perceive with your eyes as they move vs. the static snapshot in time taken by your camera. The static snapshot should look similar to what you observe in the moving image if your eyes are not tracking the motion. In fact, anybody else reading this can do the same, and can also see that the pace of motion there is not outlandishly fast and is fairly representative of moderate movement in a game or other scenario. The very fact that the image(s) appear different when your eyes are tracking the movement vs. not rules out pixel responsiveness having anything to do with it, and by process of elimination leaves perceived blur from eye movement as the reason for the observed differences. Unless you have a better theory that I'm sure dozens of scientists out there would love to hear?

If you don't like or don't understand the above, try another test. In fact again, anybody should try this. Stare at the following space under this line:













And move your mouse at a moderate pace from side to side. If you stare at the mouse cursor, you should notice it appears rather different whilst you move it than if you stare at a fixed point on the screen instead. It should appear to have far more distinct repeated instances when you're staring at the screen rather than at the cursor directly. Again, the pixel responses don't change just because you aren't looking at the mouse cursor directly, but the perceived motion does. This distinction is quite clear on a 60Hz sample and hold monitor, I find, more so if it is PWM-free as well.

Edit: I have asked a work colleague of mine who is an expert on eye movement and tracking motion. Surprised they were so receptive given that it's a Friday evening, but this is a sign of how respected I am amongst work colleagues (*wink*). The type of tracking typically used when observing motion on a monitor is smooth pursuit tracking, not saccadic tracking. So saccadic suppression does not have an effect!
 
What I have been wondering for some time now is what happens when you overclock a FreeSync-enabled monitor. Say you manage to get the LG 34UM67 from 75Hz to 85Hz while FreeSync is enabled. Does that extend the variable refresh rate range from 48-75 to 48-85? Does it leave it at 48-75 and go into fixed 85Hz mode the second you go over 75 fps? Or does it totally invalidate the FreeSync part of the monitor and leave it at 85Hz even when enabled through Catalyst?

I would presume it would work at 85Hz, as FreeSync works up to the monitor's maximum refresh rate.
 
Great watch Gerard, and it clears up a lot. It is clear from watching that G-Sync has a real advantage in storing the last frame in its memory buffer for when the frame rate drops below the monitor's minimum refresh, hence no stuttering or tearing. But when a FreeSync panel falls outside the monitor's refresh range it is basically on its own: it still tries to show 35 fps when only 30 fps is being delivered, as an example, and this in turn incurs judder and tearing.

I love science :D
 
While they were talking about FreeSync emulating at the driver level what the module does, I wondered whether that would have a performance impact. It wouldn't be outlandish to say AMD are themselves aware of this frame doubling that goes on with G-Sync (I'd be surprised if they hadn't looked at it this closely) and attempted to pull it off within the driver, but the algorithm/work would be done client-side, so to speak, whereas the G-Sync module offloads that work from the computer.

Great watch though, love to see old tech still being useful :)
 
Nice one Gerard, it certainly did clear something up...

From the vid:

It's actually the same kind of effect that some people were getting: the people who do notice the G-Sync flickering when the game hitches and drops to zero instantly. It's the same kind of brightness change.

As some don't notice the G-Sync hitch, some won't notice the VRR hitch either, but both need their flaws ironed out. AMD, however, have only just released FreeSync on its first driver, so perhaps it can be addressed.

As G-Sync has been out for quite some time, I'd think Nvidia should be spending more time sorting out their own flaws instead of putting all this effort into debunking FreeSync.
 