If it added blur, it must have been there in the first place
He didn't think it was right, as there were two spaceships instead of one, and thought something was wrong
If it added blur, it must have been there in the first place
Because Nvidia says it is. These people are clueless; they don't have the intelligence to understand anything, let alone investigate it themselves and come to an independent conclusion.
AMD made a massive deal about "possible ranges for freesync monitors" going as low as 9fps. They generated interest in freesync supporting wider ranges than gsync. Nvidia pointed out that gsync simply supports whatever panels are capable of and everyone laughed, saying that freesync monitors would be coming and they would support wider ranges than gsync.
Now that freesync is here and every monitor actually has a narrower band than the equivalent gsync monitors, it isn't at all surprising to see people asking why gsync monitors can go down to 30Hz, but the equivalent freesync monitor can only go down to 40Hz.
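For what it's worth, here's a rough sketch of how a variable refresh controller can handle frame rates below the panel's floor by repeating each frame. This is purely illustrative (the panel limits and function name are made up), not a description of Nvidia's actual firmware:

```python
# If the game's frame rate drops below the panel's minimum refresh,
# the controller can redraw each frame two or more times so the panel
# itself always stays inside its supported range.

def effective_refresh(fps, panel_min=30.0, panel_max=144.0):
    """Return (refresh_hz, repeats) that keeps the panel in its valid range."""
    if fps >= panel_max:
        return panel_max, 1              # cap at the panel's maximum
    repeats, refresh = 1, fps
    while refresh < panel_min:           # double, triple, ... each frame
        repeats += 1
        refresh = fps * repeats
    return refresh, repeats

for fps in (20, 35, 60, 120):
    hz, n = effective_refresh(fps)
    print(f"{fps} fps -> panel runs at {hz:.0f}Hz, {n}x per frame")
```

A module that knows the panel can pull tricks like this; a generic scaler that only advertises a 40-144Hz range just stops syncing below 40.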
AMD put it out there that freesync would be better than gsync and are now reaping what they have sown.
This and the ghosting issue show the advantage of having a dedicated module that can be tuned to the panel, instead of just hacking variable refresh onto a generic controller designed for fixed refresh.
It can be overcome, but it will need a completely new ASIC design that incorporates these controls for variable refresh rates.
No one is blaming AMD or attacking them, but there are currently advantages to Nvidia's variable refresh implementation that the current adaptive sync monitors lack.
Well this is becoming a very interesting thread indeed. I've read through the entirety of it with a great amount of interest. I would like to add a few additional thoughts of my own on the whole 'ghosting' thing.
The advantage that G-SYNC has is that the module itself replaces the 'scaler' and other assistive electronics that are placed in a monitor to accompany the panel. This module is responsible for the pixel overdrive algorithm, whereas on FreeSync models it's up to the manufacturer's own electronics to handle that. Nvidia is able to tune the pixel overdrive intensity 'on the fly' as refresh rate changes, whereas the manufacturers' solutions have only really been tuned with specific fixed refresh rates in mind (such as 144Hz).
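To illustrate what that 'on the fly' tuning could look like, here's a hypothetical sketch. None of this is Nvidia's actual algorithm; the calibration table, gain values and function names are invented. It just shows the general idea of interpolating overdrive strength across refresh rates instead of baking in a single 144Hz setting:

```python
import bisect

# Made-up calibration pairs of (refresh rate in Hz, overdrive gain);
# a fixed-refresh controller effectively uses just one row of this table.
OD_TABLE = [(40, 0.4), (60, 0.6), (100, 0.8), (144, 1.0)]

def overdrive_gain(refresh_hz):
    """Linearly interpolate the overdrive gain for the current refresh rate."""
    rates = [r for r, _ in OD_TABLE]
    i = bisect.bisect_left(rates, refresh_hz)
    if i == 0:
        return OD_TABLE[0][1]            # clamp below the table
    if i == len(OD_TABLE):
        return OD_TABLE[-1][1]           # clamp above the table
    (r0, g0), (r1, g1) = OD_TABLE[i - 1], OD_TABLE[i]
    t = (refresh_hz - r0) / (r1 - r0)
    return g0 + t * (g1 - g0)

def overdriven_level(current, target, refresh_hz):
    """Push the pixel drive past its target in proportion to the gain."""
    return target + overdrive_gain(refresh_hz) * (target - current)

print(overdriven_level(64, 128, 144))  # hard overshoot at 144Hz -> 192.0
print(overdriven_level(64, 128, 45))   # gentler push near the floor -> ~156.8
```

Tune the gain too high at low refresh rates and you get inverse ghosting like the BenQ; leave it fixed for 144Hz and it's wrong everywhere else in the variable range.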
With the FreeSync models shown in the PCPer video, you have two extremes which would be different in their pixel responsiveness regardless of FreeSync being used. The LG model uses an IPS panel - and its pixel responsiveness is slower. The BenQ uses very aggressive pixel overdrive and can be expected to suffer from a degree of inverse ghosting. I would expect some other FreeSync models (like the upcoming MG279Q) to offer a better balance with the pixel responsiveness, even if it is just tuned with a single refresh rate in mind. It may not be necessary to tune it to perfection for a huge range of refresh rates if it is already tightly tuned to offer rapid acceleration without noticeable overshoot at 144Hz.
What is crucially important in all of this is that these artificial tests (including capturing static frames with a high speed camera) are not a good representation of what the eye sees when observing motion on a monitor. It is the movement of your eyes rather than the pixel response behaviour of a modern 'sample and hold' monitor that is the most significant contributor to motion blur. When you're talking about variable refresh rates, the degree to which the eye moves changes alongside refresh rate. You could have an absolutely perfectly tuned pixel overdrive algorithm for a given refresh rate - it doesn't change the fact that there is more motion blur (a greater degree of eye movement) at lower refresh rates. By the same token, this motion blur can quite easily mask a lot of the pixel response behaviour of the monitor.
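To put some very rough numbers on that last point (the 1000 px/s panning speed is just an arbitrary example, not a measurement from any of these monitors):

```python
# On a sample-and-hold display, perceived blur while the eye tracks a
# moving object is roughly its speed multiplied by how long each frame
# is held on screen.

def blur_px(speed_px_per_s, refresh_hz):
    """Approximate eye-tracking blur width in pixels."""
    return speed_px_per_s / refresh_hz   # speed * frame hold time

speed = 1000  # object panning at 1000 px/s
for hz in (144, 60, 40):
    print(f"{hz}Hz: ~{blur_px(speed, hz):.0f}px of motion blur")
# 144Hz: ~7px, 60Hz: ~17px, 40Hz: ~25px
```

Twenty-odd pixels of eye-movement blur at 40Hz will mask a few pixels of overdrive error quite comfortably, which is the point about the artificial tests above.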
So whilst I do feel that Nvidia has an advantage in their ability to tightly control the acceleration of the monitor to reflect changing refresh rates, this isn't really as important in the real world as marketing or misrepresentation in videos would lead you to believe.
That looks darn smooth. Now I just need to find a decent 120-144Hz Gsync 4K monitor (doubt there are any?) that isn't going to cost me a fortune.
....with the exception of SLI users....who are still lacking the DSR+Gsync+SLI combination*
*unless it's in the latest drivers as I haven't tried yet
Gsync-SLI works though? It's just the DSR combo that doesn't?