
Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync

I'd like to see reviews of the panel running at a constant refresh rate rather than in variable mode, to see whether the ghosting is inherent to the panel being used. The reviews are completely biased and useless without that.
 

Older vid from CES, but it looks as if the Samsung 4K FreeSync monitor still has that wobbly stand. It goes down to 40Hz though; one other monitor in the vid goes down to 30Hz. 4K only has a 20Hz range apparently :(
 
40-60Hz, going by what he said in the vid.

That would need 2x Titan Xs at 4K (for no dips below 40 FPS) with an FPS limiter set to 60 FPS, which is very, very impractical.
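Just to put rough numbers on staying inside that 40-60 window, here's a toy frame-limiter sketch in Python. The timings and the 40/60 figures are only illustrative of that particular panel, and this is nothing like a real driver-level limiter:

Code:
import time

VRR_MIN_HZ = 40                        # panel's minimum FreeSync refresh
VRR_MAX_HZ = 60                        # panel's maximum refresh (the 4K model in the vid)
TARGET_FRAME_TIME = 1.0 / VRR_MAX_HZ   # cap at 60 FPS -> ~16.7 ms per frame


def render_frame():
    """Stand-in for the game's render call; pretend the GPU took 12 ms."""
    time.sleep(0.012)


def limited_loop(num_frames=300):
    """Cap the frame rate at VRR_MAX_HZ and warn when a frame would fall out of the window."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start

        # If the GPU can't keep up, the effective refresh drops below the VRR floor
        if elapsed > 1.0 / VRR_MIN_HZ:
            print(f"frame took {elapsed * 1000:.1f} ms -> below the {VRR_MIN_HZ} Hz window")

        # Sleep off the remainder so we never exceed 60 FPS
        remaining = TARGET_FRAME_TIME - elapsed
        if remaining > 0:
            time.sleep(remaining)


if __name__ == "__main__":
    limited_loop()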


Interesting that Ryan (from PCPer) mentions:

"The maximum refresh rate down to some minimum "X" Hz is a specification of the monitor/panel, not of FreeSync and not of AMD GPUs; that is something the monitor vendor, as well as the monitor manufacturer, decides based on whatever the controller can handle and what the pixel response times are."

Strange to hear it from him, as he is one of the biggest Nvidia shills :p
 
Well it isn't necessarily a good thing leaving it down to the vendors and manufacturers.
To be honest though... you'd have thought manufacturers would spend more effort on getting the dynamic range for FreeSync right, but LG has proven me wrong: they'd rather shoot themselves in the foot with a narrow range and deny themselves potential sales :p

It makes no sense, unless they want to push their potential customers into the open arms of the competition (as LG has pushed me away, to wait and see what Samsung has to offer).
 
Well this is becoming a very interesting thread indeed. I've read through the entirety of it with a great amount of interest. I would like to add a few additional thoughts of my own on the whole 'ghosting' thing.

The advantage that G-SYNC has is that the module itself replaces the 'scaler' and other assistive electronics that normally accompany the panel in a monitor. This module is responsible for the pixel overdrive algorithm, whereas on FreeSync models it's up to the manufacturer's own electronics to handle that. Nvidia is able to tune the pixel overdrive intensity 'on the fly' as the refresh rate changes, whereas the manufacturers' solutions have only really been tuned with specific fixed refresh rates in mind (such as 144Hz).
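To picture the difference between a single fixed-refresh overdrive tune and tuning on the fly, here's a toy Python sketch. The table values and function names are entirely made up for illustration and bear no relation to what actual scaler firmware or the G-SYNC module do internally:

Code:
# Hypothetical overdrive lookup: drive strength per refresh rate (illustrative values only)
OVERDRIVE_BY_REFRESH = {
    144: 0.80,   # aggressive acceleration suits very short frame times
    120: 0.70,
    100: 0.60,
    60:  0.40,
    40:  0.25,   # gentler drive to avoid overshoot when frames persist longer
}


def fixed_overdrive(_current_hz: float) -> float:
    """What a typical fixed tune does: one setting, chosen with 144 Hz in mind."""
    return OVERDRIVE_BY_REFRESH[144]


def adaptive_overdrive(current_hz: float) -> float:
    """Tuning on the fly: pick the entry nearest to the actual refresh rate."""
    nearest = min(OVERDRIVE_BY_REFRESH, key=lambda hz: abs(hz - current_hz))
    return OVERDRIVE_BY_REFRESH[nearest]


# At 45 Hz the fixed profile still drives pixels as hard as it would at 144 Hz,
# inviting overshoot (inverse ghosting), while the adaptive one backs off.
print(fixed_overdrive(45), adaptive_overdrive(45))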

With the FreeSync models shown in the PCPer video, you have two extremes which would be different in their pixel responsiveness regardless of FreeSync being used. The LG model uses an IPS panel - and its pixel responsiveness is slower. The BenQ uses very aggressive pixel overdrive and can be expected to suffer from a degree of inverse ghosting. I would expect some other FreeSync models (like the upcoming MG279Q) to offer a better balance with the pixel responsiveness, even if it is just tuned with a single refresh rate in mind. It may not be necessary to tune it to perfection for a huge range of refresh rates if it is already tightly tuned to offer rapid acceleration without noticeable overshoot at 144Hz.

What is crucially important in all of this is that these artificial tests (including capturing static frames with a high speed camera) are not a good representation of what the eye sees when observing motion on a monitor. It is the movement of your eyes rather than the pixel response behaviour of a modern 'sample and hold' monitor that is the most significant contributor to motion blur. When you're talking about variable refresh rates, the degree to which the eye moves changes alongside refresh rate. You could have an absolutely perfectly tuned pixel overdrive algorithm for a given refresh rate - it doesn't change the fact that there is more motion blur (a greater degree of eye movement) at lower refresh rates. By the same token, this motion blur can quite easily mask a lot of the pixel response behaviour of the monitor.
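As a back-of-the-envelope illustration of that point (the tracking speed below is assumed, not measured): on a sample-and-hold display the perceived blur is roughly your eye-tracking speed multiplied by how long each frame stays on screen.

Code:
# Rough model: blur extent ~ eye-tracking speed x frame persistence.
# The 960 px/s figure is just an example of an object scrolling across the screen.
TRACKING_SPEED_PX_PER_S = 960

for refresh_hz in (144, 120, 60, 40):
    persistence_s = 1.0 / refresh_hz          # sample-and-hold: frame held for the whole interval
    blur_px = TRACKING_SPEED_PX_PER_S * persistence_s
    print(f"{refresh_hz:>3} Hz: ~{blur_px:4.1f} px of eye-movement blur")

# 144 Hz -> ~6.7 px, 40 Hz -> ~24 px: the extra blur at low refresh rates can easily
# swamp a few pixels of overdrive-related ghosting or overshoot.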

So whilst I do feel that Nvidia has an advantage in their ability to tightly control the acceleration of the monitor to reflect changing refresh rates, this isn't really as important in the real world as marketing or misrepresentation in videos would lead you to believe. :)
 
To be honest though... you'd have thought manufacturers would spend more effort on getting the dynamic range for FreeSync right, but LG has proven me wrong: they'd rather shoot themselves in the foot with a narrow range and deny themselves potential sales :p

It makes no sense, unless they want to push their potential customers into the open arms of the competition (as LG has pushed me away, to wait and see what Samsung has to offer).

I think that would come down to profit margins rather than intentionally pushing sales elsewhere or killing your brand image.

You can never know about the backdoor deals though in the corporate world.
 
That would need 2x Titan Xs at 4K (for no dips below 40 FPS) with an FPS limiter set to 60 FPS, which is very, very impractical.


Interesting that Ryan (from PCPer) mentions:

"The maximum refresh rate down to some minimum "X" Hz is a specification of the monitor/panel, not of FreeSync and not of AMD GPUs; that is something the monitor vendor, as well as the monitor manufacturer, decides based on whatever the controller can handle and what the pixel response times are."

Strange to hear it from him, as he is one of the biggest Nvidia shills :p

Why?
It is true. Nvidia have always been very open in saying that G-Sync supports whatever the panel supports; it was AMD who created this myth that FreeSync was suddenly going to support 9Hz, when most of us pointed out that FreeSync would most likely be just as limited by the panels used (in fact more so, as equivalent G-Sync monitors work at 30-60Hz or 30-144Hz, whereas FreeSync works from 40Hz on the released models).
 
Why?
It is true. Nvidia have always been very open in saying that G-Sync supports whatever the panel supports; it was AMD who created this myth that FreeSync was suddenly going to support 9Hz, when most of us pointed out that FreeSync would most likely be just as limited by the panels used (in fact more so, as equivalent G-Sync monitors work at 30-60Hz or 30-144Hz, whereas FreeSync works from 40Hz on the released models).

Whoever believed 9Hz would actually be useful should really start looking at getting a console :cool:

AMD marketing at its best? :D

I like both AMD and Nvidia hardware (it depends what I want from it), but I take slightly more time to make "informed decisions" using other sources, especially when claims come straight from the marketing department.
 
Depends on the game I suppose; BF4 on a 295X2 can fit in that range according to HardOCP articles.

I run Crossfire R9-290s and yes it does (similar performance to the 295x2). But you are talking about a pretty optimised engine.

Most other games at 4K fluctuate too much for this to be worthwhile. It's a very demanding resolution, especially with the eye candy cranked up.
 
That is great Tony, glad you got a good one and are happy.
Shame that some are not having the same great experience; hopefully they get it sorted.

Really??? Sorry to hear that mate. Are many people having issues then? I've not really been keeping up with the news on them; I'm still very surprised each time I hear people saying they are bad and have ghosting issues.
 
Really??? Sorry to hear that mate. Are many people having issues then? I've not really been keeping up with the news on them; I'm still very surprised each time I hear people saying they are bad and have ghosting issues.

Maybe the ghosting has to do with the panel and the camera used, going by what I have seen in videos? Not sure.

I don't own G-Sync or FreeSync personally.

Every user is different though; some can spot issues and some can't (refresh rates, FPS, input lag, etc.).
 
Yeah, we also need to take into account what a high-speed camera picks up vs what we see in real time.

If a camera highlights ghosting but we don't see that ghosting in real time, is it really an issue?

So it would be nice, when these reviews drop, if they could also say whether the ghosting is visible in person after using the high-speed camera.
 
Yeah, we also need to take into account what a high-speed camera picks up vs what we see in real time.

If a camera highlights ghosting but we don't see that ghosting in real time, is it really an issue?

So it would be nice, when these reviews drop, if they could also say whether the ghosting is visible in person after using the high-speed camera.

With the G-Sync and FreeSync panels being different, and the camera itself being a potential source of error, maybe screen recording (ShadowPlay, Raptr recording) might show this in more detail. That's the only way to highlight it online imo (seeing it in the real world would be better of course).

Would any G-Sync or FreeSync users with some spare time be able to record footage that highlights ghosting under normal circumstances?
 