
Nvidia Explains Why Their G-Sync Display Tech Is Superior To AMD's FreeSync

It was said from the off that each G-Sync module needs tuning for the monitor it is going in. It isn't a standard plug-and-play module. I don't profess to be proficient with monitor knowledge at all, but I wouldn't make such silly statements like Humbug :o
 
G-Sync also uses software though, does it not? The driver is the software; the module is the hardware.

With FreeSync, the software is the driver; the hardware is in the DisplayPort controller and the GPU.

The difference being that the G-Sync module is tweaked for the display it's running in.

So far we don't know if AMD's software can tweak anything in the DisplayPort hardware to do the same job.
 
The G-Sync module does not control the back/side lighting (strobe lighting), you're just making things up now.

And even if it did, FreeSync could do it too, because AMD's equivalent of the G-Sync module is on the GPU itself.

No, the G-Sync module can assist the display in controlling ghosting in a way that FreeSync can't match at the moment. The G-Sync module is tuned for each monitor to help reduce ghosting.
 
I really have them rattled now :o

Again, the reason the G-Sync module exists is that it was the only way for Nvidia to communicate with the display in the way it needed to for the result.

It gets around a port communication problem: DisplayPort could not support the communication between the GPU and the display, the mutual handshakes.

So what Nvidia did was put the G-Sync module in between to act as the go-between: it would pick up the frames from the GPU, buffer them, and then communicate with the screen about its timings.

With AMD the GPU itself does this, because with the VESA DP standard it can do it itself without the piggy in the middle. They solved the communication problem instead of going around it.

That's it, I'm off to watch the news.
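The two signalling paths described above can be sketched as a toy model. Everything here is invented for illustration (the class names `GsyncModule` and `AdaptiveSyncGpu`, the single-frame flow); it is not real driver code, just the shape of "module as go-between" versus "GPU drives the panel directly":

```python
class Panel:
    """A display that refreshes whenever it is told to."""
    def __init__(self):
        self.refreshes = []

    def refresh(self, frame):
        self.refreshes.append(frame)


class GsyncModule:
    """Sits between GPU and panel: buffers the frame, then drives the timing."""
    def __init__(self, panel):
        self.panel = panel
        self.buffer = None

    def receive(self, frame):
        self.buffer = frame               # pick up the frame from the GPU
        self.panel.refresh(self.buffer)   # then tell the screen when to refresh


class AdaptiveSyncGpu:
    """With the VESA DP Adaptive-Sync handshake the GPU drives the panel
    timing itself -- no middleman."""
    def __init__(self, panel):
        self.panel = panel

    def present(self, frame):
        self.panel.refresh(frame)         # GPU dictates the screen's timing


panel_a, panel_b = Panel(), Panel()
GsyncModule(panel_a).receive("frame0")
AdaptiveSyncGpu(panel_b).present("frame0")
print(panel_a.refreshes == panel_b.refreshes)  # same end result, different path
```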
 
Well, good to see some constructive discussion going on...

So can someone sum up for me: what makes G-Sync better?

Is all this talk of ghosting limited to one monitor? For FreeSync, that is.

How many of the people saying one is better than the other have actually tried both?
 
No, the G-Sync module can assist the display in controlling ghosting in a way that FreeSync can't match at the moment. The G-Sync module is tuned for each monitor to help reduce ghosting.

What is it the G-Sync module does to reduce ghosting?

I always thought ghosting was panel-tech dependent, relating to pixel response times?
 
That's Nvidia's marketing speak doing its job right there.

This is a myth.

http://support.amd.com/en-us/search/faq/226

FreeSync does not need the G-Sync module because with FreeSync the GPU knows where and when the frames are. AMD's GPU dictates the screen's timings.

Nvidia needs that G-Sync module because, without the VESA standards, its GPU does not know where or when the frames are. It is... or rather was, a communication issue between ports.

As I read it, that article is just talking about synchronising the monitor's refresh rate to the framerate of the graphics card; Nvidia are talking about optimising the voltage fed to the panel depending on what its refresh rate is.

Nvidia claims its G-Sync module is tuned for each display to prevent ghosting by changing the amount of voltage going to the pixels at different refresh rates, allowing the pixels to untwist and retwist at different speeds.
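That claim can be pictured as a per-panel calibration table mapping refresh rate to an overdrive level. The refresh points and levels below are entirely made up; the sketch only shows the shape of the idea, with intermediate rates interpolated between tuned points:

```python
# Hypothetical per-panel calibration: refresh rate (Hz) -> overdrive level.
# All numbers are invented for illustration.
CALIBRATION = [(30, 1.10), (60, 1.25), (100, 1.40), (144, 1.55)]

def overdrive_level(refresh_hz):
    """Linearly interpolate the tuned overdrive level for a refresh rate,
    clamping to the nearest calibrated point outside the table's range."""
    points = sorted(CALIBRATION)
    if refresh_hz <= points[0][0]:
        return points[0][1]
    if refresh_hz >= points[-1][0]:
        return points[-1][1]
    for (hz0, v0), (hz1, v1) in zip(points, points[1:]):
        if hz0 <= refresh_hz <= hz1:
            t = (refresh_hz - hz0) / (hz1 - hz0)
            return v0 + t * (v1 - v0)

print(overdrive_level(80))  # somewhere between the 60 Hz and 100 Hz settings
```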
 
As I read it, that article is just talking about synchronising the monitor's refresh rate to the framerate of the graphics card; Nvidia are talking about optimising the voltage fed to the panel depending on what its refresh rate is.

Thanks for the bit of info regarding upping voltage for pixel response.

Isn't that exactly what overdrive on monitors does?
 
I really have them rattled now :o

Again, the reason the G-Sync module exists is that it was the only way for Nvidia to communicate with the display in the way it needed to for the result.

It gets around a port communication problem: DisplayPort could not support the communication between the GPU and the display, the mutual handshakes.

So what Nvidia did was put the G-Sync module in between to act as the go-between: it would pick up the frames from the GPU, buffer them, and then communicate with the screen about its timings.

With AMD the GPU itself does this, because with the VESA DP standard it can do it itself without the piggy in the middle. They solved the communication problem instead of going around it.

That's it, I'm off to watch the news.


Wrong again.
Nvidia created G-Sync because of a lack of VESA standards, and since they went the hardware route they had the flexibility to do other things, such as reduce ghosting.
 
Thanks for the bit of info regarding upping voltage for pixel response.

Isn't that exactly what overdrive on monitors does?

Overdrive doesn't work well with variable refresh rates; what the G-Sync module allegedly does is something more dynamic, so those same kinds of benefits can be seen with VRR.
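A rough sketch of why a single overdrive setting struggles under variable refresh: conventional overdrive is tuned for one fixed rate, so under VRR the same setting is off for most frames. The linear "ideal level" relationship and all numbers here are invented purely to show the mismatch:

```python
TUNED_FOR_HZ = 144   # hypothetical: static overdrive calibrated at max refresh
STATIC_LEVEL = 1.55  # hypothetical "correct" overdrive level at 144 Hz

def ideal_level(hz):
    """Pretend relationship: slower refresh needs gentler overdrive."""
    return 1.0 + 0.55 * hz / TUNED_FOR_HZ

# Under VRR the refresh rate swings with framerate, so a fixed setting
# overdrives the pixels more and more as the rate drops.
for hz in (144, 90, 45):
    error = STATIC_LEVEL - ideal_level(hz)
    print(f"{hz:3d} Hz: static overdrive off by {error:+.2f}")
```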
 
I really don't like companies that take the "we'll charge whatever we can get away with" strategy. It really puts me off supporting them, especially when they try to extend that money-hungry attitude to several (increasingly proprietary) components and products too.

I like the way they are getting the best tech possible, but something tells me I don't want to be ripped off to the point where I'm not supporting the little guy, I'm just feeding money-hungry tools, and probably not even getting that huge a difference. I think both companies make good stuff, and when it's undeniable (like with laptop GPUs) I bought an Nvidia GPU, but I think a lot of people just get a touch of OCD and let companies tell them what is best and what they need.

Good on Nvidia for being somewhat of a perfectionist, but I don't think they're really doing it with much love for the gamers anymore. They're starting to just price people out for the fun of it. It's very early days for FreeSync and G-Sync, but we'll see how it all goes a bit further down the line. I don't think he said enough to convince me it's that big a difference, though.
 
They're starting to just price people out for the fun of it.

It's because they can.

AMD are trying very hard to make it, but sadly they don't have the cash flow to invest in and dominate one market, let alone two (CPU and GPU), and now they're trying their luck in monitor tech?

With a solid investor or a buyout, things would change rapidly and we might actually see some advancements faster than expected.
 
I find it funny that so many people are putting their theories forward as fact, when the only FACT is that none of us knows exactly how G-Sync does its job, or how FreeSync does. We don't know exactly what is down to the solution (G-Sync/FreeSync) and what is down to the panel manufacturers. We can only guess and theorise about how it's done and who or what is doing which part, until the official parties come forward and start sharing the specific details.

I'm sure Tom Petersen or "random engineer from AMD I can't remember" are very intelligent people who know more than most of us, but that doesn't mean they will share the info they know, or that they will do anything other than play whatever PR game the company wants them to play. It's sad really, because I want to hear them speak unrestricted about the hardware they clearly know a thing or two about and have a passion for.
 
There is no way in this world that Nvidia are going to give up the secret sauce that they use in the G-Sync module.

Of course not :) I never expected them to. It would take something like a lawsuit, where someone has copied the design somewhat or entirely, for us to find out how they have done it. At least that's what I personally think would be required.

PS: 1,000 posts... MM, here I come at last.
 
I was surprised to see FreeSync not working below 40 fps, exactly when you need it most.

I've been playing Alien: Isolation at 4K and it regularly dips into the high twenties, yet it's still perfectly playable with G-Sync.
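The below-minimum-rate problem can in principle be worked around by showing each frame more than once, so the panel's actual refresh rate stays inside its supported range (the approach AMD later marketed as low framerate compensation). A minimal sketch with invented panel limits:

```python
# Hypothetical panel range; real monitors vary.
PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 144

def effective_refresh(fps):
    """Repeat each frame enough times to keep the panel inside its range.

    Returns (effective refresh rate in Hz, repeats per frame)."""
    if fps >= PANEL_MIN_HZ:
        return fps, 1
    multiplier = 2
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    return fps * multiplier, multiplier

print(effective_refresh(28))  # (56, 2): each frame shown twice -> 56 Hz
```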
 