Nvidia to support Freesync?

No, but AMD did take nVidia's idea (whose implementation was over-complex, expensive and, as it turned out, unsustainable) and made it so that "It Just Works" at every price point.

I thought it was the VESA consortium who were talking about it for ages, and how they were going to introduce adaptive sync with their next spec update, and then Nvidia, who were part of the consortium, decided to pre-empt the spec change by launching GSync.
 
I thought it was the VESA consortium who were talking about it for ages, and how they were going to introduce adaptive sync with their next spec update, and then Nvidia, who were part of the consortium, decided to pre-empt the spec change by launching GSync.

Very well could be....
 
To be fair, from what I understand (though it is pieced together from third-hand information from conferences and emails), the resistance within VESA was from industries that use the tech professionally, i.e. air traffic control displays, which used a form of it (as in eDP) years before G-Sync came out and were probably looking to protect their market; that's a good bit of why nVidia went away and made G-Sync. But still.

I thought it was the VESA consortium who were talking about it for ages, and how they were going to introduce adaptive sync with their next spec update, and then Nvidia, who were part of the consortium, decided to pre-empt the spec change by launching GSync.

It's all in the timeline. VESA could have probably brought it forward a lot quicker than they did.
 
I'm laughing my butt off right now.. They had to sacrifice their stance on VESA adaptive sync support. Didn't they say that their GPUs wouldn't even be able to support it? Or was that just Maxwell? Anyway, this is great news for the consumer, and the fact that they don't completely disable support on non-verified adaptive sync monitors is also great news. It's certainly going to hurt AMD a bit, but if AMD's counter is then to release a proper gaming GPU, Nvidia has no leverage left. Overall I couldn't care less about AMD's or Nvidia's "feelings"; it's just great we got some choice back into the market.

It does make me wonder if Nvidia knows something about AMD's future GPU lineup and is trying to take some of the FreeSync owners away with this change before AMD can do some damage. To tinfoil hat or not to tinfoil hat.. hmm
 
It's not just Maxwell ^^^, Pascal too.

I'm laughing my butt off right now.. They had to sacrifice their stance on VESA adaptive sync support. Didn't they say that their GPUs wouldn't even be able to support it? Or was that just Maxwell? Anyway, this is great news for the consumer, and the fact that they don't completely disable support on non-verified adaptive sync monitors is also great news. It's certainly going to hurt AMD a bit, but if AMD's counter is then to release a proper gaming GPU, Nvidia has no leverage left. Overall I couldn't care less about AMD's or Nvidia's "feelings"; it's just great we got some choice back into the market.

It does make me wonder if Nvidia knows something about AMD's future GPU lineup and is trying to take some of the FreeSync owners away with this change before AMD can do some damage. To tinfoil hat or not to tinfoil hat.. hmm

People who buy GPUs based on the type of screen they own are an extreme minority. Likewise, those who bought into G-Sync thinking they were forever locked into nVidia no longer are, so AMD becomes an available option again.

6 and two 3's
 
[image: 1a332akwxx821.png]
lol :)
 
It would be nice if a firmware update allowed G-Sync monitors to operate as FreeSync ones on AMD cards.


That would require AMD to develop and support it, which would also require licensing from Nvidia. Let's say Nvidia allowed AMD a license: do you really think AMD are going to spend the resources writing firmware and drivers to support Nvidia's Gsync standard?

Nvidia doesn't write AMD's drivers.
 
Apparently G-Sync monitor sales have taken a nosedive; people are fed up with Nvidia taxes.

Hmm...

How are Nvidia going to differentiate FreeSync from Gsync now?

Nvidia expensivesync (TM), an extra £200. Selected overpriced monitors only. Drivers will lock out regular freesync use :D
 
The presentation showed some having shimmering or pulsing at certain framerates, etc.
Pretty weird since they presumably all work fine with AMD GPUs (major bugs notwithstanding). Maybe nVidia just haven't got their VRR implementation fully nailed yet in their driver.

It is an interesting question - not enough information out there yet to really know what is happening but I'm sure certain tech sites will jump on it.

nVidia have a certain way of supporting the technology, i.e. how low framerates are handled, which I'm guessing some panels aren't up to, and unlike the AMD implementation they probably won't compromise on it - at least not initially anyhow.
I think it took AMD a year or so to get LFC working in their drivers, and even then only for monitors with a 2.5:1 max:min ratio. So yes, it could be that nVidia haven't sorted out LFC yet.
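For anyone wondering what LFC actually does when the framerate drops below the panel's minimum, here's a rough Python sketch of the general idea; the function name, numbers and fallback behaviour are mine for illustration, not anything pulled from AMD's or nVidia's actual drivers.

```python
import math

# Rough sketch of Low Framerate Compensation (LFC): when the game's framerate
# drops below the panel's minimum VRR refresh, the driver repeats each frame
# enough times to push the effective refresh back inside the supported range.
# Hypothetical helper for illustration only - not actual driver code.
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float):
    """Return (frame repeat count, effective panel refresh) for a given framerate."""
    if fps >= vrr_min:
        return 1, fps                      # already inside the VRR window, nothing to do
    repeats = math.ceil(vrr_min / fps)     # smallest multiplier that lifts us above vrr_min
    effective = fps * repeats
    if effective > vrr_max:
        # No clean multiple fits inside the window - this is why a wide
        # max:min ratio is needed (48-144 Hz works, 48-75 Hz doesn't:
        # 40 fps x 2 = 80 Hz, which is above the 75 Hz ceiling).
        return 1, vrr_min                  # fall back to pinning the panel at its minimum
    return repeats, effective

print(lfc_refresh(30, 48, 144))   # -> (2, 60): each frame shown twice, panel runs at 60 Hz
print(lfc_refresh(40, 48, 75))    # -> (1, 48): range too narrow for LFC to kick in
```

That ratio requirement is why narrow-range panels (say 48-75 Hz) never got LFC, while 48-144 Hz ones did.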
 
Apparently G-Sync monitor sales have taken a nosedive; people are fed up with Nvidia taxes.


The added expense on G-Sync monitors is mostly just the manufacturers' inflated profit margins. The actual G-Sync module and license fee is about $10-20.
 
And if there were any firmware update for the monitor, it'd come from the manufacturers, not Nvidia.
And in any case, firmware updates for monitors aren't really a thing.

I don't think Gsync Monitors will suddenly become "freesync" compatible.
 
The presentation showed some having shimmering or pulsing at certain framerates, etc.

Pretty weird since they presumably all work fine with AMD GPUs (major bugs notwithstanding). Maybe nVidia just haven't got their VRR implementation fully nailed yet in their driver.

People still get it twisted that G-Sync is simply a proprietary parallel of Adaptive-Sync technology when it isn't. The panels go through a QA process with stringent scrutineering before the modules are even considered.
 
Pretty weird since they presumably all work fine with AMD GPUs (major bugs notwithstanding). Maybe nVidia just haven't got their VRR implementation fully nailed yet in their driver.

Not enough information yet to really know - but I suspect nVidia have a certain approach to boost clarity and reduce latency when the framerate is dropping/recovering that they won't compromise on for certification (might see an additional option for a workaround appear later).
 
So if I'm Samsung, LG, Asus, Acer or any other monitor manufacturer, why would I bother releasing both a G-Sync and a Freesync model of the same panel? If both Nvidia and AMD GPUs support Freesync and Freesync is cheaper, the vast majority of people will buy the Freesync model.

Doesn't this mean G-Sync is now terminally ill?
 
So if I'm Samsung, LG, Asus, Acer or any other monitor manufacturer, why would I bother releasing both a G-Sync and a Freesync model of the same panel? If both Nvidia and AMD GPUs support Freesync and Freesync is cheaper, the vast majority of people will buy the Freesync model.

Doesn't this mean G-Sync is now terminally ill?

Not really, as when you skip past all the fanboyism, gsync is far better than a good amount of the FreeSync panels. Sadly with open standards comes no quality control, which is why some FreeSync panels are amazing, and others are abysmal.

Gsync panels will still offer a guaranteed standard, so there may be an incentive to either make a gsync model, OR to improve quality on the FreeSync side to allow gsync certification.
 
The added expense on G-Sync monitors is mostly just the manufacturers' inflated profit margins. The actual G-Sync module and license fee is about $10-20.

FPGAs used for G-Sync cost a lot more than $10-20.

It's the small volume of sales that's the killer - it requires a higher margin per unit, which in turn reduces sales further.
 