Nvidia and Freesync, hypothetical.

Soldato · Joined 6 Nov 2005 · Posts: 2,865
Hello. I was just having a think. I have a feeling it will never happen, as Gsync seems to be too profitable for them, but if Nvidia did ever decide to support Freesync, what do we think it would take to get it working? Would it be as simple as a driver update, or do we think there are hardware changes that would need to be made?
 
AFAIK on the newer cards it would simply be done as a driver update - some older cards might not work with it, though.
 
A few years ago didn't someone hack the AMD drivers and manage to get PhysX working on an AMD card? If so, could the same not happen with the Nvidia drivers and Freesync?
 
Considering that "Gsync" on some of the gaming laptops is actually an implementation of "Adaptive Sync" (basically what Freesync is, without the branding) without the Gsync module, I think at the hardware level there should be no reason why current Nvidia cards would not be able to support Adaptive Sync; it's more a case of Nvidia choosing not to support it at the software level.
 
Hello. I was just having a think. I have a feeling it will never happen, as Gsync seems to be too profitable for them, but if Nvidia did ever decide to support Freesync, what do we think it would take to get it working? Would it be as simple as a driver update, or do we think there are hardware changes that would need to be made?

OK, accuse me of nitpicking by all means, but this just shows how successful AMD's marketing of Freesync has been, even though I'm fairly sure AMD never thought it would work out this well for them.

NVidia will never support Freesync, because Freesync is a proprietary AMD technology. Now VESA's Adaptive-Sync is the open standard that allows AMD to utilise the hardware in their cards to give us the variable refresh rate.
AMD coined the term Freesync because of:

A) no royalty fees on VESA's Adaptive-Sync
B) no large extra cost for the monitor makers to include a proprietary module (i.e. the Gsync module) in the monitor.

Now of course you still need an Adaptive-Sync capable scaler in the monitor, but as we have seen these result in much cheaper monitors than the Gsync equivalents, and as more and more monitors are including an Adaptive-Sync scaler anyway, it gives the manufacturers a chance to put Freesync on the box to get a few more sales.

Now both Freesync and Gsync work over DisplayPort, but as far as I know only Freesync works over HDMI. I believe that Gsync doesn't currently work over HDMI due to bandwidth limitations, but I might be wrong there. This will change with HDMI 2.1, as it allows much higher bandwidth than 2.0.
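To put a rough number on the bandwidth point, here's a back-of-the-envelope estimate (my own illustration, not from any spec sheet; it counts raw pixel data only and ignores blanking intervals and link encoding overhead, which push the real figures higher still):

```python
def gbps(width, height, refresh_hz, bits_per_channel, channels=3):
    """Rough uncompressed video data rate in gigabits per second."""
    bits_per_frame = width * height * bits_per_channel * channels
    return bits_per_frame * refresh_hz / 1e9

# 3440x1440 @ 144 Hz with 10-bit colour: already past HDMI 2.0's
# 18 Gbps ceiling, but comfortably inside HDMI 2.1's 48 Gbps.
print(round(gbps(3440, 1440, 144, 10), 1))  # → 21.4
```

Even this optimistic figure overruns HDMI 2.0, which is consistent with the idea that high-refresh VRR over HDMI had to wait for 2.1's extra headroom.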

Now of course, up to now VESA's Adaptive-Sync open standard has been an optional part of the DisplayPort standard, but this is soon to change with the upcoming HDMI 2.1 standard, as VRR (variable refresh rate) isn't an optional extra there in the way Adaptive-Sync is with DisplayPort.

Now the NVidia hardware to utilise VESA's Adaptive-Sync is definitely present in NVidia cards, as the laptops that are Gsync capable don't have a Gsync module in them.

You can see that AMD have reined in the monitor manufacturers somewhat by being much more stringent with the requirements for Freesync 2 monitors, as there were some real shockers in the early stages of Freesync monitors, with tiny ranges, but things are much better now with most monitors.
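One reason those tiny ranges were such a problem: AMD's Low Framerate Compensation (LFC) only works when the panel's maximum refresh is at least roughly twice its minimum, because it relies on repeating frames to stay inside the VRR window. A simplified sketch of the idea (my own illustration; the driver's real heuristics are more involved):

```python
def lfc_capable(min_hz, max_hz):
    # Rule of thumb: LFC needs max >= 2 * min so a repeated frame
    # lands back inside the supported refresh range.
    return max_hz >= 2 * min_hz

def effective_refresh(fps, min_hz, max_hz):
    """Refresh rate the panel would actually run at for a given frame rate."""
    if fps > max_hz:
        return max_hz                  # cap at the panel's maximum
    if fps >= min_hz:
        return fps                     # inside the VRR window: match 1:1
    if not lfc_capable(min_hz, max_hz):
        return min_hz                  # narrow range: fall back, stutter likely
    multiple = 2
    while fps * multiple < min_hz:     # repeat each frame enough times
        multiple += 1
    return min(fps * multiple, max_hz)

# A 48-144 Hz panel can show a 30 fps game at 60 Hz (each frame twice);
# a narrow 48-60 Hz panel cannot, and drops to its floor.
print(effective_refresh(30, 48, 144))  # → 60
print(effective_refresh(30, 48, 60))   # → 48
```

So a 48-60 Hz "shocker" loses smooth VRR the moment a game dips below 48 fps, while a 48-144 Hz monitor sails through it.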

Eventually you would have to assume that NVidia will use HDMI 2.1 ports on their cards at some point, maybe even in the upcoming 11xx series, but will they utilise VRR over HDMI? Who knows; maybe it would be Gsync Lite, or something.

It will be interesting when AMD use HDMI 2.1, because as I said earlier DisplayPort has no royalty fees but HDMI does, and I cannot see AMD suddenly calling it NQSFS (Not Quite So Free Sync). I actually expect they will just absorb the fees, as they are already paying them for HDMI 2.0 anyway.

Whatever happens, it's certainly going to be interesting over the next year or so in the monitor world.
 
nVidia are not silly.

I would imagine that, whatever they choose to do, their upcoming hardware will allow them the option to use VESA VRR technologies without Gsync. It'll just be disabled at a firmware or driver level.
 
nVidia are not silly.

I would imagine that, whatever they choose to do, their upcoming hardware will allow them the option to use VESA VRR technologies without Gsync. It'll just be disabled at a firmware or driver level.
They make money off the Gsync modules that are sold to monitor manufacturers, and in turn this extra cost to the manufacturers is passed on to buyers in the price premium that they have to pay for Gsync monitors.

It's probably similar to why Intel motherboards are always more expensive compared to AMD motherboards: Intel can command a higher price for the chipsets used by board manufacturers than AMD can.
 
Hello. I was just having a think. I have a feeling it will never happen, as Gsync seems to be too profitable for them, but if Nvidia did ever decide to support Freesync, what do we think it would take to get it working? Would it be as simple as a driver update, or do we think there are hardware changes that would need to be made?

On Pascal cards it's only a driver update. The hardware to support it is there.
But Nvidia will only do it when they start losing market share.


nVidia are not silly.

I would imagine that, whatever they choose to do, their upcoming hardware will allow them the option to use VESA VRR technologies without Gsync. It'll just be disabled at a firmware or driver level.

Nvidia's own words back in 2015 were that they make exclusive products for an exclusive market and want their customers to feel special.

Look at those HDR Gsync monitors. Nvidia could easily have built to the DP1.4 spec.
However, they went the long way around, using an expensive FPGA and a module costing north of £800 on its own (the FPGA alone costs £500+, and could be more, as its street price is £2,500). That shows a commitment not to switch to open standards.

On the other hand, we have LG putting onto the market next month a 3440x1440 144Hz nano-IPS 10-bit (possibly 8-bit+FRC) HDR600 Freesync 2 monitor with just a DP1.4 connection (something similar is expected from Samsung around autumn).

The Gsync model will have DP1.2, no HDR, 120Hz and 8-bit, and is still expected to be more expensive.
Also, where are those Samsung Gsync monitors? They were in the presentation sporting HDR, but that was 2½ years ago. Possibly Samsung's wishful thinking died because of Nvidia's solutions.


And then there are the TVs. Nvidia goes and makes their own TV line-up.
Yet on the other hand we already have the Samsung NU8000 series supporting Freesync from a starting price of £1,000 (the 55" and larger models support it), and the next round of Samsung TVs will go for Freesync 2.

As I wrote above, Nvidia will only switch to supporting Adaptive Sync when they start losing a big chunk of market share. And then I wonder, wouldn't the loyal and faithful Nvidia consumer feel they had been milked all those years? Would they buy another Nvidia product knowing that they had been conned for X years?
(X = 3 at the time of writing)
 
OK, accuse me of nitpicking by all means.
Excuse me for nitpicking right back.
NVidia will never support Freesync, because Freesync is a proprietary AMD technology. Now VESA's Adaptive-Sync is the open standard that allows AMD to utilise the hardware in their cards to give us the variable refresh rate.
Freesync is only proprietary on the GPU side. What the OP means is: will Nvidia ever connect to Adaptive Sync monitors? That successful marketing you were talking about is the reason why Adaptive Sync and Freesync have become one and the same in the eyes of many people, because they aren't Freesync monitors, they are Adaptive Sync monitors.
AMD coined the term Freesync because of:
A) no royalty fees on VESA's Adaptive-Sync
B) no large extra cost for the monitor makers to include a proprietary module (i.e. the Gsync module) in the monitor.
Actually, no, AMD coined the term Freesync because no costs would have to be paid to AMD by the monitor manufacturers to use Adaptive Sync. In the beginning it was just a sort of dig at Nvidia, who charge manufacturers to use their Gsync module, but the name sort of stuck.
Now both Freesync and Gsync work over DisplayPort, but as far as I know only Freesync works over HDMI. I believe that Gsync doesn't currently work over HDMI due to bandwidth limitations, but I might be wrong there. This will change with HDMI 2.1, as it allows much higher bandwidth than 2.0.
Correct, but I wonder what will happen. Will Nvidia need a more expensive module to handle Gsync over both HDMI and DisplayPort?
Now the NVidia hardware to utilise VESA's Adaptive-Sync is definitely present in NVidia cards, as the laptops that are Gsync capable don't have a Gsync module in them.
You can't know whether Nvidia's desktop cards support Adaptive Sync based on their Gsync laptops. The requirements for supporting Adaptive Sync have been a mandatory part of the embedded DisplayPort specification for years.
It will be interesting when AMD use HDMI 2.1, because as I said earlier DisplayPort has no royalty fees but HDMI does, and I cannot see AMD suddenly calling it NQSFS (Not Quite So Free Sync). I actually expect they will just absorb the fees, as they are already paying them for HDMI 2.0 anyway.
Monitor manufacturers still won't have to pay anything to AMD to use VRR over HDMI, so they will still use Freesync.
Whatever happens, it's certainly going to be interesting over the next year or so in the monitor world.
Yes it will. And even more interesting in the TV world!!
 
AFAIK on the newer cards it would simply be done as a driver update - some older cards might not work with it, though.

On Pascal cards it's only a driver update.

Just curious, how do you know that Nvidia have the hardware needed for adaptive sync in their Pascal cards? Why would they put in this extra hardware if they have no intention of connecting to an Adaptive sync monitor?
 
Just curious, how do you know that Nvidia have the hardware needed for adaptive sync in their Pascal cards? Why would they put in this extra hardware if they have no intention of connecting to an Adaptive sync monitor?

Most of the hardware side is a small change to the monitor hardware with VESA adaptive sync - nVidia support it via eDP on laptops under the guise of "G-Sync" using exactly the same chips.
 
A few years ago didn't someone hack the AMD drivers and manage to get PhysX working on an AMD card? If so, could the same not happen with the Nvidia drivers and Freesync?
I believe what you're thinking of is when somebody hacked the Nvidia drivers to remove the "if there's an AMD card in the system then disable PhysX" function, allowing users with a high-end AMD card to use a low-spec Nvidia card (e.g. a GTX 750 Ti) as a dedicated PhysX card.

That function is disabled by Nvidia by design.
 
Just curious, how do you know that Nvidia have the hardware needed for adaptive sync in their Pascal cards? Why would they put in this extra hardware if they have no intention of connecting to an Adaptive sync monitor?

Wondering the same myself. I vaguely remember reading something, perhaps around the launch of Freesync, about Nvidia lacking the proper scaler within the desktop GPUs to support Adaptive Sync.
 
I believe what you're thinking of is when somebody hacked the Nvidia drivers to remove the "if there's an AMD card in the system then disable PhysX" function, allowing users with a high-end AMD card to use a low-spec Nvidia card (e.g. a GTX 750 Ti) as a dedicated PhysX card.

That function is by design disabled by Nvidia.

I just googled to see if I had been having strange dreams of PhysX running on AMD.

I am slightly relieved to see this isn't the case and IS something I read ;)
(10 years ago... how time flies!!!)
http://hexus.net/tech/news/graphics/14123-when-hacks-prevail-nvidia-physx-running-amd-graphics-card/
 
Is it still the case that Gsync laptops have no additional hardware at either end and are actually using Adaptive Sync (aka Freesync)?
 
Most of the hardware side is a small change to the monitor hardware with VESA adaptive sync - nVidia support it via eDP on laptops under the guise of "G-Sync" using exactly the same chips.

There is hardware needed on the GPU side as well. I asked AMD about this when they had their question and answer session here. You can't use their laptop chips as proof that their desktop counterparts support Adaptive Sync: supporting VRR has been a mandatory part of the eDP spec for years, and Nvidia's laptop GPUs have been designed to support the eDP standard. In the same way, AMD's GCN APUs could support Adaptive Sync, even the 1st-generation ones, because the APUs had to adhere to the eDP specification, but not all desktop GCN cards could.
 