
AMD: FAQ for Project FreeSync is up

It's good that both AMD and Nvidia are now addressing issues like this, as it's one of those "issues" we've simply had to accept for a while. (Don't take me as sounding ungrateful; it's nice we can progress in ways other than X% incremental performance increases.)

The problem is that both of them now seem to be heading down separate paths with their own solutions instead of properly trying to work together. Ultimately that's going to be bad for all of us end users... (I don't want to be in a position in a few years' time of having to upgrade my monitor every time I upgrade my graphics card.)

O/T: Sad my 7950s don't support it. I'd be willing to shell out for a monitor that supports this, but the card not supporting it, I mean come on!... I know they have to make money, but it's not cool being shafted. (I can't even begin to imagine the frustration of 280 owners; AMD have dropped the ball a bit there.)

But yeah, good that the industry is now progressing once again!

I think they will support it, just not for games, so it should be OK with video.
On laptops, wasn't the tech originally for power saving? So I wonder if something like that may be available too?
 
Hello everyone,

I've looked at the various questions you've posted here and I'm currently working with my colleagues on providing the answers and clarifications you've asked for. I will provide an update within the next few days.
 
It'll all settle down eventually, it's just that we seem to be going through a long overdue period of rapid development in monitors. The last couple of years have seen the biggest advancement in affordable monitor technology since LCDs first became affordable IMO.

In another couple of years I expect it will all have shaken out with either a single spec or most gaming monitors supporting both.
 

Yup, well said.

I suppose it will depend on just how well the two technologies work. If there is not a great deal of difference between the look and performance of the two, then it might very well be a case of them becoming available on most monitors.
We could even get to the odd situation where ncal-sync ;) is used for the DisplayPort and G-Sync is used for the HDMI and DVI ports on the same monitor. (LOL, just imagine if it gets to that stage; you could end up paying a premium for the version with the DisplayPort option :D:p:D)
 
OK, bear with me while I try to explain FreeSync and all this in my own words. And I just want to say before I start that this post isn't pro-AMD or pro-Nvidia; it's just my own research into this.

Variable refresh rates, and the power saving that goes with them, have been around since 2009 as part of the embedded DisplayPort specification. All AMD did was make a proposal to VESA to get it made part of the desktop DisplayPort specification, and that is now called Adaptive-Sync. It's an optional feature of DisplayPort 1.2a and later, so a monitor has to specifically implement it; a DisplayPort logo on its own doesn't guarantee Adaptive-Sync support.
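To picture what "variable refresh" actually means in practice, here's a toy sketch (my own illustration, not anything from the spec or a vendor driver; the 40-144 Hz window is a made-up example range): the panel's refresh interval simply tracks the GPU's frame time, clamped to whatever range the panel supports.

```python
# Toy sketch (not any vendor's actual driver code): how an adaptive-sync
# panel's refresh interval could track the GPU's frame time, clamped to
# the panel's supported range. The 40-144 Hz window is a made-up example.

def refresh_interval_ms(frame_time_ms, min_hz=40, max_hz=144):
    shortest = 1000.0 / max_hz   # fastest the panel can refresh
    longest = 1000.0 / min_hz    # slowest it can go before it must refresh anyway
    return min(max(frame_time_ms, shortest), longest)

print(refresh_interval_ms(16.7))  # ~60 fps: panel matches the frame exactly
print(refresh_interval_ms(4.0))   # faster than 144 Hz: clamped to ~6.94 ms
print(refresh_interval_ms(50.0))  # slower than 40 Hz: clamped to 25 ms
```

Outside that window the panel falls back to its limits, which is why the supported refresh range of a given monitor matters.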

Just a quick word on VESA: it's a standards body, all about developing open standards for the display industry. Nvidia, Intel and AMD are all members of VESA, and at the moment 3 of the 7 board of directors are from these companies.

OK, back to Adaptive-Sync. To use Adaptive-Sync you need a hardware controller that allows asynchronous updating. This could sit in the monitor or in the graphics card. It would be too much trouble for monitor manufacturers, as they would have to get it to work with every graphics card manufacturer, so these controllers will be on the graphics cards.

This is why it won't work with older GCN cards, or with Nvidia cards (more on this later). It's also the reason all GCN APUs support it: they have this controller built in as part of the power-saving features. Intel onboard graphics, for the same reason, will also be able to connect to an Adaptive-Sync monitor.

Once you've got your Adaptive-Sync monitor and your graphics card with the controller, all you need is a driver.

Why FreeSync? Well, Adaptive-Sync was called FreeSync because there was no licensing fee needed to use it, and the name sort of stuck. Later on, the standard became known as Adaptive-Sync, and AMD is calling their method of connecting to an Adaptive-Sync monitor FreeSync.

Who thought of the idea first, AMD or Nvidia? I don't know; if I had to make an educated guess, I would say both companies started thinking about this at roughly the same time. Both companies are on the VESA board, and both would have known the change to the desktop DisplayPort specification was coming. AMD put the necessary hardware into its GCN 1.1 cards; Nvidia started working on G-Sync.

Why did Nvidia release G-Sync when there was an open standard coming? Well, simples: Kepler doesn't have the hardware controller needed, and I would guess they have made no plans to put the hardware controller into Maxwell either. So they needed to have a controller on the monitor itself, and the G-Sync module was born.

I just think it was a good business move to get G-Sync out first. It's something they needed to do, I think.
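As a rough illustration of why any of this sync business matters, here's a toy comparison (my own sketch, not vendor code or anything from the spec) of when a finished frame actually reaches the screen under fixed 60 Hz v-sync versus a variable-refresh panel with a made-up 144 Hz upper limit:

```python
import math

# Toy comparison (my own illustration): time at which a frame that finished
# rendering at `done_ms` actually gets displayed.

def vsync_display_time(done_ms, refresh_ms=1000.0 / 60):
    # Fixed v-sync: the frame waits for the next fixed refresh tick.
    return math.ceil(done_ms / refresh_ms) * refresh_ms

def vrr_display_time(done_ms, min_interval_ms=1000.0 / 144):
    # Variable refresh: the panel refreshes as soon as the frame is done,
    # bounded only by the panel's maximum refresh rate.
    return max(done_ms, min_interval_ms)

for done in (10.0, 20.0, 35.0):
    print(f"frame done at {done} ms -> "
          f"v-sync shows it at {vsync_display_time(done):.1f} ms, "
          f"VRR shows it at {vrr_display_time(done):.1f} ms")
```

A frame finishing at 20 ms misses the 16.7 ms tick and waits until 33.3 ms under fixed v-sync, while the variable-refresh panel shows it immediately; that waiting is the stutter/judder these technologies are trying to remove.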
 
Sums it up nicely, tbh. Two points, though...

Afaik AMD's final implementation won't be called FreeSync, or at least that's what Huddy said; it's more likely to come with some spin or other on the real tech name of Adaptive-Sync.

My second point is more a personal feeling. It being driver-controlled, we still don't know if the driver will be a one-size-fits-all kind of deal or if it's going to require frequent updates per title etc. If it's the latter, the glaring issues both sides have had in just getting a display output of UHD@60Hz functioning flawlessly are enough to make me worry this could have the same stuttering start to life.
 

Thank you, I didn't realise AMD had planned to change the name.

As for driver-related problems, each company will have to write their own driver for their own controller, but the driver should be invisible to games etc. Since Intel and AMD both have experience with this from the last few years, I'm hoping driver problems will be minimal.
 

Hello again everyone,

Quick update: I just wanted to let you know we're still working on providing you the answers, especially to the technical questions, which require more time. Thank you for your patience, everyone.
 

Are you waiting for a response from your technical people, or are your technical people still trying to work it out?
 
He's waiting for a response; Sam relays information. Thracks is working on answering the technical questions.

You've basically retold me what I was asking. I know this guy relays the info; I know it comes from somewhere else. :)

The question was: do they know the answers, or are they still trying to get fully to grips with it?
 

Thanks for the summary, mate.
 
I am thinking about putting together an infographic for all this sync business. Good idea?

Very good idea; just make sure you get the facts straight or you will get slaughtered.

Maybe put up the proposed points and let us argue, I mean discuss, what the facts actually are... :D
 