OK, bear with me while I try to explain FreeSync and all of this in my own words. Before I start, I just want to say that this post isn't pro-AMD or pro-Nvidia; it's just my own research into the topic.
Variable refresh rates, and the power saving that goes with them, have been around since 2009 as part of the Embedded DisplayPort (eDP) specification. All AMD did was propose to VESA that this be made part of the desktop DisplayPort specification, and that is now called Adaptive-Sync. It's optional in DisplayPort 1.2a, and the expectation is that it will be a requirement in DisplayPort 1.3, which would mean any DisplayPort 1.3 monitor has Adaptive-Sync.
Just a quick word on VESA: it's a standards body, dedicated to developing open standards for the display industry. Nvidia, Intel and AMD are all members, and at the moment three of the seven seats on the board of directors are held by these companies.
OK, back to Adaptive-Sync. To use it, you need a hardware controller that allows asynchronous updating of the display. This could sit either in the monitor or in the graphics card, but it would be too much trouble for monitor manufacturers, since they would have to make it work with every graphics card maker. So these controllers will be on the graphics cards.
This is why it won't work with older GCN cards, or with Nvidia cards (more on this later). It's also why all GCN APUs support it: they have this controller built in as part of their power-saving features. Intel integrated graphics, for the same reason, should also be able to connect to an Adaptive-Sync monitor.
Once you've got your Adaptive-Sync monitor and a graphics card with the controller, all you need is a driver.
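To make the "asynchronous updating" idea concrete, here's a toy simulation (my own sketch, nothing from the actual spec or drivers) comparing when frames reach the screen on a fixed 60 Hz v-synced display versus a variable-refresh one. All function names and the 45 fps scenario are made up for illustration:

```python
# Toy model of frame presentation, not vendor code.
# render_times: moments (in seconds) when the GPU finishes each frame.

def fixed_vsync_display_times(render_times, refresh_hz=60.0):
    """With plain v-sync, a finished frame waits for the next fixed refresh tick."""
    tick = 1.0 / refresh_hz
    shown = []
    for t in render_times:
        ticks = int(t / tick)
        if t > ticks * tick:       # round up to the next refresh boundary
            ticks += 1
        shown.append(ticks * tick)
    return shown

def adaptive_sync_display_times(render_times, max_hz=144.0):
    """With a variable-refresh display, the panel refreshes as soon as a frame
    is ready, limited only by how fast it can physically refresh (max_hz)."""
    min_interval = 1.0 / max_hz
    last, shown = 0.0, []
    for t in render_times:
        when = max(t, last + min_interval)
        shown.append(when)
        last = when
    return shown

# A steady 45 fps game: a new frame is finished every 1/45 s.
frames = [i / 45.0 for i in range(1, 6)]
fixed = fixed_vsync_display_times(frames)
adaptive = adaptive_sync_display_times(frames)

intervals = lambda xs: [b - a for a, b in zip(xs, xs[1:])]
print("fixed v-sync intervals (ms):", [round(x * 1000, 1) for x in intervals(fixed)])
print("adaptive intervals (ms):   ", [round(x * 1000, 1) for x in intervals(adaptive)])
```

On the fixed 60 Hz display, 45 fps frames alternate between waiting one and two refresh ticks, so the on-screen intervals jump between roughly 16.7 ms and 33.3 ms (that's the judder these technologies are trying to remove); in the adaptive model every frame appears at an even ~22.2 ms cadence, matching the render rate.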
Why FreeSync? Well, Adaptive-Sync was originally called FreeSync because there was no licensing fee needed to use it, and the name sort of stuck. Later on the standard became known as Adaptive-Sync, and AMD now calls its method of connecting to an Adaptive-Sync monitor FreeSync.
Who thought of the idea first, AMD or Nvidia? I don't know. If I had to make an educated guess, I'd say both companies started thinking about this at roughly the same time. Both sit on the VESA board, so both would have known the change to the desktop DisplayPort specification was coming. AMD put the necessary hardware into its GCN 1.1 cards; Nvidia started working on G-Sync.
Why did Nvidia release G-Sync when there was an open standard coming? Well, simple: Kepler doesn't have the hardware controller needed, and I would guess they made no plans to put it into Maxwell either. So they needed a controller on the monitor itself, and the G-Sync module was born.
I just think getting G-Sync out first was a good business move, and probably something they needed to do.