That is rubbish and people trying to rewrite history - I doubt that if it was down to VESA and AMD we'd even have VRR now - it isn't like it's new technology at all - it's been used in professional VDUs going back probably 30 years before G-Sync.
No, it's people ignoring obvious truths. It takes longer than a year to get a new standard agreed, new scalers designed, taped out, produced and tested, and into new panels that have gone through their own design cycle. It's that simple; it takes longer than that. Making an FPGA version would take perhaps a tenth of the time. For AMD to have FreeSync screens out and working within a year of G-Sync all but confirms this was in the works long before G-Sync was announced. There is no other option. Even a small, basic chip such as a monitor scaler takes time to design, test, tape out and actually produce; you can't do it in a few months just because you want to. And before you design a chip, you actually have to have a reason to design it, which means talks about something that requires such chips start a minimum of a few months before design on the chip even begins.
Something like VRR/FreeSync, from standard to panels on the market, probably takes at least a couple of years, maybe longer.
As said, there is a single reason to go FPGA instead of vastly cheaper dedicated hardware: time to market. Nvidia were trying to beat 'something' to market; I wonder what that something was. According to you, AMD and everyone else were actively saying no to it and only reacted after G-Sync was made. Leaving aside again the massive time to market that such a standard, the chip design, and the monitor design cycle require: if everyone else only decided to do this after G-Sync went public, then it wouldn't have mattered when G-Sync went public. Nvidia could have launched a year later with dedicated chips costing a tenth as much and it still wouldn't have mattered.
I literally said, in the days around the G-Sync reveal/launch, that I was 100% certain an industry standard was due to be announced in the next couple of months and that Nvidia were just trying to beat it to the punch to lock in their hardware users. I remember long threads of the same Nvidia people telling me that what Nvidia were doing was unique and super difficult, that AMD couldn't do it, and that it would never be done as an industry standard because it's too complex, etc. Then FreeSync/variable refresh rate got announced.
It was patently obvious from the very second Nvidia did this. When there is a better, cheaper, compatible-with-everything method of doing something, and out of the blue one company makes an absurdly overpriced version whose only benefit is time to market... it's exceptionally obvious why they've done it: if a free industry-standard method gets announced first, Nvidia can't release second and lock their customers into being ripped off further.