Variable refresh rate (VRR) is the story of AMD building on tech that already existed in laptops, and of Nvidia jumping a mainstream feature to market to rip off its customers, again.
Look at any industry standard, and look at design times for even simpler chips. VRR was already available in laptops. AMD was talking with monitor manufacturers to agree on support for VRR: writing a standard they could all get behind, submitting that standard, and getting all the major players onto a design cycle for scalers to support it. Nvidia knew about this and used FPGAs, which can massively shortcut your time to market, to create G-Sync and 'beat' the standard. By beating the standard and locking their customers into G-Sync-only support, they got to sell massively, massively marked-up G-Sync modules for years.
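To see why VRR was worth fighting over in the first place, here is a minimal sketch of what a VRR-capable scaler does compared with a fixed-refresh one. All numbers and function names are illustrative assumptions, not taken from any actual standard or product:

```python
import math

def fixed_refresh_interval(render_ms, refresh_hz=60.0):
    """Fixed refresh: a finished frame waits for the next vblank, so the
    displayed interval is the render time rounded UP to a whole refresh
    period (the source of judder/stutter)."""
    period_ms = 1000.0 / refresh_hz
    return math.ceil(render_ms / period_ms) * period_ms

def vrr_interval(render_ms, min_hz=48.0, max_hz=144.0):
    """VRR: the scaler holds vblank until the frame is ready, clamped to
    the panel's supported refresh range (illustrative 48-144 Hz here)."""
    return min(max(render_ms, 1000.0 / max_hz), 1000.0 / min_hz)

frames = [9.0, 17.0, 23.0]  # hypothetical render times in ms
print([round(fixed_refresh_interval(t), 2) for t in frames])  # quantized to 16.67 ms steps
print([round(vrr_interval(t), 2) for t in frames])            # tracks render time
```

The point of the sketch: fixed refresh quantizes every frame to a multiple of the refresh period, while VRR lets the display interval track the GPU's actual frame time, which is why a fairly simple change in the scaler makes such a visible difference.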
FPGAs are pretty much designed to be programmable chips, and for something basic like a scaler that is pretty easy to pull off. A dedicated hardware chip still takes 18-24 months to go through design, tape-out, verification, etc.
There is a reason G-Sync was launched in a rush with one screen only a few months before the FreeSync standard was submitted to VESA, and there is a reason FreeSync screens started launching a year later: discussions with monitor makers over how to support VRR likely started a year before G-Sync/FreeSync were announced. That's how the industry works. So AMD, as per usual, was working with everyone on an industry standard to push everyone forwards, and Nvidia saw a chance to profit and screw over its own customers.
You can't shortcut time to market on full hardware support or industry standards. There is no way FreeSync and Adaptive-Sync get that much support only a year later unless the monitor makers had all been discussing this a LONG time before G-Sync launched. Literally no chance.
Again, remember this: G-Sync never screwed over a single AMD user, and FreeSync costs never screwed over a single AMD customer, but G-Sync costs ripped off every single Nvidia buyer. This is what Nvidia does: it sees where the market is going thanks to the work of others and tries to take advantage and profit off it. It let AMD push hardware tessellation until devs supported it and then jumped on board, rather than the other way around, with Nvidia eating the die cost until it was supported and AMD coming in later. Nvidia then pushed enough tessellation power to win benchmarks, but again at its own customers' expense, putting in more hardware than needed just to cheat benchmarks rather than provide a meaningful experience for its users.