Very interesting reading.
If you need to "sync" in order to get a smooth desktop, I'm going to assume there is something else going on with your setup. Perhaps look for a firmware update for your monitor, or simply check that you aren't actually running at 60 Hz when it only looks like a higher rate.
It's quite common to have (for example) FreeSync not work without installing AMD drivers. For some setups, this can also cause the monitor to default to 60 Hz until the AMD GPU drivers are installed. If you ever notice what looks like a 60 Hz refresh rate (after uninstalling drivers) and your GPU shows up in Device Manager as the Microsoft Basic Display Adapter (as a result of using DDU), then check what refresh rate you are actually running at.
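As a quick sanity check after a driver wipe, you can query the refresh rate Windows itself reports for the primary display. Here's a minimal Python sketch using the Win32 `GetDeviceCaps` API with the `VREFRESH` index; it's an illustration of the check described above, not an official tool, and it only returns a value on Windows:

```python
import ctypes
import sys

def current_refresh_rate_hz():
    """Return the refresh rate (Hz) Windows reports for the primary
    display, or None when not running on Windows."""
    if sys.platform != "win32":
        return None  # Win32 APIs are unavailable elsewhere
    VREFRESH = 116  # GetDeviceCaps index for vertical refresh rate
    user32 = ctypes.windll.user32
    gdi32 = ctypes.windll.gdi32
    hdc = user32.GetDC(None)  # device context for the whole screen
    try:
        return gdi32.GetDeviceCaps(hdc, VREFRESH)
    finally:
        user32.ReleaseDC(None, hdc)

print(current_refresh_rate_hz())
```

If this prints 60 while your monitor is supposed to be 144 Hz, the driver reinstall (or the OS display settings) is where to look first, before blaming sync tech.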
I don't think the issue is the term "sync" but rather what screen tearing is and how it occurs.
"...Yet for others you have to go frame by frame looking for it (tearing) in other monitors."
This statement did not suggest, imply, nor state that monitors have no tearing. That's a strawman argument, as there are no absolutes.
"Free/G Sync is only intended for monitors with very poor HW scalers."
Perhaps I could clarify further. I was thinking of how Nvidia uses its own scaler hardware instead of relying on the manufacturer's, and how AMD advised manufacturers to use better hardware, which is part of its certification process.
Furthermore, how vsync worked before FreeSync/G-Sync is not the subject here. AMD (from my own research) had monitor manufacturers update their hardware in order to be certified for FreeSync, while Nvidia provided its own hardware. The point was that, at the time, the hardware used in "gaming monitors" was adequate for desktop use but not necessarily for gaming use.
Since then, most, if not all, manufacturers are on board, providing better hardware, firmware updates, higher refresh rates, etc., in the monitors we see today.
At the end of the day, if your monitor is actually worth its weight, you shouldn't need to have FreeSync/G-Sync on, especially for desktop use (come on now). I know I've stopped using it for some time now, and I have not noticed a single difference since.
And yes I still agree, "What AMD and Nvidia have done was found a way to compensate for crappy monitors." That won't change.
How the industry's response to FreeSync/G-Sync has changed things, though, as they continue to improve upon it.
I've gamed on and seen quite a few monitors, and with it turned off I've seen very few that needed it on. For those that didn't, neither I nor the clients noticed any tearing or smoothness issues while gaming. To imply that all monitors tear in the same exact way, in the same exact manner, is simply false.
And yes, disabling it does improve input latency. Sheesh, are we actually having the same discussion here?
So again, if the hardware on the monitors was "fine" back then, why did AMD and Nvidia push for better hardware? It's a very simple question, guys, come on.
However, to that end, is FreeSync/G-Sync dead? IMO, yes: as monitor manufacturers continue to improve their own hardware, it's going to become redundant. Heck, even Win10 has a variable refresh rate option now.
Who saw that coming? From my own experience, the frequency and amount of tearing isn't the same issue I saw a few years ago. I'm sorry if that has any sentimental value to you.
But FreeSync 2, on the other hand... not so much. I think we need AMD to keep pushing the standards up, no different from how standards are pushed for HDMI/DisplayPort, for example.