There was nothing wrong with the post you quoted, and nothing misleading about it. I mean, what's misleading about it? He makes some accurate points without being for or against either vendor (although I don't think we should be getting into price comparisons until probably the second half of this year).
FreeSync is, to put it basically, AMD's proprietary method of using Adaptive Sync.
If Nvidia adopted Adaptive Sync, they would make their own proprietary method, as would Intel.
I wasn't really disagreeing, more giving a reason why things can look proprietary, and why even if they were "open", the others possibly couldn't run them anyway.
Anyways... I just read this and found it a little concerning, in truth.
What happens below that limit and above it differs from what NVIDIA has decided to do. For FreeSync (and the Adaptive Sync standard as a whole), when a game renders at a frame rate above or below this VRR window, the V-Sync setting is enforced. That means on a 60 Hz panel, if your game runs at 70 FPS, then you will have the option to enable or disable V-Sync; you can either force a 60 FPS top limit or allow 70 FPS with screen tearing. If your game runs under the 40 Hz bottom limit, say at 30 FPS, you get the same option: V-Sync on or V-Sync off. With it off, you would get tearing but optimal input/display latency, but with it on you would reintroduce frame judder when you cross between V-Sync steps.
There are potential pitfalls to this solution though; what happens when you cross into that top or bottom region can cause issues depending on the specific implementation. We'll be researching this very soon.
https://www.youtube.com/watch?v=8rY0ZJJJf1A
So from that, if you go over 60 FPS on either the 4K or the 2560x1080 screen (both 60 Hz panels), it will either default to V-Sync (stutter) or no V-Sync (tearing)... That strikes me as poor and not ideal at all. Also, on the 4K (I think it was) and the 2560x1080, only 40 Hz to 60 Hz is covered by FreeSync, so that 20 Hz window is just too tight to use properly.
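To put my reading of it into something concrete, here's a rough sketch of the behaviour as I understand it. This is purely illustrative and my own assumption of a 60 Hz panel with a 40-60 Hz FreeSync window, not anything taken from AMD's drivers:

```python
# Rough model of the behaviour described in the quote above, as I read it.
# Assumes a 60 Hz panel with a 40-60 Hz FreeSync (VRR) window -- my assumption,
# not driver code.

VRR_MIN_HZ = 40   # bottom of the FreeSync window (assumed)
VRR_MAX_HZ = 60   # top of the window / panel refresh rate (assumed)

def display_behaviour(game_fps: float, vsync_enabled: bool) -> str:
    """Return what you'd see for a given frame rate and V-Sync setting."""
    if VRR_MIN_HZ <= game_fps <= VRR_MAX_HZ:
        # Inside the window: the refresh rate tracks the game, so no tearing or judder.
        return f"VRR active: panel refreshes at {game_fps:.0f} Hz"
    if vsync_enabled:
        # Outside the window with V-Sync on: frame rate is stepped/capped -> judder.
        return "V-Sync enforced: capped or stepped to the refresh rate, judder"
    # Outside the window with V-Sync off: uncapped -> tearing, but lowest latency.
    return "V-Sync off: tearing, but optimal input/display latency"

for fps in (30, 45, 70):
    print(fps, "FPS ->", display_behaviour(fps, vsync_enabled=True))
```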
Maybe I have read it wrong?