Just an interesting thought/question (inspired by a post on another forum) regarding the whole 'buying a FreeSync monitor with an Nvidia card' debate, and one I hadn't considered before...
On my current 60Hz potato monitor, tearing, when it used to occur (i.e. back when my cards were too slow to drive it beyond 60fps, or more recently before I discovered Nvidia's Fast Sync), was super obnoxious and highly noticeable.
HOWEVER, with, say, a 144Hz FreeSync monitor, wouldn't any tearing be dramatically less visible/noticeable thanks to the 2.4x faster screen refresh? I hadn't considered this angle, and sadly I have no way of experiencing tearing on a 144Hz monitor, but if it was barely noticeable that would actually make me consider the FreeSync option for all of its other advantages (price, improved bit depth, etc.)
Yes, 3YOB is not quite correct with his explanations: stutter can occur whether vsync is on or off, though for different reasons, and tearing occurs when the frames drawn are not synced with the monitor refresh (hence you see partly drawn frames).
To answer your question, tearing is indeed less noticeable the higher your refresh rate, assuming your hardware can maintain a framerate at least equal to that refresh rate. There are even techniques that Unwinder has implemented in his Afterburner software that let you attempt to move the tear line toward the edge of the screen with vsync off, making it less noticeable.
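To put some rough numbers on the "higher refresh = less visible tearing" point: a tear line can only persist for at most one refresh cycle before the next scanout replaces it, so its on-screen duration shrinks in proportion to the refresh rate. A minimal sketch of that arithmetic (the numbers are just illustrative, not measurements):

```python
# Rough sketch: a tear line persists for at most one refresh cycle,
# so its maximum on-screen duration is the refresh period.

def refresh_period_ms(refresh_hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144):
    print(f"{hz}Hz: tear visible for up to ~{refresh_period_ms(hz):.2f} ms")
# 60Hz  -> ~16.67 ms per cycle
# 144Hz -> ~6.94 ms per cycle, i.e. 2.4x shorter
```

Same tear artifact, but at 144Hz it's on screen for well under half the time, which matches the intuition in the question above.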
Personal note: as an Nvidia user with good hardware I run vsync ON so that I can use blur reduction to good effect. I'm very sensitive to both tearing and blurring. I own the BenQ XL2730Z FreeSync monitor, which I bought on release (selling my Asus PG278Q) specifically for its bright blur reduction mode rather than for any adaptive sync tech. In addition, I use Blur Busters / RealNC's technique of tweaking my refresh rate slightly above the monitor's standard refresh (by fractions of a hertz) so the framerate never quite caps out, which lessens the input lag from vsync. Alternatively you can cap your fps a fraction below your refresh rate, but I prefer the former method.
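For anyone curious about the fps-cap variant of that trick, the idea is just to set an RTSS cap a tiny fraction below the monitor's *actual* refresh rate so vsync's buffer queue never fills. A hedged sketch (the 0.01 offset and the 144.001Hz figure are assumptions for illustration; the Blur Busters guide linked below covers measuring your real refresh rate and picking the value):

```python
# Sketch of the "cap fps just below refresh" low-lag vsync variant.
# The offset of 0.01 fps is an illustrative assumption, not a recommendation.

def low_lag_fps_cap(actual_refresh_hz: float, offset: float = 0.01) -> float:
    """Return an RTSS-style framerate cap just under the measured refresh rate."""
    return actual_refresh_hz - offset

# Hypothetical 144Hz panel that actually scans out at 144.001Hz:
print(f"Set RTSS cap to {low_lag_fps_cap(144.001):.3f} fps")
```

RTSS accepts fractional framerate limits, which is why it's the tool of choice for this rather than an in-game cap.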
IMO this is the best of both worlds, and I don't think I'd use a monitor without motion blur tech until screens become massively less blurry/ghosty in standard motion; only then would I consider going back to adaptive sync.
Basically, adaptive sync is for those who cannot maintain max framerate. If you can stay at or near your max refresh you don't get much benefit from it, so you may as well go with blur reduction. (Blur reduction can make things look more stuttery at low refresh rates, and it also makes tear lines far easier to spot, hence the need for vsync on.)
https://www.blurbusters.com/howto-low-lag-vsync-on/
FYI, always use RTSS to cap your framerate instead of any in-game method; it's more accurate.