Any point in GSYNC now?

As per the title. Aside from a few differences like ULMB, is there any point in spending an extra 200 quid to get locked into nVidia cards?
 
Not an extra £200, no, but G-Sync may well have variable pixel overdrive, which reduces the ghosting at lower refresh rates, something I dislike about my current Freesync monitor.

Just ask those poor console players wondering why some of their Freesync monitors are ghosting at 60 fps on Xbox.
 
I don't like the tax on them, but GSync also has a bigger framerate range where it's active. Freesync 2 might be on par, not sure.

When any half-decent VRR/Freesync implementation drops below its 40-60 Hz minimum, it simply doubles the frame, effectively producing the same result.

So if its minimum is 40 Hz, it'll refresh at 78 Hz if you go down to 39 fps.
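
If it helps, here's roughly how that frame-multiplication maths works as a toy Python sketch (purely illustrative; the 40 Hz minimum is just an example and no driver literally does it this way):

```python
def effective_refresh(fps, vrr_min=40):
    """Toy LFC: show each frame multiple times so the panel stays in its VRR window."""
    if fps >= vrr_min:
        return fps  # already inside the variable refresh range
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier  # each frame is displayed 'multiplier' times

print(effective_refresh(39))  # 39 fps on a 40 Hz minimum panel -> 78 Hz
print(effective_refresh(25))  # -> 50 Hz
```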

Like the fella above me said, the main plus is adaptive overdrive, which AFAIK only the Nixeus EDG27 does with Freesync, and good luck finding one of those this side of the pond.
 
I've been looking into the Samsung 34" CF791 ultrawide 100 Hz FreeSync monitor and wondering whether it'd work with my GTX 1070 Ti. The information out there is a mixed bag, but there seems to be plenty of evidence it won't work without some serious issues/limitations.

It therefore looks rather risky going FreeSync with an NVidia card at the moment, depending on the monitor. On the other hand, if NVidia continue the pricing trend seen with the RTX 2xxx series, it'd be risky getting stuck on GSync.

I guess NVidia will have to play ball eventually and iron out issues with FreeSync. I assume the point of them opening up to the standard was to ensure they don't get left out of the FreeSync TV ecosystem. But then again, maybe they'll roll out full compatibility with later generations of cards/for FreeSync 2. The fact they haven't rolled out their current FreeSync compatibility to anything before the 1xxx series, even though cards as far back as the GTX 600s support GSync, suggests to me this might be the case.
 
When any half-decent VRR/Freesync implementation drops below its 40-60 Hz minimum, it simply doubles the frame, effectively producing the same result.

So if its minimum is 40 Hz, it'll refresh at 78 Hz if you go down to 39 fps.

FreeSync currently uses a fairly crude implementation based on panel self-refresh to do low framerate compensation; it is inferior to the system G-Sync uses (which, as above, also utilises adaptive overdrive) for maintaining as much clarity and responsiveness as possible at lower framerates. Whether it is worth it will depend a lot from person to person.

G-Sync isn't without its negatives, as you can get strange judder and/or screendoor inversion artefacts in brightly coloured content in motion, but overall it is the better solution if you are dealing with situations where the framerate is dropping, i.e. if you want to play at 4K with everything turned up and there simply isn't the GPU power to stay much above 30 fps.
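
To picture what variable/adaptive overdrive means in practice, here's a toy Python sketch of the idea: the monitor eases off the pixel overdrive as the refresh rate falls, rather than using one fixed setting tuned for max refresh. The thresholds and labels are entirely made up for illustration, not how any panel firmware actually works:

```python
def pick_overdrive(refresh_hz):
    """Illustrative only: scale pixel overdrive with the live refresh rate.

    A fixed overdrive tuned for 144 Hz tends to overshoot (inverse ghosting)
    once a VRR panel drops to 50-60 Hz; adaptive overdrive backs off as the
    frame time gets longer. The thresholds/labels below are invented examples.
    """
    if refresh_hz >= 120:
        return "strong"   # short frame time, drive pixels hard
    if refresh_hz >= 75:
        return "medium"
    return "weak"         # long frame time, less drive needed

for hz in (144, 100, 60, 40):
    print(hz, pick_overdrive(hz))
```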
 
It just works

To quote you for the second time, I’ve now bought a gsync monitor on the basis that, for the type of monitor I want, gsync does indeed appear to be the only option that just works.

Now I’m locked in to Nvidia for the foreseeable but I didn’t want to be without working adaptive frame technology.
 
To quote you for the second time, I’ve now bought a gsync monitor on the basis that, for the type of monitor I want, gsync does indeed appear to be the only option that just works.

Now I’m locked in to Nvidia for the foreseeable but I didn’t want to be without working adaptive frame technology.
I was just quoting something Jensen said that everyone made fun of the other month.
 
It will be interesting to see how future Freesync 2 monitors compare... I get the sense some manufacturers will be skipping G-Sync altogether on more premium models given the significant extra cost. The upcoming XG438Q, for example... wouldn't be surprised if that ends up as Freesync only.
 