
AMD VEGA confirmed for 2017 H1

Status
Not open for further replies.
Sorry for the off-topic, but Rock, Paper, Shotgun did a revisit of G-Sync vs FreeSync monitors in anticipation of FreeSync 2:

https://www.rockpapershotgun.com/2017/02/02/freesync-vs-g-sync-revisited-freesync-2-is-coming/

Interesting read!


Having used both, on the Asus PG278Q and the MG278Q, I must say I found them to be identical for all my use cases and gaming.

The only advantages G-Sync had for me were SLI support and borderless windowed support, but even AMD have the latter now. I don't think they support FreeSync with CrossFire yet, though.
 
Thanks; I haven't really used CrossFire since the 4870X2.

So there's really no difference between the two then. There certainly wasn't any when I had friends try both either.
There is a difference: one is free, the other costs a three-figure sum :p
 
Yeah, CrossFire and FreeSync have worked together since June 2015. And I will agree with @nashathedog about some companies' (LG & Samsung) FreeSync implementations.
And if someone doesn't use the RTSS overlay, it works in borderless windowed mode as well.
 
I played Battlefront on a friend's 144Hz 1080p (might have been 1440p) FreeSync monitor and it ran buttery smooth with no tearing. I would have been hard-pushed to tell the difference between my system and his (Fury) if they were hidden and I had no idea which screen was which (assuming his was 1440p). From my very quick play, I couldn't find a fault with FreeSync.
 
I think the big difference between G-Sync and FreeSync at this point is functional range. Most FreeSync monitors don't apply the VRR tech below 48fps, which definitely limits their usefulness. I wouldn't buy one that didn't work down to at least 40fps; 20-30fps would be ideal, and I know the tech is technically capable of it.

G-Sync monitors, I think, all work down to 30fps.
 

Any FreeSync monitor comparable to a G-Sync one from the same manufacturer usually runs at similar ranges.

My Asus MG278Q has a 30-144Hz range while being cheaper than the G-Sync PG278Q. It's really only the truly budget monitors that are limited in range, and those usually have no G-Sync alternative at all unless you pay significantly more, even more than the difference between the MG278Q and PG278Q.

Not even G-Sync goes down to 20fps, though; the lowest I've seen and experienced is 30fps.
 
This. Hopefully FreeSync 2 monitors will address the limited working ranges.
 
I may have the wrong monitor to compare with, but so far G-Sync has seemed superior to me (I had an X34A before). I suppose being a hardware solution makes things easier on the driver side (The Division lacked FreeSync support for a fair few months, and it was only recently fixed).

Unfortunately you're right. The LG monitor you have only has a 52 to 75Hz working range, which as well as being only a small window also means no LFC. That's one of the main reasons I didn't buy the 38" and instead bought the FreeSync version of the Acer X34 you owned before the 38" LG: the FreeSync XR34 has a working range of 30 to 75Hz, with LFC support for when it goes under 30.

Given the higher resolution of the model you have and the high FreeSync starting point of 52Hz, it's unlikely you can keep it inside the working range when gaming. It's another example of LG using FreeSync as a selling point on a premium monitor while offering only a very poor implementation of it, and a prime example of why AMD have had to step in and take control of the FreeSync 2 name. Big monitor companies such as LG are giving FreeSync a bad name simply because they have not supported it in line with how it should be offered on premium products.
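As a rough sketch of why the 52-75Hz range rules out LFC while 30-75Hz allows it: AMD's LFC needs the top of the VRR range to be at least about double the bottom, so frames can be repeated when the frame rate drops below the floor. (The check below is a simplification using the ranges mentioned above, not AMD's actual driver code.)

```python
def supports_lfc(vrr_min_hz: float, vrr_max_hz: float) -> bool:
    # LFC needs room to double frames: the panel's maximum refresh
    # must be at least ~2x its minimum, otherwise a doubled frame
    # rate would overshoot the top of the range.
    return vrr_max_hz >= 2 * vrr_min_hz

print(supports_lfc(52, 75))   # LG's 52-75Hz window -> False, no LFC
print(supports_lfc(30, 75))   # Acer XR34's 30-75Hz -> True
```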
 
G-Sync monitors do work down to 30fps, and below that they double or triple up frames to reduce the damage such low frame rates do to gameplay. AMD's FreeSync did not have LFC support at release; the idea for the way LFC works was taken from how G-Sync deals with sub-30fps dips. It's a must-have feature on any FreeSync monitor: without LFC you are not getting the best FreeSync experience.
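The doubling and tripling described above can be sketched roughly like this (a simplified illustration, not the actual driver logic; the 30-144Hz range is just an example):

```python
def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Repeat each frame 2x, 3x, ... until the effective refresh
    rate lands back inside the panel's supported VRR range."""
    multiplier = 1
    while fps * multiplier < vrr_min and fps * (multiplier + 1) <= vrr_max:
        multiplier += 1
    return fps * multiplier

# 25fps on a 30-144Hz panel: frames are doubled, panel refreshes at 50Hz
print(lfc_refresh(25, 30, 144))
# 12fps: tripling brings the panel to 36Hz, back inside the range
print(lfc_refresh(12, 30, 144))
```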
 

I think that's exactly the major point of FreeSync 2: specifying the frequency range and LFC parameters, having AMD certify that it all operates properly, and then putting the 'FreeSync 2' sticker on the monitor so customers know they're getting a decent implementation of the technology.

That's the reason the 'certification' part suddenly came into play.
 
That April Fools was terrible; at least make it look convincing. Double Nvidia's performance? Just lol, I spotted that was a load of crap from a country mile away. They should have said it was twice as slow, then it would have been believed :p
 
Exactly right, that's the whole point of the certification.
 