LG 34GK950G, 3440x1440, G-Sync, 120Hz

I feel like it's already been too long of a wait, and with each week that passes the X34p and AW34 prices become more and more appealing.
 
So the Freesync version will be approx. 200 cheaper and will run 144Hz natively vs. the 120Hz overclock on the Gsync version. To be honest, even with an Nvidia card the Freesync version looks like the better option there.
 
I think the freesync version is DP 1.4 (144hz) and the Gsync version is DP 1.2 (120hz OC) due to the Gsync module?

Nvidia should really release a DP 1.4 Gsync module and phase out the DP 1.2 version; it is now out of date and is actually leaving Gsync at a lower refresh rate than Freesync.

I know they already have the HDR1000 DP 1.4 module, but they should also have a normal DP 1.4 and phase out the DP 1.2.
 
I think the freesync version is DP 1.4 (144hz) and the Gsync version is DP 1.2 (120hz OC) due to the Gsync module?

Nvidia should really release a DP 1.4 Gsync module and phase out the DP 1.2 version; it is now out of date and is actually making Gsync worse than Freesync.

I know they already have the HDR1000 DP 1.4 module, but they should also have a normal DP 1.4 and phase out the DP 1.2.

I know I'm starting to sound like a broken record, but what Nvidia "should" do is just dump that stupid proprietary module and go with the Adaptive-Sync standard instead. It sucks to pay 25% of a monitor's cost for some silly module when that money could have gone into better internals, more features, or, god forbid, more than two connectivity options.
 
I know I'm starting to sound like a broken record, but what Nvidia "should" do is just dump that stupid proprietary module and go with the Adaptive-Sync standard instead. It sucks to pay 25% of a monitor's cost for some silly module when that money could have gone into better internals, more features, or, god forbid, more than two connectivity options.

Yes, well, obviously they should do that. But assuming they want to continue with Gsync, they could at least offer a DP 1.4 update of the current DP 1.2 version, one that is not this massive 500+ HDR1000 option they have now, which is pushing monitors past 2000, and for most people that is just not acceptable!
 
I'll have to disagree with much of the above. From initial release until today, G-Sync, thanks to the G-Sync module, has provided superior functionality and performance compared to Freesync.

At release, Freesync monitors couldn't even enable overdrive and Freesync simultaneously, making the monitors virtually useless as high performance gaming displays with Freesync enabled. Then, when Freesync monitors finally could use overdrive and Freesync simultaneously, the overdrive functionality only worked optimally near the max refresh rate. This is still the case over halfway through 2018! That's a stark contrast compared to G-Sync monitors which have had adaptive overdrive functionality from inception.

G-Sync also provides overclocked refresh rates, extended VRR ranges, superior performance at the VRR range boundaries, etc.

I'll gladly pay the "G-Sync tax" for the premium experience. It's unfortunate the 950G incorporates the first gen G-Sync module, but no doubt we'll see the updated module in future ultrawides.

Maybe we'll see some Freesync 2 monitors finally matching G-Sync's capabilities, but I'm not holding my breath for anything soon. From what I understand, it's up to the monitor manufacturers to individually implement the various features available.
 
I think the freesync version is DP 1.4 (144hz) and the Gsync version is DP 1.2 (120hz OC) due to the Gsync module?

Nvidia should really release a DP 1.4 Gsync module and phase out the DP 1.2 version; it is now out of date and is actually making Gsync worse than Freesync.

I know they already have the HDR1000 DP 1.4 module, but they should also have a normal DP 1.4 and phase out the DP 1.2.

This is what we expected, but recent specs at online stores are showing DP 1.2 for the Freesync model. I think it is all a bit of a mess, with different messages coming from different parts of LG.

@Daniel - LG can you please provide an update and clear up some of this mess? Why is the LG website saying that the 950G is using a 100Hz panel, and other sites saying the 950G and 950F are using DP1.2?
 
I'll have to disagree with much of the above. From initial release until today, G-Sync, thanks to the G-Sync module, has provided superior functionality and performance compared to Freesync.

At release, Freesync monitors couldn't even enable overdrive and Freesync simultaneously, making the monitors virtually useless as high performance gaming displays with Freesync enabled. Then, when Freesync monitors finally could use overdrive and Freesync simultaneously, the overdrive functionality only worked optimally near the max refresh rate. This is still the case over halfway through 2018! That's a stark contrast compared to G-Sync monitors which have had adaptive overdrive functionality from inception.

G-Sync also provides overclocked refresh rates, extended VRR ranges, superior performance at the VRR range boundaries, etc.

I'll gladly pay the "G-Sync tax" for the premium experience. It's unfortunate the 950G incorporates the first gen G-Sync module, but no doubt we'll see the updated module in future ultrawides.

Maybe we'll see some Freesync 2 monitors finally matching G-Sync's capabilities, but I'm not holding my breath for anything soon. From what I understand, it's up to the monitor manufacturers to individually implement the various features available.

Gsync is better, but my point was that they need to phase out the old DP 1.2 module and replace it with a similarly priced DP 1.4 module. Then they would have, for example, a DP 1.4 HDR400 option alongside the DP 1.4 HDR1000 they already have, instead of a now-outdated DP 1.2 and then a massive jump to the HDR1000 DP 1.4.
 
I'll have to disagree with much of the above. From initial release until today, G-Sync, thanks to the G-Sync module, has provided superior functionality and performance compared to Freesync.

At release, Freesync monitors couldn't even enable overdrive and Freesync simultaneously, making the monitors virtually useless as high performance gaming displays with Freesync enabled. Then, when Freesync monitors finally could use overdrive and Freesync simultaneously, the overdrive functionality only worked optimally near the max refresh rate. This is still the case over halfway through 2018! That's a stark contrast compared to G-Sync monitors which have had adaptive overdrive functionality from inception.

G-Sync also provides overclocked refresh rates, extended VRR ranges, superior performance at the VRR range boundaries, etc.

I'll gladly pay the "G-Sync tax" for the premium experience. It's unfortunate the 950G incorporates the first gen G-Sync module, but no doubt we'll see the updated module in future ultrawides.

Maybe we'll see some Freesync 2 monitors finally matching G-Sync's capabilities, but I'm not holding my breath for anything soon. From what I understand, it's up to the monitor manufacturers to individually implement the various features available.

It's perfectly fine if you prefer to pay the tax, cool, but please don't justify it with incorrect information. Gsync is not superior technology in its current offerings, not by a long shot; Adaptive-Sync and Gsync can do the same thing on paper, and features can be added to an Adaptive-Sync implementation because it is not locked down.

The only thing I will agree with you on is that the first wave of Freesync panels wasn't anything impressive, and some had a few nasty issues, but that is an issue with monitor manufacturers rushing a product, not with the standard itself. Don't forget all the crappy Gsync panels that hit the market with issues as well; one example would be the ton of early Acer VA panels. But I suppose we aren't blaming Gsync for the crappy implementation, that would be unfair, right? And then there is overshoot and crosstalk, which a lot of Gsync monitors have a ton of in ULMB, including some of the most highly praised Gsync panels on the market like the Dell S27 and S24.

So if you want to pay 20-25% more for a screen that is not technically superior in any way, be my guest. It even makes a bit of sense if you are using an Nvidia card, which you most likely are. But again, don't think it's flawless just because Jensen told you so, or because the wife is breathing down your neck wanting an explanation for the latest expenditures :P
 
It mandates LFC and requires that the monitor add HDR support, that's all, even if that support is complete rubbish. It doesn't give an experience anywhere near as polished as G-SYNC.

Could you please explain what you mean by "It doesn't give an experience anywhere near as polished as G-SYNC"? Because at first glance that statement looks a bit like rubbish to me, but considering what you do for a living, perhaps you would be so kind as to indulge me/us with your perspective on this claim.

1) A few questions for you while we are at it. Does Freesync 2 not technically allow for the same kind of specifications to be met in a monitor as the latest and greatest HDR Gsync screens?
2) Presuming you answer yes to this, is it then not down to manufacturers to decide how they go about said implementation?
3) And lastly, presuming you said yes to the previous questions, isn't it then incorrect to say that a Freesync 2 certified panel cannot offer, due to tech limitations, the same experience as a Gsync HDR monitor, even if there isn't a Freesync monitor released right now with the same specs? In the end, should it not be about monitor vs. monitor and not Freesync vs. Gsync?
 
@Stu

The guy is correct, it drops to 4:2:2, which is not great for computer games as it crunches the colours.
Why? Simple maths. DP 1.2 has a maximum data rate of 17.28 Gbit/s.

3440x1440 100Hz 8-bit comes to 14.86 Gbit/s
3440x1440 120Hz 8-bit comes to 17.82 Gbit/s
3440x1440 144Hz 8-bit comes to 21.39 Gbit/s

DP 1.4, on the other hand, offers 25.92 Gbit/s, enough data to push even higher, to 165Hz, without issue.

I was just checking your maths, and I thought you'd do:

3440x1440 x (8x3) x 100Hz for the bandwidth. If you do, you get roughly 11.89 Gbit/s.

Even at 144Hz you get 17.12 Gbit/s, which would still fit within DP 1.2's 17.28 Gbit/s.
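For what it's worth, both sets of numbers are right for what they measure: the lower figures count only active pixels, while the higher ones include blanking overhead. A quick sketch reconciling them (the flat 25% blanking factor here is my assumption, picked to reproduce the quoted 14.86 Gbit/s figure, not an official CVT timing; the last line shows why 4:2:2 lets a DP 1.2 link reach higher refresh rates):

```python
DP12_DATA_GBPS = 17.28   # DP 1.2 (HBR2) usable data rate after 8b/10b encoding
DP14_DATA_GBPS = 25.92   # DP 1.4 (HBR3) usable data rate after 8b/10b encoding

def gbps(width, height, hz, bits_per_pixel, blanking=0.0):
    """Required video data rate in Gbit/s; blanking is fractional overhead."""
    return width * height * bits_per_pixel * hz * (1 + blanking) / 1e9

for hz in (100, 120, 144):
    active = gbps(3440, 1440, hz, 24)        # 8-bit RGB, active pixels only
    total = gbps(3440, 1440, hz, 24, 0.25)   # assumed ~25% blanking overhead
    print(f"{hz}Hz: {active:.2f} active-only, {total:.2f} with blanking, "
          f"fits DP 1.2: {total <= DP12_DATA_GBPS}")

# 4:2:2 chroma subsampling averages 16 bits per pixel instead of 24,
# which is how a DP 1.2 link can squeeze in higher refresh rates.
print(f"144Hz 4:2:2 with blanking: {gbps(3440, 1440, 144, 16, 0.25):.2f} Gbit/s")
```

With these assumptions, 120Hz 8-bit RGB lands just over the DP 1.2 cap, matching the claim that the Gsync model drops to 4:2:2 at its overclocked refresh rate.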
 
Could you please explain what you mean by "It doesn't give an experience anywhere near as polished as G-SYNC"? Because at first glance that statement looks a bit like rubbish to me, but considering what you do for a living, perhaps you would be so kind as to indulge me/us with your perspective on this claim.

1) A few questions for you while we are at it. Does Freesync 2 not technically allow for the same kind of specifications to be met in a monitor as the latest and greatest HDR Gsync screens?
2) Presuming you answer yes to this, is it then not down to manufacturers to decide how they go about said implementation?
3) And lastly, presuming you said yes to the previous questions, isn't it then incorrect to say that a Freesync 2 certified panel cannot offer, due to tech limitations, the same experience as a Gsync HDR monitor, even if there isn't a Freesync monitor released right now with the same specs? In the end, should it not be about monitor vs. monitor and not Freesync vs. Gsync?

I'd suggest reading a few of my reviews and you'll soon get a feel for the differences. I'm not going to indulge in a point-by-point discussion, but I very much agree with what newtoo7 above says. The biggest issues I have with FreeSync relate to the poor pixel overdrive implementation. With Nvidia G-SYNC the board specifically tunes things for a range of refresh rates. With FreeSync monitors things are typically quite well tuned for the highest static refresh rate, but as that decreases you get more obvious overshoot. The pixel responses should loosen off to prevent this; there's no point in having such high levels of overdrive at lower refresh rates, it's undesirable.

I've also seen several examples of G-SYNC variants of monitors being much better tuned at the highest possible refresh rate than FreeSync variants, with the FreeSync variants using insufficient overdrive at the higher refresh rates (and, ironically, too much at lower refresh rates). Some of the 240Hz models and the LG 32GK850G vs. F reinforce this. Just wait for my review coming later today!

With G-SYNC the floor of operation is always 30Hz, whereas for FreeSync models (FreeSync 2 or otherwise) it could be anything really. At least FreeSync 2 mandates LFC, but that doesn't work flawlessly, particularly where the FreeSync floor is high and stuttering at the boundary is obvious. The 32GK850F reinforces this point beautifully.
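The frame-multiplication idea behind LFC can be sketched roughly like this (the numbers and function shape are purely illustrative, not any vendor's actual algorithm):

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Return (frame multiplier, effective refresh rate) for a frame rate.

    Illustrative only: real LFC implementations smooth the transition,
    which is exactly where boundary stutter can creep in.
    """
    if fps >= vrr_min:
        return 1, fps                  # inside the VRR window, no LFC needed
    mult = 2
    while fps * mult < vrr_min:        # repeat each frame until back in range
        mult += 1
    return mult, fps * mult

# A low floor with a wide range (30-120Hz) multiplies cleanly; a high floor
# with a narrow range (55-75Hz) cannot, since doubling a sub-floor frame
# rate overshoots the maximum refresh. LFC needs vrr_max >= 2 * vrr_min.
print(lfc_refresh(25, 30, 120))   # frame-doubling keeps us in range
print(lfc_refresh(40, 55, 75))    # doubling lands above vrr_max: trouble
```

This is why a high FreeSync floor is a problem: there is no integer multiplier that maps a just-below-floor frame rate back inside a narrow VRR window.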

So you see, I am indeed very experienced with both technologies. And the more I use both, the more I agree that G-SYNC is the more polished of the two. There are some good FreeSync models out there, don't get me wrong, and I recommend some of them. But to think G-SYNC is a pointless additional expense is wrong. Nvidia are far more involved with tuning things and the results speak for themselves. AMD just leaves the monitor manufacturers to do what they want, and more often than not that's a bad thing.

Whether this level of careful pixel overdrive tuning could be achieved without G-SYNC is debatable, because I've yet to see it. In an ideal world there would be no G-SYNC and the monitor manufacturers would be really careful with their pixel overdrive tuning, assessing and re-tuning it over a broad range of refresh rates. That alone may well require specialist hardware like a G-SYNC board, I'm not sure. But the proof of the pudding is in the eating. I prefer to deal with what is out there in the real world vs. theory.

P.S. You don't know exactly what I do for a living, there's a lot more to my life than monitors. Although I can tell you I have no affiliation with either Nvidia or AMD.
 
Some of the 240Hz models and the LG 32GK950G vs. F reinforces this. Just wait for my review coming later today!

(I assume you mean the 34GK950G)

Oooh, you are teasing us! This comes as somewhat of a surprise, since there has been no mention previously that review units had been sent out yet.

I'm not sure why the cloak & dagger... I think LG have already lost sales here because the availability of the 950G and 950F appears to be an unknown, which I think contributed to many purchasing the Alienware when it was on sale recently. Better communication, like "coming very soon, we have sent out review units already, we're nearly there" would have influenced some people.
 