LG 34GK950G, 3440x1440, G-Sync, 120Hz

Let's all give @Daniel - LG a break with this. I know you're all keen to see a review of this monitor, but he can only do so much himself to make this happen. The wheels are already in motion and things will go as quickly as they can, but hounding him on this thread won't speed anything up.

Nope, Daniel needs to roll his sleeves up, get down to the LG factory and start helping to assemble these monitors pronto... then jump in his car and personally deliver one each to all of us! It's the least he can do really.
 
@PCM2 in your article, you state the 'F' version features 10-bit colour reproduction (8-bit + FRC), vs true 8-bit without dithering on the 'G'. What real-world difference does this make if you were looking at the two monitors side by side? Same question regarding the VESA DisplayHDR 400 certification... given the 'F' has it and the 'G' doesn't, what difference would you notice when viewing HDR content? Your thoughts would be welcome.
 
The content itself needs to support the bit depth, otherwise you don't gain any noticeable advantage. So that rules out SDR games and most images you'll see on the internet, etc. It's just a nice box for the monitor to tick for HDR, as you noted. Even then the advantages are questionable, but I'll try not to over-complicate things. ;)
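If it helps to picture what FRC is actually doing, here's a rough sketch (illustrative Python only; real panel electronics are rather more involved than this):

```python
# Rough illustration of 8-bit vs 8-bit + FRC, not a model of real panel hardware.
# An 8-bit panel has 256 discrete levels per channel; 10-bit has 1024, i.e. four
# sub-steps between each pair of 8-bit levels.

target = 127.25  # an 'in-between' shade that a 10-bit signal can address directly

# True 8-bit with no dithering: the nearest level, on every frame.
static = round(target)
print(f"True 8-bit: level {static}/255 on every frame")

# 8-bit + FRC (temporal dithering): rapidly alternate the two neighbouring
# levels so the time-average lands on the target. A simple 4-frame pattern:
lo, hi = int(target), int(target) + 1
frames_hi = round((target - lo) * 4)          # how many of the 4 frames show 'hi'
pattern = [hi] * frames_hi + [lo] * (4 - frames_hi)
print(f"8-bit + FRC: frames {pattern}, averaging {sum(pattern)/4}/255")
```

So the panel never displays more than 256 levels at any instant; it just flickers between adjacent ones fast enough that your eye averages them out.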
 
That being the case, if content does support the bit depth requirement, what difference would you notice between the two? Generally speaking, is 8-bit + FRC vs 8-bit without dithering going to make much difference for most people day to day?
 
It can help with image editing to smooth out gradients and that sort of thing, but outside of that the difference is negligible. In fact, for HDR, it seems the GPU can 'fill in the gaps' just fine if a monitor only supports an 8-bit signal. I saw that with the ASUS PG27UQ I recently reviewed - check out the HDR section for more thoughts on that.
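To make the 'smoothing out gradients' point concrete, a quick numpy sketch. GPU dithering is more sophisticated than simply adding noise, but the principle is the same: trade wide flat bands for fine grain the eye doesn't notice:

```python
# Demo of why dithering smooths gradients: quantising a subtle ramp straight
# to 8-bit collapses it into a few wide flat steps (visible banding); adding
# a little noise before quantising blends those steps together instead.
import numpy as np

rng = np.random.default_rng(0)
gradient = np.linspace(0.30, 0.35, 1920)   # subtle ramp across a 1920px width

plain = np.round(gradient * 255).astype(np.uint8)
dithered = np.round(gradient * 255 + rng.uniform(-0.5, 0.5, gradient.size)).astype(np.uint8)

def run_lengths(x):
    # widths of consecutive same-value runs, i.e. how wide each flat band is
    change = np.flatnonzero(np.diff(x)) + 1
    return np.diff(np.concatenate(([0], change, [x.size])))

print("mean flat-band width, plain:   ", run_lengths(plain).mean())     # wide bands
print("mean flat-band width, dithered:", run_lengths(dithered).mean())  # fine grain
```

The plain version lands on only a dozen or so distinct levels, each spanning well over a hundred pixels, which is exactly what banding looks like; the dithered version breaks those bands up into pixel-scale grain.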
 
@PCM2
Thank you for this article! While I don't understand what "120Hz via 100Hz" means, it is really good to finally have someone confirm that "overclocked" applies to the G-SYNC module rather than the panel (Tom's Hardware is still getting it wrong).

However, in contrast to other (obviously contradictory) specs in the spec sheet, this uniquely misleading usage of terminology is clearly deliberate.

But why? Do you have any insights? Will we, going forward, have to "thank" LG for making an already vague term (monitor overclocking) entirely meaningless? Will we have to wait for reviewers to explain to us in which convoluted way the term "overclocked" is being used for each individual monitor model?

This stuff bugs me, as it has caused a ton of confusion all over the internet, with nothing to show for it. For consumers, there is nothing useful to be gleaned from the knowledge that the electronics in their monitor are overclocked. That's entirely different for a panel overclock, where that knowledge is useful for judging how well a monitor can do what it's intended to do.
 
Indeed... I'd also like to know the technical explanation for a 144Hz native panel that is 'overclocked' to 120Hz... that just doesn't appear to make sense, so it's understandable this has caused all kinds of confusion. Has the panel been 'underclocked' to 100Hz, then 'overclocked' to 120Hz by the G-Sync module? Technically speaking, surely the term 'overclock' doesn't actually apply in this specific instance, which means LG are using it purely for marketing purposes?

You can set a monitor to any refresh rate in the Nvidia control panel for example, so is it something along those lines, only the G-Sync module is doing it? (There's a rough sketch at the end of this post for listing the modes the driver actually exposes.)

I'm also curious if this process is actually to the detriment of anything and if there are any drawbacks to achieving 120Hz this way on a native 144Hz panel? If the panel is rated for a higher refresh rate you'd think it would all be fine. It certainly must be preferable to trying to push a 100Hz native panel above and beyond what it was designed to do.
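As an aside, here's that sketch: a Windows-only snippet using documented Win32 calls to list the display modes the driver exposes for the primary monitor, which is roughly what the refresh-rate list in the Nvidia control panel is built from. On the 34GK950G I'd expect 3440x1440 @ 120Hz to appear here once the OSD overclock is enabled, though I haven't verified that:

```python
# Windows-only: enumerate the display modes the driver exposes for the
# primary monitor via EnumDisplaySettingsW.
import ctypes
from ctypes import wintypes

class DEVMODEW(ctypes.Structure):
    # Truncated DEVMODEW, defined only up to dmDisplayFrequency; the API
    # respects dmSize, so the trailing ICM fields can be omitted here.
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
    ]

user32 = ctypes.windll.user32
dm = DEVMODEW()
dm.dmSize = ctypes.sizeof(DEVMODEW)

# Mode index 0, 1, 2, ... until the API reports no more modes.
modes, i = set(), 0
while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
    modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
    i += 1

for w, h, hz in sorted(modes):
    print(f"{w} x {h} @ {hz}Hz")
```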
 
This is exactly why I said it was an unconventional way to deliver 120Hz. What you have in this monitor is a 144Hz panel; it's capable of running at 144Hz natively. However, for a monitor with G-SYNC, the capabilities of the display are also bound to the G-SYNC module, and there weren't any G-SYNC modules LG could have used for this monitor that are designed to run 3440 x 1440 @ 144Hz. The only suitable module supports 120Hz via the '100Hz native, overclocked to 120Hz' route.

Don't get too bogged down by the use of the word 'overclocking' here. It's something happening on the G-SYNC module itself. When this module was first used, it was in products with native 100Hz panels, and the panel itself was 'strained' to run at 120Hz via this overclock. In this case the G-SYNC module still only supports 120Hz via the 100Hz + overclock route, but the panel itself isn't 'strained' at all. It's happy to run at 120Hz without issue. That's the difference here; I know it's confusing.
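To put some very rough numbers on why the module tops out where it does: this G-SYNC module works within DisplayPort 1.2 bandwidth limits. A back-of-the-envelope check, assuming reduced-blanking style timings and 8-bit colour (the blanking figures below are assumptions; real timings vary a little, but the picture holds):

```python
# DisplayPort 1.2 payload: 4 lanes x 5.4 Gbit/s (HBR2), minus 8b/10b overhead.
DP12_PAYLOAD_GBPS = 17.28

def needed_gbps(w, h, hz, hblank=80, vblank=60, bpp=24):
    # Bandwidth a mode needs, including assumed blanking intervals.
    pixel_clock = (w + hblank) * (h + vblank) * hz   # pixels per second
    return pixel_clock * bpp / 1e9

for w, h, hz in [(2560, 1440, 144), (3440, 1440, 100),
                 (3440, 1440, 120), (3440, 1440, 144)]:
    need = needed_gbps(w, h, hz)
    fits = "fits" if need <= DP12_PAYLOAD_GBPS else "exceeds"
    print(f"{w}x{h} @ {hz}Hz needs ~{need:.1f} Gbit/s -> {fits} DP 1.2")
```

Under those assumptions, 3440 x 1440 @ 144Hz needs roughly 18 Gbit/s and simply doesn't fit down DP 1.2 at 8-bit, while 120Hz squeezes in at around 15 Gbit/s. It's also why the older 2560 x 1440 modules could manage 144Hz natively.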
 
OK... so the word 'overclocking' is technically erroneous and shouldn't really be used for this monitor, because the panel isn't being overclocked at all. With the panel's native refresh rate now at 144Hz, above what the G-Sync module can achieve, the panel is actually constrained to 120Hz by that same module.

I guess "constrained to 120Hz" probably doesn't have quite the same ring to it and wouldn't sell as many monitors. :D
 
Well the G-SYNC module is technically in an overclocked state, hence 'overclocked'. The panel isn't. Weird, I know.

Very! But we don't interact with the G-Sync module, we interact with the panel, which itself isn't overclocked. Semantics I know lol!
 
@PCM2
I don't think the technical explanation is confusing at all. I already understood what you just mentioned perfectly fine. I suspect most following along here did.

What actually is confusing is LG's use of terminology. Nobody else (as far as I know) "advertises" a monitor as being "overclocked", with that referring to the internal electronics. This caused a lot of confusion (and still does, even for professional reviewers, as evidenced by Tom's Hardware). For all the fuss this causes, it doesn't provide any information that is actually useful to consumers. From a technical perspective it seems pointless.

Why should anyone care about how the internal electronics achieve DP1.2 conformity at 120 Hz? Many 100/120 Hz monitors must already implement this solution, without OEMs explicitly mentioning it or consumers caring. What changed?

Is LG publishing this nonsense because some consumers allegedly and falsely assume a panel overclock is a positive selling point? If that's the case, I'd say even an ignorant consumer deserves to know that they aren't getting what they think is written on the package.

It seems to be deliberately misleading, and IMHO LG deserves to be publicly shamed in response. Most people have enough difficulty understanding the display market as it is. We don't need more confusion.

I'm wondering what your take is on that.
 
It's typical for a manufacturer to state that a refresh rate is "overclocked" if the user needs to use an "overclock" setting in the OSD to be able to use that refresh rate. I'm assuming that's the case here; it is on other models I've tested with this G-SYNC module (such as the AOC AG352UCG6). If the overclock was just done completely passively and was always enabled, that would be different.
 
Forgive my ignorance but...

Why can the G-SYNC module in 2560x1440 monitors run natively at 144Hz, while in this LG we're talking about 100Hz overclocked to 120Hz?
 