LG 34GK950G, 3440x1440, G-Sync, 120Hz

Personally, I think it would be a bigger disaster releasing this with the UW4 panel 9 months after the Acer and Alienware monitors that use the same panel.
 
I assume the G-Sync module will kill any hope of having HDR (even though HDR400 is not likely to be impressive)?

That I can answer for LG.

HDR content is transferred from GPU to monitor over the HDR10 protocol. The HDR10 protocol only runs over DisplayPort 1.4 (HDR10 is not part of the DisplayPort 1.3 spec). The G-SYNC module used in the 34GK950G supports DisplayPort 1.2, hence there is no way to send HDR10 data over it.

Without DisplayPort 1.4, there is no way for the monitor to accept HDR content, so it won't have HDR support. Period.

Of course, you're right that DisplayHDR 400 is anything but impressive. DisplayHDR 400 provides practically nothing of value beyond improved marketability, so the lack of DisplayHDR 400 doesn't remove anything of value either.
 
@Daniel - LG
Thank you for those answers. They are very much appreciated! Thanks also to LG for sponsoring your time here.

I agree with everyone else. This is the opposite of a disaster, for us and for LG, for the reasons already mentioned.

With your last post, the various sources of reliable information on the 34GK950G now no longer contradict each other, which gives me more confidence in our current understanding actually being correct. That is great!
 
Can someone explain to me whether 3440x1440 @ 120Hz is supported by DP 1.2?

I don't see anything definitive: some people say DP 1.2 supports it without issues, others say it does but there could be flickering, and others say it flat out doesn't.

I want flicker-free, issue-free 3440x1440 @ 120Hz with the 950G (assuming the 120Hz is native now).
 
I think the flickering that people have reported is due to panels that are overclocked, either 60 to 100Hz or 100 to 120Hz. Using the UW5 panel should mitigate this issue.
 

The claim that HDR requires DisplayPort 1.4 on the monitor isn't quite correct. It's a bit of a complex one, but HDR requires a DP 1.4 port controller on the GPU; it does not require this on the monitor. You will find some monitors that support HDR via DP 1.2, but it's a slightly modified DP 1.2 monitor port controller with some DP 1.4 features.
 

Damnit! When it comes to tech there is always a caveat somewhere that gets left out ;)

1)
Technically, what you're talking about is no longer really a DP1.2 monitor input. It's something that makes do with the bandwidth limits of DP1.2 but supports the protocols defined in DP1.4, which is something that draws outside the lines of VESA's DP standards.

2)
More importantly, the 34GK950G can't incorporate its own OEM-customized controller. In order to support G-SYNC, it must incorporate NVIDIA's controller, and that adheres strictly to the DP1.2 specification (as every monitor input should). For this reason, at least in this instance, there is no HDR support to be had.

But yes, generally speaking you're unfortunately correct. You seem to be the guy who is keenly aware of all the crap monitor OEMs do :D
 

Yes it's massively confusing. I was very surprised when I started reviewing HDR models which fully supported the technology using their 'DP 1.2' (or DP 1.2a) ports. Sort of wondered whether the ports were mislabelled/incorrectly specified by the manufacturer. But this was something that was repeated on several models from different manufacturers, so it really muddied the water. I agree that G-SYNC displays will probably only support HDR via DP 1.4 and will specifically promote it as 'G-SYNC HDR'.
 
Using this calculator http://k.kramerav.com/support/BWcalculator.asp

It looks like 3440x1440 @ 120Hz and 8-bit color depth has a bandwidth of 17.83 Gb/s. DisplayPort 1.2 is rated at 17.28 Gb/s.

Am I missing something here, or is 120Hz going to be impossible at this resolution using DP 1.2?
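One possible explanation for the apparent shortfall, sketched below: the calculator's 17.83 Gb/s figure appears to include DisplayPort's 8b/10b line coding (10 bits on the wire per 8 payload bits), so it should be compared against DP 1.2's raw link rate of 21.6 Gb/s (4 lanes x 5.4 Gb/s), not against the 17.28 Gb/s figure, which is the payload rate with 8b/10b already removed. This is an illustrative calculation, not a statement about how the Kramer calculator actually works, and it counts active pixels only (real timings add blanking intervals on top).

```python
# Active-pixel bandwidth for 3440x1440 @ 120 Hz, 8 bits per channel, RGB.
# Note: real display timings add horizontal/vertical blanking on top of this.

def active_bandwidth_gbps(width, height, refresh_hz, bpc, channels=3):
    """Payload bits per second for the active pixels only, in Gb/s."""
    return width * height * refresh_hz * channels * bpc / 1e9

payload = active_bandwidth_gbps(3440, 1440, 120, 8)  # ~14.27 Gb/s
on_wire = payload * 10 / 8                           # with 8b/10b coding, ~17.83 Gb/s

# DP 1.2 (HBR2, 4 lanes): 21.6 Gb/s raw link rate, 17.28 Gb/s after 8b/10b.
print(f"payload: {payload:.2f} Gb/s vs 17.28 Gb/s payload limit -> fits: {payload < 17.28}")
print(f"on-wire: {on_wire:.2f} Gb/s vs 21.60 Gb/s raw link rate -> fits: {on_wire < 21.6}")
```

Either way you slice it, the 8-bit mode fits: 14.27 against 17.28, or 17.83 against 21.6. The mismatch only appears if you compare an encoded figure against a decoded limit.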

This is perhaps another example of things not being as clear-cut as the VESA standards would suggest. In fact I'm using a 3440 x 1440 display at 120Hz via DP 1.2 as I type this. No issues with it in the real world.
 
Also, I'm not sure if the calculator is basing this on 10-bit or 8-bit. It says 8 bit in the dropdown, but the calculation below is based on 1.0 x 10, not 0.8 x 10.
 

I've been trying to test this myself, and I was wondering how you verify that the monitor is actually refreshing at 120 Hz?
  • You can set the refresh rate to 120 Hz in Windows, because this is what the monitor reports to the driver as being supported - but that doesn't tell you what rate your monitor is actually refreshing at.
  • You can use an FPS counter which reports how quickly the GPU is generating images - but that doesn't tell you what rate your monitor is actually refreshing at either.

The only simple way I'm aware of to verify this is for the monitor to have a built-in FPS counter (and I don't have such a monitor).

What are your thoughts?

EDIT:
Disregard the text below (when I tested this I used a monitor with a higher resolution, which I misremembered as being 3440x1440; sorry folks):

ChrisPyzut mentions that some people outright deny it's supported, and at least based on the math, I must count myself amongst those.
 

The formula I have seen is resolution x refresh rate x 3 x color depth = bandwidth. So it would be 3440 x 1440 x 120 x 3 x 8 = 14.27 Gb/s without any overhead.

This is well below the 17.28 Gb/s that VESA states. However, is that formula correct?
 

No, you're right. I had the monitor I was testing on confused for one using 3440 x 1440, when it actually had a higher resolution that did come into conflict with DP1.2. Based on the math, there is no reason it shouldn't work.

That formula is correct.

What I can add is that DisplayPort doesn't explicitly support any resolutions at all. A display device can use whatever resolutions and refresh rates it wants. As long as the GPU can deliver that resolution, and it is within the bandwidth limits, DP doesn't care what is being sent across it.
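That point can be sketched as a simple fit check (the 17.28 Gb/s DP 1.2 figure is from the thread; the helper name, the DP 1.3/1.4 payload rates, and the blanking-free simplification are mine, so treat this as optimistic back-of-the-envelope math, not a timing calculator):

```python
# Hypothetical helper: does a mode's active-pixel payload fit a DP link's
# payload rate? Ignores blanking and transport overhead, so it is optimistic.

DP_PAYLOAD_GBPS = {"1.2": 17.28, "1.3": 25.92, "1.4": 25.92}  # after line coding

def mode_fits(width, height, refresh_hz, bpc, dp_version="1.2"):
    payload_gbps = width * height * refresh_hz * 3 * bpc / 1e9  # RGB
    return payload_gbps <= DP_PAYLOAD_GBPS[dp_version]

print(mode_fits(3440, 1440, 120, 8))   # 8-bit SDR, ~14.27 Gb/s: True
print(mode_fits(3440, 1440, 120, 10))  # 10-bit, ~17.83 Gb/s: False on DP 1.2
```

Notably, the same resolution and refresh rate at 10 bits per channel overshoots the DP 1.2 payload rate, which lines up with the thread's point that the HDR story and the 120Hz story are separate questions.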
 