LG 34GK950G, 3440x1440, G-Sync, 120Hz

Does anyone know if the 950F is going to be HDR400 or 600?
HDR 400.
HDR 600 requires a 10-bit panel.

PS: The G version should be DisplayHDR 400 as well, because it has the same specifications required for HDR 400 as the F version.
 
Does anyone know if the 950F is going to be HDR400 or 600?

DisplayHDR 400 only. The panel maxes out at 550 nits, whereas 600 is needed for DisplayHDR 600. Really dumb. It's 8-bit + 2-bit FRC (effectively 10-bit), which is enough bit depth for HDR 600. I just don't understand why they didn't add 50 more nits, since the 5K ultrawide has that.
 
If it's not 120Hz native, I'm not even going to bother with this. Alienware has a better design and MUCH, MUCH better customer service in the US. LG only offers a 1-year warranty, which is worse than trash. Honestly, even if it is 120Hz native, the Alienware might be the better buy for us US folk.
 
Of course, the spec sheet states DP1.2. I assume that's wrong for 144Hz, and Daniel said 1.4, yes?

Did he say that? I don't recall. It won't do 144Hz, only 120Hz. I believe 120Hz is JUST within the capability of DP1.2, though I have seen some people dispute this. It could be that the G-Sync module makes this easier somehow. If the panel were running at 144Hz it would certainly need DP1.4, but it isn't, even though that may be its native speed. You can still have a DP1.2 port on a 144Hz panel regardless; it just wouldn't achieve that speed. For cost reasons, I'd expect it to be 1.2.
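
If it helps, here's a rough back-of-the-envelope check. This is only a sketch: the blanking totals are assumed CVT-R2-style values, not confirmed timings for this panel.

Python:
# Rough bandwidth check for 3440x1440 over DP 1.2 (4 lanes, HBR2).
# The blanking totals below are assumed reduced-blanking values, not the
# panel's actual timings, so treat the results as approximate.

H_TOTAL = 3440 + 80       # assumed horizontal total (active + blanking)
V_TOTAL = 1440 + 50       # assumed vertical total
BPP = 24                  # 8-bit RGB
DP12_EFFECTIVE = 17.28e9  # bit/s usable on DP 1.2 after 8b/10b coding

for hz in (120, 144):
    needed = H_TOTAL * V_TOTAL * hz * BPP
    verdict = "fits" if needed <= DP12_EFFECTIVE else "exceeds"
    print(f"{hz} Hz: {needed / 1e9:.2f} Gbit/s ({verdict} DP 1.2)")

With those assumed timings, 120Hz needs roughly 15 Gbit/s (within DP1.2's ~17.3 Gbit/s) and 144Hz roughly 18 Gbit/s (beyond it), which lines up with 120Hz being just within reach of DP1.2.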
 
DisplayHDR 400 only. The panel maxes out at 550 nits, whereas 600 is needed for DisplayHDR 600. Really dumb. It's 8-bit + 2-bit FRC (effectively 10-bit), which is enough bit depth for HDR 600. I just don't understand why they didn't add 50 more nits, since the 5K ultrawide has that.
An IPS panel always leaks light, which is often called IPS glow. That an IPS panel can get to 550 nits without insane IPS glow and/or BLB is already really good (assuming contrast is still above 1000:1).
Maybe IPS panels will improve contrast in the future, but for now, any gaming monitor much above 550 nits will require a VA panel (or a willingness to live with really crappy contrast).

In a nutshell, it's a tradeoff between max. brightness and contrast. LG had to balance the two.

LG could have used FALD (full-array local dimming), which gives you good contrast and very high brightness, but at a much higher cost.

ADDED:
LG could also have used a different panel that sacrifices response time for better light-blocking ability (like their 5K monitor), but that couldn't be sold as a gaming monitor.

Pretty much all engineering problems boil down to this sort of thing.

:)
 
PS: The G version should be DisplayHDR 400 as well, because it has the same specifications required for HDR 400 as the F version.
*shakes head* No!
For HDR, a monitor must be able to process the HDR10 protocol. The DP1.2 G-SYNC module, as used by the G version, doesn't understand the HDR10 protocol, so it will NOT do HDR of any kind.

HDR10 is defined as part of the DP1.4 spec, so the monitor must use nVidia's newer DP1.4 G-SYNC HDR module if it wants to support HDR.
 
*shakes head* No!
For HDR, a monitor must be able to process the HDR10 protocol. The DP1.2 G-SYNC module, as used by the G version, doesn't understand the HDR10 protocol, so it will NOT do HDR of any kind.

HDR10 is defined as part of the DP1.4 spec, so the monitor must use nVidia's newer DP1.4 G-SYNC HDR module if it wants to support HDR.

My bad. I read the DisplayHDR 400 specs...

Significant step up from SDR baseline:

  • True 8-bit image quality – on par with top 15% of PC displays today
  • Global dimming – improves dynamic contrast ratio
  • Peak luminance of 400 cd/m2 – up to 50% higher than typical SDR
  • Minimum requirements for color gamut and contrast exceed SDR
...and I thought the G version had the same, but I missed one:

All tiers require support of the industry standard HDR-10 format.

Sorry.
 
An IPS panel always leaks light, which is often called IPS glow. That an IPS panel can get to 550 nits without insane IPS glow and/or BLB is already really good (assuming contrast is still above 1000:1).
Maybe IPS panels will improve contrast in the future, but for now anything much above 550 nits will require a VA panel (or a willingness to live with really crappy contrast).

In a nutshell, it's a tradeoff between max. brightness and contrast, and LG had to balance the two without resorting to FALD, which gives you the best of both but at a much higher cost.

Now you understand

The LG 34WK95U is DisplayHDR 600 with an IPS panel (LM340RW1) and a peak brightness of 750 cd/m2. We are even talking same-generation panels here. So there's no excuse.
 
The LG 34WK95U is DisplayHDR 600 with an IPS panel (LM340RW1) and a peak brightness of 750 cd/m2. We are even talking same-generation panels here. So there's no excuse.
I just added some more info to my last post before seeing yours. See above. There is an excuse, unfortunately. Better panel technology will maybe give us better options in the future, but the sad fact is that TFT tech is rather crappy overall.
 
DisplayHDR 400 specs... Significant step up from SDR baseline:

  • True 8-bit image quality – on par with top 15% of PC displays today
  • Global dimming – improves dynamic contrast ratio
  • Peak luminance of 400 cd/m2 – up to 50% higher than typical SDR
  • Minimum requirements for color gamut and contrast exceed SDR

Whether this is a significant step up depends on what you compare with. In my view DisplayHDR 400 is mostly marketing, which VESA wrote into the standard to appease OEMs wanting to slap a DisplayHDR badge on their low-cost offerings, without having to provide much of an improvement over a good SDR monitor.

A lot of mid-range SDR monitors already match or exceed those specs. For example, in terms of contrast and color gamut, there are plenty of QLED SDR monitors on the market which exceed what DisplayHDR 400 mandates. You therefore can't say "the minimum requirements for color gamut and contrast exceed SDR" (SDR isn't even standardized in that way).

Peak luminance is the only exception, but a good mid-range monitor will also typically reach well beyond 300 nits. Technically, in comparison to current SDR monitors, DisplayHDR 400 improves on peak luminance. However, humans don't perceive brightness linearly, so to get what we'd call twice the brightness we need roughly four times the amount of light. Considering that, an extra 80 nits (or so) isn't that big of a deal. In this light (pun intended), fretting over the fact that the 34GK950F lacks the 50 nits to achieve DisplayHDR 600 doesn't make that much sense. For the 34GK950F/G, reaching the black levels mandated by DisplayHDR 600 would make a far greater impact on perceived image quality than those 50 nits.
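
To put rough numbers on that, here is a small sketch. It assumes the square-root approximation of perceived brightness implied by "four times the light for twice the brightness", and the 320-nit figure for a decent SDR monitor is likewise only an assumption.

Python:
# Assumed model: perceived brightness ~ luminance ** 0.5, i.e. the
# "four times the light for roughly twice the brightness" rule of thumb.
def perceived(nits: float) -> float:
    return nits ** 0.5

comparisons = [
    (320, 400),  # assumed decent SDR monitor vs. DisplayHDR 400 peak
    (550, 600),  # this panel's peak vs. the DisplayHDR 600 requirement
]
for low, high in comparisons:
    gain = perceived(high) / perceived(low) - 1
    print(f"{low} -> {high} nits: ~{gain:.0%} perceived brightness increase")

Under that assumption, the extra 80 nits over a good SDR monitor is only around a 10% perceived increase, and the 50 nits separating this panel from DisplayHDR 600 even less, which is why the mandated black levels would matter far more.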

In my view, the primary benefit of DisplayHDR 400 is that it provides a reasonable baseline. It defines minimal quality standards in those areas where OEMs typically do most of their cost cutting. That is a good thing, which will hopefully help the casual consumer make better purchasing decisions.

Most people here, who already have rather serious gear at home, probably won't be overly impressed by DisplayHDR 400. It's only with 600 and 1000 that we get into territory where SDR monitors are left in the dust.
 
Did he [Daniel - LG] say that [34GK950F will reach 144]?

Daniel-LG did say that. It also states 144Hz on the spec-sheet which @Panos linked to.

Of course, you're right that the 34GK950F can't provide 3440x1440@144Hz using DP1.2 (see post #163 for the math), so something on the spec-sheet is obviously wrong. However, it's clear that 34GK950F will use DP1.4. This is why:
  1. Daniel-LG previously mentioned the 34GK950F will use DP1.4
  2. The spec-sheet also mentions DisplayHDR 400 and FreeSync 2, both of which are based on DP1.4. You can't have either using DP1.3, so the monitor must support DP1.4
  3. Every scaler on the market that supports FreeSync 2 is based on DP1.4. A monitor OEM literally can't build a monitor that supports FreeSync 2 without supporting DP1.4
This is not controversial. It's only the 34GK950G that is confusing, but at least these two bits of information are set in stone, where the second must follow from the first:
  • The 34GK950G will use nVidia's DP1.2 G-SYNC module
  • Due to the lack of bandwidth of DP1.2, the 34GK950G will not go beyond 120Hz at 3440x1440
The only open question is why, for the 34GK950G, LG states: "100 Hz native overclocked to 120Hz". That is the only thing that is contradictory (edit: because we have a lot of information pointing towards this monitor using the UW5 panel, so we'd expect it to be 144Hz native, but limited to 120Hz by the DP1.2 connector). Unfortunately, for this statement, we lack enough certain information to infer what is actually true.
 
Daniel-LG did say that. It also states 144Hz on the spec-sheet which @Panos linked to.

Of course, you're right that the 34GK950F can't provide 3440x1440@144Hz using DP1.2 (see post #163 for the math), so something on the spec-sheet is obviously wrong. However, it's clear that 34GK950F will use DP1.4. This is why:
  1. Daniel-LG previously mentioned the 34GK950F will use DP1.4
  2. The spec-sheet also mentions DisplayHDR 400 and FreeSync 2, both of which are based on DP1.4. You can't have either using DP1.3, so the monitor must support DP1.4
  3. Every scaler on the market that supports FreeSync 2 is based on DP1.4. A monitor OEM literally can't build a monitor that supports FreeSync 2 without supporting DP1.4
This is not controversial. It's only the 34GK950G that is confusing, but at least these two bits of information are set in stone, where the second must follow from the first:
  • The 34GK950G will use nVidia's DP1.2 G-SYNC module
  • Due to the lack of bandwidth of DP1.2, the 34GK950G will not go beyond 120Hz at 3440x1440
The only open question is why, for the 34GK950G, LG states: "100 Hz native overclocked to 120Hz". That is the only thing that is contradictory. Unfortunately, for this statement, we lack enough certain information to infer what is actually true.


Yes, the confusion and lack of clarity surrounding this are perplexing and frustrating. No one actually seems to know, although someone must! How hard can it be, lol?! I get the sense that until someone actually has this monitor in their hands, we won't be 100% certain what exactly it is! :rolleyes:
 
This is coming hardcore out of context, but when you pre-order (for example, the LG at Overclockers UK) via PayPal, when do they charge you? The moment it goes into shipping?
Sorry for the out-of-context question, but I really need an answer :D
 