LG 38GL950G - 3840x1600/G-Sync/144Hz

Associate
Joined
8 Apr 2015
Posts
561
Location
Normally in the car
If the monitor is using the same panel as the 34GK950, which is already in mass production, one would think it shouldn't take the same long Q3-Q4 time frame to get it ready for retail as the 34GK950 did? But you know more than us, so I will take your word for it.

It's not the same panel. The 34GK950G is 34"; this is 38".
 
Associate
Joined
8 Apr 2015
Posts
561
Location
Normally in the car
So apologies if this repeats any info, but here are some specs I can release for this model:

38" Ultrawide 3840x1600 Res
450 nits
DCI-P3 98% coverage from Nano-IPS display (I think)
HDMI, DP & USB
G-Sync
144Hz
Sphere Lighting (like on 32GK850G and 34GK950G)
Dynamic Action Sync, Black Stabilizer
 
Associate
Joined
24 Oct 2018
Posts
7
Location
Portugal
So apologies if this repeats any info, but here are some specs I can release for this model:

38" Ultrawide 3840x1600 Res
450 nits
DCI-P3 98% coverage from Nano-IPS display (I think)
HDMI, DP & USB
G-Sync
144Hz
Sphere Lighting (like on 32GK850G and 34GK950G)
Dynamic Action Sync, Black Stabilizer


...can you try to find out if a FreeSync 2 version is planned at all?...

I'm so disappointed that a FreeSync 2 version was not mentioned in this announcement... :(
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
So apologies if this repeats any info, but here are some specs I can release for this model:

38" Ultrawide 3840x1600 Res
450 nits
DCI-P3 98% coverage from Nano-IPS display (I think)
HDMI, DP & USB
G-Sync
144Hz
Sphere Lighting (like on 32GK850G and 34GK950G)
Dynamic Action Sync, Black Stabilizer

Is this native 144Hz? It would be nice to have no overclocking nonsense confusing everyone.
 
Associate
Joined
21 Dec 2018
Posts
6
I’ve literally created an account just so I could follow this discussion. I went from a 24-inch throwaway monitor to the 34UC97 to the Acer X34, and after almost 3 years the magic is starting to wane. I need something a tad bigger for my work charts and gaming, and last night I saw that TFT article and got excited that this might be the one, so HDR being dropped is a tad disappointing. Another factor I'm curious about is whether the size increase will be noticeably more immersive than the 34". That's part of the reason I'm no longer interested in 34"; the size feels a bit small after all these years. The 49" is too much, and I can't imagine sacrificing my speaker positioning for that behemoth, so it's really the 38" ultrawide or the 43" super ultrawide for me. Watching this with tentative excitement.
 
Associate
Joined
12 Dec 2010
Posts
1,837
Location
Washington D.C.
I'd disregard all of these monitors whose "HDR" spec is below 600 nits. You aren't going to get real HDR highlights from a regular edge-lit backlight.
 
Associate
Joined
29 May 2018
Posts
146
In regard to HDR:

nVidia's older v1 G-SYNC module is limited to 3440x1440@120Hz. Even if a 3440x1440 G-SYNC monitor includes a panel that can reach beyond 120 Hz, the v1 G-SYNC module will limit the monitor to 120 Hz regardless (see 34GK950G).

It follows that 3840x1600@144Hz (as supported by the 38GL950G) is beyond the capabilities of the v1 G-SYNC module, meaning this monitor will ship with the newer v2 G-SYNC module. The v2 G-SYNC module supports HDR (or more precisely, the v2 G-SYNC module supports the HDR10 protocol).
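
As a rough sanity check on those bandwidth limits, here is a back-of-the-envelope comparison (a sketch in Python; it assumes full 8-bit RGB over 4 lanes with 8b/10b encoding and approximate CVT-R2 blanking, so treat the exact figures as ballpark):

```python
# Rough DisplayPort bandwidth check (approximate CVT-R2 blanking, 8-bit RGB).
# Effective payload: 4 lanes x 5.4 Gbit/s (HBR2, DP1.2) or 8.1 Gbit/s (HBR3, DP1.4),
# minus the 20% overhead of 8b/10b encoding.
HBR2_GBPS = 4 * 5.4 * 0.8   # DP1.2 -> 17.28 Gbit/s
HBR3_GBPS = 4 * 8.1 * 0.8   # DP1.4 -> 25.92 Gbit/s

def required_gbps(h, v, hz, bpp=24, h_blank=80, v_blank=72):
    """Approximate video data rate in Gbit/s for a given mode."""
    pixel_clock = (h + h_blank) * (v + v_blank) * hz   # pixels per second
    return pixel_clock * bpp / 1e9

for name, mode in {"3440x1440@120": (3440, 1440, 120),
                   "3840x1600@144": (3840, 1600, 144)}.items():
    need = required_gbps(*mode)
    print(f"{name}: ~{need:.1f} Gbit/s "
          f"(HBR2 {'OK' if need <= HBR2_GBPS else 'exceeded'}, "
          f"HBR3 {'OK' if need <= HBR3_GBPS else 'exceeded'})")
```

Under those assumptions 3440x1440@120Hz comes in around 15 Gbit/s and fits within HBR2, while 3840x1600@144Hz needs roughly 23 Gbit/s and only fits within the DP1.4 link the v2 module provides, which is consistent with the above.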

If a monitor combines a newer panel with a HDR10 capable controller, then nothing prevents the manufacturer from at least slapping a DisplayHDR 400 badge on it. Combining this panel with the v2 G-SYNC module already provides LG with everything that is necessary to achieve that HDR certification level and it does so without LG having to invest a penny more into engineering.

Two assumptions:
  • nVidia doesn't release an entirely new, HDR-less revision of the G-SYNC module that supports 3840x1600@144Hz (a DP1.2 based revision is a stretch in any case: at full 8-bit RGB that mode exceeds DP1.2's effective bandwidth, and DP1.2 doesn't support HDR10 regardless).
  • LG won't pass up the marketing opportunity to slap a HDR badge on any monitor that can be called HDR capable without being taken to court for false advertising (so far this has been true of every monitor OEM).
Without a FALD backlight a DisplayHDR 1000 certification is off the table. LG might go with a DisplayHDR 600 or 400 certification, or forgo the official VESA certifications entirely and just slap their own "HDR capable" badge on it. If the above two assumptions are true, then LG is practically guaranteed to go with one of those options. That this monitor, with a v2 G-SYNC module, isn't at least marketed as "HDR capable" is almost unthinkable.

Of course DisplayHDR 400 shouldn't be taken seriously. Even DisplayHDR 600 barely deserves to be called HDR, but that is a different topic.
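
For reference, the headline differences between the DisplayHDR tiers mentioned above look roughly like this (a sketch based on the VESA DisplayHDR 1.0 requirements as I recall them; the exact criteria should be checked against vesa.org):

```python
# Illustrative summary of VESA DisplayHDR 1.0 tiers (approximate; not authoritative).
DISPLAYHDR_TIERS = {
    "DisplayHDR 400":  {"min_peak_nits": 400,  "needs_local_dimming": False},
    "DisplayHDR 600":  {"min_peak_nits": 600,  "needs_local_dimming": True},
    "DisplayHDR 1000": {"min_peak_nits": 1000, "needs_local_dimming": True},
}

def plausible_tiers(peak_nits, has_local_dimming):
    """Badges a monitor could plausibly chase, judged only on these two criteria."""
    return [name for name, req in DISPLAYHDR_TIERS.items()
            if peak_nits >= req["min_peak_nits"]
            and (has_local_dimming or not req["needs_local_dimming"])]

# A 450-nit panel with no local dimming only clears the 400 tier:
print(plausible_tiers(450, False))   # ['DisplayHDR 400']
# A 600-nit panel with edge-lit local dimming can also chase the 600 badge:
print(plausible_tiers(600, True))    # ['DisplayHDR 400', 'DisplayHDR 600']
```

Which is why, without FALD, DisplayHDR 600 is the realistic ceiling here and DisplayHDR 400 the safe fallback.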
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
In regard to HDR:

nVidia's older v1 G-SYNC module is limited to 3440x1440@120Hz. Even if a 3440x1440 G-SYNC monitor includes a panel that can reach beyond 120 Hz, the v1 G-SYNC module will limit the monitor to 120 Hz regardless (see 34GK950G).

It follows that 3840x1600@144Hz (as supported by the 38GL950G) is beyond the capabilities of the v1 G-SYNC module, meaning this monitor will ship with the newer v2 G-SYNC module. The v2 G-SYNC module supports HDR (or more precisely, the v2 G-SYNC module supports the HDR10 protocol).

If a monitor combines a newer panel with a HDR10 capable controller, then nothing prevents the manufacturer from at least slapping a DisplayHDR 400 badge on it. Combining this panel with the v2 G-SYNC module already provides LG with everything that is necessary to achieve that HDR certification level and it does so without LG having to invest a penny more into engineering.

Two assumptions:
  • nVidia doesn't release an entirely new, HDR-less revision of the G-SYNC module that supports 3840x1600@144Hz (a DP1.2 based revision is a stretch in any case: at full 8-bit RGB that mode exceeds DP1.2's effective bandwidth, and DP1.2 doesn't support HDR10 regardless).
  • LG won't pass up the marketing opportunity to slap an HDR badge on any monitor that can be called HDR capable.
Without a FALD backlight a DisplayHDR 1000 certification is off the table. LG might go with a DisplayHDR 600 or 400 certification, or forgo the official VESA certifications entirely and just slap their own "HDR capable" badge on it. If the above two assumptions are true, then LG is practically guaranteed to go with one of those options. That this monitor, with a v2 G-SYNC module, isn't at least marketed as "HDR capable" is almost unthinkable.

Of course DisplayHDR 400 shouldn't be taken seriously. Even DisplayHDR 600 barely deserves to be called HDR, but that is a different topic.


I think we'll start to see microwaves marketed as HDR soon. The whole thing is a complete joke, to be honest. Most people on this forum are wise to it of course, but the majority of consumers are not, and it only gives manufacturers the green light to whack a premium on every 'HDR' product they sell.

I find it laughable that VESA claim to be a non-profit organisation... there is little doubt in my mind that some fat envelopes were passed underneath the table in order to get these so called 'standards' pushed through. The only ones that benefit are the manufacturers themselves. It's a sad state of affairs really. :(
 
Associate
Joined
5 Aug 2010
Posts
40
Location
Scotland
38GL950G-B

COMING SOON - 38" Class 21:9 UltraGear™ QHD+ Nano IPS LED Gaming Monitor w/ NVIDIA G-SYNC (38" Diagonal)
Key Features

37.5” Curved UltraWide® QHD+ (3840 x 1600) Nano IPS Display

144Hz (Overclock 175Hz)
NVIDIA G-SYNC (v2 G-SYNC module)
Sphere Lighting 2.0
3-Side Virtually Borderless Design
Tilt and Height Adjustable

VESA DisplayHDR 600 certified
98% DCI-P3 colour space
1000:1 contrast ratio, 450 cd/m2
DP 1.4 and HDMI 2.0 inputs

Looks like it will be expensive and a long way off, but at this point it looks like my next monitor purchase based on the projected features.
 
Associate
Joined
29 May 2018
Posts
146
^ This is the link to the above-mentioned list of specs:

https://www.lg.com/us/monitors/lg-38GL950G-B-gaming-monitor

I think we'll start to see microwaves marketed as HDR soon. The whole thing is a complete joke, to be honest. Most people on this forum are wise to it of course, but the majority of consumers are not, and it only gives manufacturers the green light to whack a premium on every 'HDR' product they sell.

I find it laughable that VESA claim to be a non-profit organisation... there is little doubt in my mind that some fat envelopes were passed underneath the table in order to get these so called 'standards' pushed through. The only ones that benefit are the manufacturers themselves. It's a sad state of affairs really. :(

True indeed. In my view that can unfortunately be said of any and all advertising and marketing. It's all manipulative BS specifically designed to make things appear better than they are. As PC enthusiasts we recognize the BS in this one area, but it's everywhere, and we can't be this informed about everything in our lives (or at least I can't). :(

Monitor OEMs are all VESA members, as are GPU manufacturers, OS developers and a few others. VESA is neither independent nor are they a consumer watchdog agency. VESA exists only because its members pay their dues. In exchange, VESA does exactly what its members want it to do in the way they want VESA to do it. Because VESA is paid by monitor OEMs, there is no need for any fat envelopes. Because membership dues just cover VESA's running costs, VESA technically is a non-profit. While VESA doesn't make a profit, it exists to facilitate cooperation between its members and ultimately, to improve the profits of its members.

In short, VESA doesn't exist for us consumers. It exists for the industry that wants to sell to consumers (like every such organization in the world).
 
Man of Honour
Joined
12 Jan 2003
Posts
20,564
Location
UK
Given the resolution and native 144Hz / overclocked 175Hz, this can't be using the v1 G-Sync module. I am anticipating the non-FALD HDR version of the v2 module, much like the one recently used for the Acer Predator XB273K. I would not expect FALD or HDR 1000 on this new LG screen. The panel spec from LG Display is now listed as VESA HDR 600, so that's a possibility, but personally I think it will be too challenging to get HDR working with G-Sync and so it will be left off in any meaningful way here.
 
Associate
Joined
29 May 2018
Posts
146
personally I think it will be too challenging to get HDR working with G-Sync and so it will be left off in any meaningful way here

That statement makes no sense. The v2 G-SYNC module was built specifically to support HDR. nVidia literally calls it their G-SYNC HDR module. That module represents a turnkey solution which monitor OEMs can just plop into their monitor. At that point there is literally NOTHING left for the OEM to do in order for HDR to work in tandem with G-SYNC. That is the opposite of challenging.

As stated above, any half decent monitor (and most definitely a semi professional monitor like this one) incorporating the v2 module will support HDR and G-SYNC by default. Whether it deserves to be called HDR is a separate, marketing related issue.

I'm unaware of the existence of a separate non-FALD version of the v2 G-SYNC module. AFAIK there is only ONE such v2 module. In monitors without a FALD backlight, the wiring that would carry the signals to control the FALD backlight is simply omitted.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,564
Location
UK
That statement makes no sense. The v2 G-SYNC module was built specifically to support HDR. nVidia literally calls it their G-SYNC HDR module. That module represents a turnkey solution which monitor OEMs can just plop into their monitor. At that point there is literally NOTHING left for the OEM to do in order for HDR to work in tandem with G-SYNC. That is the opposite of challenging.

As stated above, any half decent monitor (and most definitely a semi professional monitor like this one) incorporating the v2 module will support HDR and G-SYNC by default. Whether it deserves to be called HDR is a separate, marketing related issue.

I'm unaware of the existence of a separate non-FALD version of the v2 G-SYNC module. AFAIK there is only ONE such v2 module. In monitors without a FALD backlight, the wiring that would carry the signals to control the FALD backlight is simply omitted.

My point was that this display is not going to feature a FALD backlight. That is pretty certain when you consider that there is no mention of HDR in any way in the press release, the currently known specs or the "coming soon" page, and that Daniel from LG has also said that he does not expect HDR support from the 38GL950G. The IPS panel that is going to be used here is also now listed as a VESA HDR 600 panel by LG Display, when previously it was VESA HDR 1000. So I think it's safe to say that this won't be a FALD HDR display.

So with that in mind, that leaves LG with some form of edge-lit local dimming solution to consider if they want to offer any meaningful HDR support, including the VESA HDR 600 standard. I don't believe the G-Sync module was designed to work with any non-FALD HDR backlighting solution, to be honest, but that is one of the reasons I was saying it was going to be an issue and a challenge. Even if it can work with an edge-lit local dimming solution, there are still challenges (and associated development costs) for LG in ensuring that the backlight operates correctly with the G-Sync module and variable refresh rate, particularly when this is a gamer-orientated screen and there are a lot of complexities in ensuring suitable backlight operating speeds and the like. We know that NVIDIA faced a long and expensive period trying to ensure the FALD for the Asus ROG Swift PG27UQ could work with G-Sync effectively, and at suitable speeds and performance to meet the demands of a high refresh rate gaming display. So if LG are limited to a non-FALD backlight solution to offer any meaningful HDR, then that is why I think it will prove too challenging and expensive.

Release of this monitor is still a long way away, likely well over 6 months, and so there's nothing to say that a new Gsync v2 module without HDR won't be available and used by then in some screens. My personal opinion (shared by Daniel from LG) is that we won't see any meaningful HDR from the 38GL950G.
 
Associate
Joined
12 Dec 2010
Posts
1,837
Location
Washington D.C.
Wow, 3840x1600 at 175 Hz. I think I just may pick this one up.

As for HDR, IMO it will be the standard "HDR400" badge that gets slapped on every edge-lit display these days.
 
Associate
Joined
29 May 2018
Posts
146
So I think it's safe to say that this won't be a FALD HDR display
Agreed. This is unrelated to the point I was contradicting.

My personal opinion (shared by Daniel from LG) is that we won't see any meaningful HDR from the 38GL950G.

Agreed. At least almost.

To clarify, Daniel - LG said that in his personal opinion this monitor "won't have HDR". You're contradicting Daniel's opinion by stating it won't provide meaningful HDR support (implying it will at least provide meaningless HDR support). If I interpreted your comment correctly then you and I agree (and we both disagree with Daniel - LG).

I phrased it differently, because I didn't want to speculate on the level/quality of HDR support the 38GL950G will bring to the table. My point was that the 38GL950G will, with 100% certainty, be marketed as HDR capable. That will occur via one of VESA's DisplayHDR certifications (except 1000 due to the lack of FALD) or just a generic HDR badge (should it not even achieve what is required for a DisplayHDR 400 certification, which is almost unthinkable).

The recent past has shown that monitor OEMs will slap a HDR badge on any monitor that accepts a HDR10 video signal. Nothing more is required for a monitor to be marketed as "HDR capable". Using the v2 G-SYNC module provides that HDR10 capability.
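
To make "accepts an HDR10 video signal" concrete: HDR10 is essentially 10-bit video with the SMPTE ST 2084 (PQ) transfer function, BT.2020 signalling, and a small block of static metadata handed to the display once per stream. A rough sketch of that metadata block follows (field names are illustrative, not taken from any particular API):

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Approximation of the static metadata an HDR10 source sends with the video
    (SMPTE ST 2086 mastering-display data plus content light levels)."""
    # Mastering display primaries and white point as CIE 1931 (x, y) coordinates
    red_primary: tuple = (0.708, 0.292)    # BT.2020 red
    green_primary: tuple = (0.170, 0.797)  # BT.2020 green
    blue_primary: tuple = (0.131, 0.046)   # BT.2020 blue
    white_point: tuple = (0.3127, 0.3290)  # D65
    # Mastering display luminance range in cd/m2
    max_mastering_luminance: float = 1000.0
    min_mastering_luminance: float = 0.005
    # Content light levels in cd/m2
    max_cll: float = 1000.0    # brightest single pixel in the content
    max_fall: float = 400.0    # highest frame-average light level

# Any display that parses this block and applies some tone mapping to the PQ signal
# can be marketed as "HDR capable", regardless of what its backlight can actually do.
print(HDR10StaticMetadata())
```

Nothing in that block obliges the panel to actually reproduce those luminance levels, which is exactly why the badge on its own says so little.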

I pointed this out in response to Daniel - LG's opinion on the lack of HDR support, which we both disagree with, and before LG published the specs listing DisplayHDR 600 certification.

The disagreement you and I have relates to the challenges and costs involved in LG getting nVidia's v2 G-SYNC module to work with an edge-lit backlight. AFAIK there are no such challenges and there are no additional costs.

One of the main selling points of nVidia's G-SYNC modules (v1 and v2) is precisely that they alleviate monitor OEMs from VRR related challenges and additional investments. If such challenges and investments exist, then both G-SYNC modules would have failed to reach their stated product goals.

I can list quite a few reasons why such challenges and costs don't exist, but ultimately I can't prove a negative. Do you have a concrete example of what those ADDITIONAL (VRR-related) challenges and costs might be? I'm a software engineer with a degree in electronics engineering, and as far as I'm concerned there are none.

I also see no evidence supporting the notion that nVidia is designing a separate G-SYNC module for non-FALD displays. If you have any concrete evidence of that, I'd find it very interesting.
 