LG 38GL950G - 3840x1600/G-Sync/144Hz

I've not been contradicting Daniel's statement about this screen not having HDR so far, as I've never said I believe it will offer HDR in any form. Having said that, you make a good point that I do agree with: manufacturers will find clever and inventive ways of marketing new screens as HDR even if it could be considered quite misleading to consumers. To be honest, though, I'm not really interested in that. If it's not half-decent HDR (with at least some form of local dimming, for instance) then as far as I'm concerned it's meaningless. That includes the often-abused VESA DisplayHDR 400 certification, which is largely pointless in my opinion. Anyway, in that context, yes, I probably do disagree with Daniel, as I expect the 38GL950G to have some form of HDR feature within the marketing, even if that's as basic as it merely accepting an HDR10 input source. DisplayHDR 400 is possible, as we already know the screen has a 450 cd/m2 brightness spec.

I pointed this out in response to Daniel's opinion on the lack of HDR support, which we both disagree with, and before LG published the specs listing DisplayHDR 600 certification.

Where have LG published the official specs listing DisplayHDR 600, please?

The disagreement you and I have relates to the challenges and costs involved in LG getting nVidia's v2 G-SYNC module to work with an edge-lit backlight. AFAIK there are no such challenges and there are no additional costs.

It's a personal opinion and expectation based on the development of G-SYNC HDR to date. So far we've only seen G-SYNC with HDR where a FALD backlight is used, and as I said before, the current v2 module is, as far as I know, designed for FALD. If a FALD is not going to be used here, which is pretty certain, then that implies to me that something new and different will be needed to pair the module with an edge-lit local dimming backlight. I don't think any of us knows for sure what would be involved in that, and I was speculating that it will be too complex and/or expensive for LG to bother with. If it is easy to do, with no cost/complexity, then surely an edge-lit local dimming backlight would be certain to be included here? I guess we will see when the final spec/monitor emerges whether it's been included or not. If there's truly no cost/complexity, as you have stated, then I can't see any reason why it won't be included, and that would be a good thing for sure; I'd certainly welcome that addition. I just personally expect it not to be included right now.

I also see no evidence supporting the notion that nVidia is designing a separate G-SYNC module for non-FALD displays. If you have any concrete evidence for that I'd also find that very interesting.

None of us knows officially what NVIDIA are and aren't doing; they don't really publish much info on it. I was just speculating that they might be developing one, given the potential future need to use G-SYNC with non-FALD HDR. That's all.
 
So far we've only seen G-SYNC with HDR where a FALD backlight is used, and as I said before, the current v2 module is, as far as I know, designed for FALD. If a FALD is not going to be used here, which is pretty certain, then that implies to me that something new and different will be needed to pair the module with an edge-lit local dimming backlight.


The upcoming Acer XB273VK is obviously using the G-SYNC v2 module but isn't FALD, so clearly the module can be used with a non-FALD monitor. The XB273VK is DisplayHDR 400 (which, as you say, is meaningless and pointless in an HDR sense). I would expect the LG 38GL950G to come in at DisplayHDR 400 as well, if the panel is already known to reach only 450 cd/m2.
 
The upcoming Acer XB273VK is obviously using the G-SYNC v2 module but isn't FALD, so clearly the module can be used with a non-FALD monitor. The XB273VK is DisplayHDR 400 (which, as you say, is meaningless and pointless in an HDR sense). I would expect the LG 38GL950G to come in at DisplayHDR 400 as well, if the panel is already known to reach only 450 cd/m2.

Indeed, I totally agree it can be used with a non-FALD backlight. It could be the same kind of thing here on the 38GL950G, with DisplayHDR 400 or some poor HDR spec, but we've yet to see any screen where the G-SYNC v2 module has been paired with a meaningful (non-FALD) HDR spec with some kind of local dimming active. That's the part I had speculated would be the challenge.
 
Where have LG published the official specs listing DisplayHDR 600, please?

Sorry. In post #36 Lliam listed specs published on LG's website and apparently added some things of his own (like DisplayHDR 600). Since TFTCentral is also expecting this monitor to come with a DisplayHDR 600 certification I assumed it was now confirmed. It isn't. In that case I'm back to having no position on the quality/level of HDR it will support. Only that it will be marketed as supporting HDR.

So far we've only seen G-SYNC with HDR where a FALD backlight is used, and as I said before, the current v2 module is, as far as I know, designed for FALD.

Yes, but going from the observation that the v2 G-SYNC module works with FALD to the conclusion that it can't work with any other type of backlight is a very wild leap. There are many economic and technical reasons why that assumption makes no sense. There are even some pointers that don't require a background in electronics or economics. For example, in nVidia's G-SYNC HDR whitepaper, nVidia explicitly mentions edge-lit local dimming as a contrast-improving HDR-related technology. It seems highly unlikely that nVidia would mention that in their G-SYNC HDR whitepaper if their G-SYNC HDR module didn't support it.

As Legend mentioned, there is also some evidence in the form of the Acer XB273VK.

I don't think it makes much sense to get into the technical mumbo jumbo of the FPGA and controller technology, but I can guarantee that FALD is not the only type of backlight the existing v2 G-SYNC module will work with.

From this incorrect position you concluded that nVidia is working on an alternative v2 G-SYNC HDR module without FALD support. While an incorrect premise doesn't necessarily mean the conclusion is incorrect, it does mean there is little reason to believe it. I actually wish you were right though, because nVidia's current FPGA-based G-SYNC controller is rather crappy (power usage, heat dissipation and cost). I do believe nVidia has good reasons to develop an alternative. Those reasons just have nothing to do with HDR and even less to do with FALD.

EDIT: regarding your latest post:

Indeed, I totally agree it can be used with a non-FALD backlight. It could be the same kind of thing here on the 38GL950G, with DisplayHDR 400 or some poor HDR spec, but we've yet to see any screen where the G-SYNC v2 module has been paired with a meaningful (non-FALD) HDR spec with some kind of local dimming active. That's the part I had speculated would be the challenge.

Okay, it seems we're having some trouble communicating. Now you are saying the existing v2 module can work with other types of backlighting.

You realize that a controller that can manage a FALD array must be flexible enough to control a variable number of dimming zones, right? Some FALD monitors will have 384 backlights, some 512, and some much more. From the controller's point of view there is no fundamental difference between any of these. If you reduce the number of dimming zones to 6 or 12, then guess what you've got? Exactly. That's your edge-lit local dimming right there. Whether the LEDs are directly behind the pixels or whether the light must first pass through a diffuser is entirely irrelevant to the controller. As far as the controller is concerned, there is no difference between any of these situations except for the number of dimming zones the controller is configured to manage.

The opposite, going from 12 to 384 dimming zones is different. That isn't so simple.
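The point above, that a local dimming controller is agnostic to zone count, can be sketched in a few lines. This is purely an illustrative toy, not any vendor's actual firmware; the function name, the column-strip zone layout and the numbers are all hypothetical. It just shows that driving 384 zones or 3 zones is the same code path with a different parameter:

```python
# Toy local-dimming controller sketch (hypothetical, not real G-SYNC firmware).
# The same routine drives a many-zone FALD array or a few-zone edge-lit strip;
# only the configured zone count differs, exactly as argued above.

def zone_brightness(frame, zones):
    """frame: 2D list of pixel luminance values (0.0 to 1.0).
    zones: number of vertical dimming strips (a simplified edge-lit layout).
    Returns one backlight level per zone: the peak luminance in that zone."""
    height = len(frame)
    width = len(frame[0])
    levels = [0.0] * zones
    for y in range(height):
        for x in range(width):
            z = x * zones // width  # map pixel column to its dimming zone
            levels[z] = max(levels[z], frame[y][x])
    return levels

# A mostly dark frame with a single bright highlight on the left:
frame = [[0.05] * 30 for _ in range(10)]
frame[5][2] = 1.0

# Same controller, different zone counts:
print(zone_brightness(frame, 3))   # coarse, edge-lit style: [1.0, 0.05, 0.05]
print(zone_brightness(frame, 30))  # fine, FALD-column style: only zone 2 is bright
```

With 3 zones the whole left third must light up for one highlight; with 30 zones only the column containing it does. Same logic, different resolution, which is the crux of the argument.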
 
Sorry. In post #36 Lliam listed specs published on LG's website and apparently added some things of his own (like DisplayHDR 600).

Ah OK, yes I saw that post too, but hadn't seen it confirmed anywhere on LG's site either. For what it's worth, my original expectation was that this screen would have DisplayHDR 600 certification, given that the LG.Display panel roadmap lists the panel with that spec (previously HDR 1000 in the May roadmap). However, since there was absolutely no mention of HDR in the press release, Daniel said he didn't expect HDR to be included, and there's still no mention of it on the "coming soon" LG spec page, my current view is that it won't include this certification.

Yes, but going from the observation that the v2 G-SYNC module works with FALD to the conclusion that it can't work with any other type of backlight is a very wild leap.

OK, well if it is as easy as you suggest, there should be absolutely no reason why an edge-lit local dimming backlight cannot be used on the 38GL950G, and therefore some meaningful HDR offered as a result. Whether that's DisplayHDR 600 or not, by "meaningful" I mean something with some form of local dimming employed to actually help increase the dynamic range. I guess we will see whether there is local dimming and "meaningful HDR" when the screen is released and the official spec is available. If there isn't, that would imply there is some complexity/cost to offering it here; otherwise it will presumably be included.

Okay, it seems we're having some trouble communicating. Now you are saying the existing v2 module can work with other types of backlighting.

I've never said otherwise. What I said was that there may be challenges getting the v2 module to work with other local dimming backlights, i.e. non-FALD local dimming solutions.

I know the v2 module can be used with other backlighting and am very aware of the Acer XB273K as a prime example, but that model does not offer any local dimming, only standard backlight operation. It's the local dimming that I was suggesting would be omitted here on the 38GL950G, leaving it without any meaningful HDR as a result.
 
I know the v2 module can be used with other backlighting and am very aware of the Acer XB273K as a prime example, but that model does not offer any local dimming, only standard backlight operation. It's the local dimming that I was suggesting would be omitted here on the 38GL950G, leaving it without any meaningful HDR as a result.


Just out of curiosity, is HDR without FALD always going to be largely pointless, regardless of panel brightness? Are there any examples of monitors (or TVs) without FALD that have a meaningful HDR impact?
 
Just out of curiosity, is HDR without FALD always going to be largely pointless, regardless of panel brightness? Are there any examples of monitors (or TVs) without FALD that have a meaningful HDR impact?

Yes, you can still get some meaningful HDR benefits, often pretty decent in fact, with edge-lit LED systems. The trick is that there must be some kind of local dimming support, otherwise you don't get any real benefit in dynamic range. With edge-lit local dimming, the more zones the better. Some can even reach 1000 cd/m2 peak brightness and carry the VESA DisplayHDR 1000 certification; you don't have to have FALD to achieve that. For instance, the Philips Momentum 43" carries that certification and handles HDR quite well:
http://www.tftcentral.co.uk/reviews/philips_436m6vbpab.htm
 
Actually, I only added what it said on the info for the display model I saw in South Korea on my last visit.
I never stated it was an official spec.
To be honest, as it was a review model, specs change.
I certainly would not wish to get into a debate about the specs of a model not yet released.
Based on the specs I saw, I said I would definitely be buying one,
as I spend half my working life in South Korea.
I can't believe people are counter-arguing about potentially fluid specs.
If it has the basic specs, with or without HDR, that's good enough for me.
 
OK, well if it is as easy as you suggest, there should be absolutely no reason why an edge-lit local dimming backlight cannot be used on the 38GL950G, and therefore some meaningful HDR offered as a result.

I'm an electronics/controller/software guy. I'm not equally knowledgeable when it comes to TFT panel technology. Don't get me wrong. I'm not saying there is NO REASON for this monitor NOT to support edge-lit local dimming. Maybe there are such reasons. All I'm saying is that the v2 G-SYNC controller is NOT such a reason. That is all.

I am very aware of the Acer XB273K as a prime example, but that model does not offer any local dimming, only standard backlight operation. It's the local dimming that I was suggesting would be omitted here on the 38GL950G, leaving it without any meaningful HDR as a result.

Okay, now I understand your definition of "meaningful HDR" (=supports local dimming). Thank you.

For VESA, any sort of local dimming makes a DisplayHDR 600 (or the newly introduced DisplayHDR 500) certification a possibility, but I'm not sure we should accept that as meaningful HDR. As you said, the more zones the better. Technically, two zones are already enough to qualify as local dimming. Each additional dimming zone raises the chances of the display showing you the HDR content as it was meant to be seen, but even with 12 zones (I'm not aware of edge-lit panels with more dimming zones) those chances are still rather low. 12 zones will work for a large explosion. 12 zones won't work for a brightly lit window in a night scene, or a light saber against a dark background. For that you need FALD, but even that is far from perfect (can't show a starry night sky in HDR).
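The halo problem described above (a light saber against a dark background) can be made concrete with some rough arithmetic. This is a back-of-the-envelope sketch with hypothetical numbers, not measured data: assume a panel with 1000:1 static contrast, so the darkest a "black" pixel can go is its zone's backlight level divided by 1000. Any zone that must run bright for a small object drags the black level of every dark pixel in that zone up with it:

```python
# Back-of-the-envelope halo illustration (hypothetical numbers).
# A "black" pixel can only block so much light: with 1000:1 static
# contrast, its displayed luminance is the zone backlight / 1000.

STATIC_CONTRAST = 1000.0

def black_level(zone_backlight):
    """Displayed luminance of a black pixel lit by this zone's backlight."""
    return zone_backlight / STATIC_CONTRAST

# Light-saber scene with coarse zones: the zone containing the saber must
# run at full brightness, while the other zones can dim right down.
saber_zone = 1.0   # normalized backlight level, full on
dark_zone = 0.05   # dimmed zone elsewhere in the frame

print(black_level(saber_zone))  # 0.001: raised blacks, a visible halo
print(black_level(dark_zone))   # ~5e-05: 20x deeper black away from the saber
```

The fewer the zones, the larger the screen area stuck at the raised black level, which is why 12 edge-lit zones handle a big explosion but not a thin bright object on a dark background.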

It's a complicated topic with no simple yes or no answers. :-( What is and isn't meaningful is a subjective question. For me, using current TFT technology, nothing below DisplayHDR 1000 and nothing lacking a FALD backlight deserves to be called HDR capable. Panels with better static contrast may change that in the future, but that's how I see it today.

Ultimately it's up to the consumer to decide what is good enough. Unfortunately, I suspect many will purchase the cheap HDR options and decide, based on their experience, that HDR is pretty much meaningless.
 
I can't believe people are counter-arguing about potentially fluid specs.
If it has the basic specs, with or without HDR, that's good enough for me.
Nobody is arguing over specs. Your listing just caused some confusion over what is confirmed and what isn't, because it mixed officially published specs with some that aren't, without explicitly saying so.
 
Definitely keep an eye out for more info on this product and projected availability. It looks like a winner, but I'm guessing Q3-ish.

I like how Linus joked about LG's timing with G-SYNC. I suspect hell froze over when nVidia announced their GPUs would support VESA Adaptive Sync going forward (which most of us call FreeSync). Finally!!! Yay!

What do you guys think this means for VRR technology in general and for this monitor in particular?

A "G-SYNC Compatible" monitor is just a FreeSync monitor that nVidia felt worked well enough (whatever that means). If such a G-SYNC-Compatible monitor is connected to an nVidia GPU, the driver will automatically enable FreeSync support.

I suspect that for 99% of buyers, that means there is simply no reason left to spend the extra $200 on a "real" G-SYNC DP1.2 monitor or the extra $400 on a "real" G-SYNC HDR DP1.4 monitor. Why would you if you can get the same (or at least almost the same) thing for less?

It seems to me that for everything but the very highest-end G-SYNC Ultimate monitors, which have no FreeSync competition at all, G-SYNC no longer serves a purpose. Are we witnessing the beginning of the end for G-SYNC?

If alongside the 38GL950G, LG also released a 38GL950F which was $400 cheaper but came with a "G-SYNC compatible" label, which would you prefer to purchase?
 
What do you guys think this means for VRR technology in general and for this monitor in particular?

I for one have no problem whatsoever paying a premium for a quality VRR implementation like G-SYNC. Let's face it, the monitor is your primary and most important peripheral, so it deserves to be good quality! This monitor is looking a cracker, but I expect it to be circa £1800 at least. LG will have to make it pretty much perfect for that...
 
I for one have no problem whatsoever paying a premium for a quality VRR implementation like G-SYNC. Let's face it, the monitor is your primary and most important peripheral, so it deserves to be good quality! This monitor is looking a cracker, but I expect it to be circa £1800 at least. LG will have to make it pretty much perfect for that...

I suspect you’re not alone there. I expect there’s still a market for Gsync for the time being and many people will continue to pay the price premium for a Gsync model.

If more Adaptive-Sync screens can offer a decent and reliable VRR experience, maybe longer term there will be less need for G-SYNC. But given NVIDIA have only deemed 15 of 400-odd FreeSync screens suitable for the new G-SYNC-Compatible certification, it gives you a feel for how many are not up to scratch for use with an NVIDIA card, or rather just not ideal for VRR overall. There's more to it than just being able to support "some" VRR. A G-SYNC model will still offer certified performance with a wide refresh rate range, and often (in fact, in my experience, normally) better overdrive control, fewer bugs with overdrive settings, and a pretty much guaranteed next-to-no-lag experience. That's not something you get on many FreeSync screens.

I can see G-SYNC fading out with time, but I can't see it happening any time soon, to be honest. There's still going to be demand for it, in my opinion, for a while yet.
 
I for one have no problem whatsoever paying a premium for a quality VRR implementation like G-SYNC. Let's face it, the monitor is your primary and most important peripheral, so it deserves to be good quality! This monitor is looking a cracker, but I expect it to be circa £1800 at least. LG will have to make it pretty much perfect for that...

If more Adaptive-Sync screens can offer a decent and reliable VRR experience, maybe longer term there will be less need for G-SYNC. But given NVIDIA have only deemed 15 of 400-odd FreeSync screens suitable for the new G-SYNC-Compatible certification, it gives you a feel for how many are not up to scratch for use with an NVIDIA card, or rather just not ideal for VRR overall. There's more to it than just being able to support "some" VRR. A G-SYNC model will still offer certified performance with a wide refresh rate range.

I get what you're both saying and I agree. IMHO AMD pretty much screwed up with FreeSync for the reasons you both mentioned. I don't expect that to change tomorrow. One year from now however?

It's precisely because so few of the 400-odd FreeSync monitors made the cut that I think G-SYNC will become obsolete (except in the highest-end FALD market, where G-SYNC has no competition). The G-SYNC-Compatible logo will identify those FreeSync monitors with a non-crappy VRR implementation. At that point, for most people, G-SYNC may still be viewed as the premium VRR experience, but in many cases it will actually be FreeSync that provides it. AMD has also started fixing this problem, because for FreeSync 2 they now mandate a decent VRR range and some quality controls. I suspect most FreeSync 2 monitors will be natural candidates for G-SYNC-Compatible certification. The gap is already closing and will continue to close.

More importantly, nVidia's G-SYNC-Compatible certification allows monitor OEMs to serve both AMD and nVidia customers with a single monitor, without anyone having to sacrifice VRR. That's too much of a profitability/product improvement to ignore. I'd be surprised if almost every FreeSync monitor currently in the planning phase weren't aiming for that certification, with the viability/necessity of a separate G-SYNC model being questioned.

For G-SYNC not to become obsolete, a very substantial number of nVidia GPU owners (far more than just us enthusiasts) would have to reject the notion that G-SYNC-Compatible VRR implementations are at least good enough. The difference would have to remain obvious enough that, for most people, the additional $200-$400 G-SYNC tax continues to sound like a reasonable proposition. Given how seriously nVidia apparently takes the G-SYNC-Compatible certification, I have a hard time imagining that's how this plays out.
 
I suspect you’re not alone there. I expect there’s still a market for Gsync for the time being and many people will continue to pay the price premium for a Gsync model.

If more Adaptive-Sync screens can offer a decent and reliable VRR experience, maybe longer term there will be less need for G-SYNC. But given NVIDIA have only deemed 15 of 400-odd FreeSync screens suitable for the new G-SYNC-Compatible certification, it gives you a feel for how many are not up to scratch for use with an NVIDIA card, or rather just not ideal for VRR overall. There's more to it than just being able to support "some" VRR. A G-SYNC model will still offer certified performance with a wide refresh rate range, and often (in fact, in my experience, normally) better overdrive control, fewer bugs with overdrive settings, and a pretty much guaranteed next-to-no-lag experience. That's not something you get on many FreeSync screens.

I can see G-SYNC fading out with time, but I can't see it happening any time soon, to be honest. There's still going to be demand for it, in my opinion, for a while yet.
Don't put much stock in nVidia's testing for now. They deemed a monitor to be a failure if it didn't automatically enable VRR.
 