LG 38GL950G - 3840x1600/G-Sync/144Hz

Latency and the other elements don’t really matter for this discussion. This isn’t a thread about projectors vs monitors. I was using the new generation of projectors as an example of how a low nit level and the absence of local dimming doesn’t mean HDR cannot work or will be bad. The new technology in the latest generation of projectors has made HDR worthwhile on low nit screens without local dimming, and there is no reason why a new generation of monitors cannot follow the same path.

It’s not a good idea to just look at old monitors and conclude all future ones will be rubbish at low nits just because the old ones were. It’s too early to write off this monitor for HDR. 450 nits can be more than enough for good gaming HDR as long as the rest of the specs are good enough.


No, it's not a thread about projectors vs monitors, so don't mention them lol! THEY ARE A COMPLETELY DIFFERENT TECHNOLOGY lol!!

For a monitor to deliver satisfactory HDR, all it needs to do is achieve a 600 cd/m2 peak level with local dimming and you've got something that's approaching OK. This isn't going to be the best HDR experience possible of course, but it's probably satisfactory for most. At lower levels with no local dimming, it will not be. This is just a fact.

From the TFT Central article: "What is important though, without question, is the screens ability to improve the active contrast ratio. To do that, you NEED to have some kind of local dimming support from the backlight. Local dimming is vital to create any real improvement in the dynamic range of an image, and it is the support (or lack of in many cases) local dimming that is the main issue here."
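
To put some rough numbers on what that quote means (a minimal sketch with purely illustrative figures, not measurements from any specific panel), here's how dimming a backlight zone changes the effective contrast compared to a single global backlight:

```python
# Purely illustrative: effective contrast with and without zone dimming.
# Assumes a panel with ~1000:1 static contrast and a 450 nit peak backlight.

def black_level(backlight_nits, static_contrast=1000):
    """Black level (nits) of an LCD area given its backlight output."""
    return backlight_nits / static_contrast

peak = 450.0                           # backlight behind a bright highlight
global_black = black_level(peak)       # no local dimming: backlight at peak everywhere
zone_black = black_level(peak * 0.05)  # zone behind shadow detail dimmed to 5%

print(f"No local dimming : {peak / global_black:,.0f}:1")   # 1,000:1
print(f"Zone dimmed to 5%: {peak / zone_black:,.0f}:1")     # 20,000:1
```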

This panel is not using future tech... it's pretty standard tech that is well understood. It is not too early to write off this monitor for HDR at all, and I'll put money on it right now not offering anything special at all in this regard. There are no other specs it has (we know the specs by and large) that will change this.
 
Being a different technology doesn’t matter. The HDR element, both the problem with low nits and how they made it work, is the same for monitors and projectors. That’s why I brought it up when you said it was impossible, when clearly it’s not.

The reasons you gave for why this monitor cannot do HDR are the same reasons people shouted that projectors could never do HDR, and that was proven wrong. That’s why it’s too early to write off this screen. You cannot just say it’s 450 nits so it can never do HDR because old 400 nit screens couldn’t do HDR. It doesn’t work like that.

Perhaps it cannot do good HDR, but we cannot know that based on the current specs we have. A lot of it will come down to which G-SYNC module the 27GL850G-B and this panel have; if it’s the v2 module then it could benefit from HDR.

"At lower levels with no local dimming, it will not. This is just a fact."
You can still benefit from HDR even with no local dimming and lower levels.


HDR needs local dimming. The definition of HDR is High Dynamic Range, and I could go into great depth here about contrast ratios etc., but a projector's light source (whether that be a lamp or laser) projects light over the entirety of the tiny image chip(s). The only way to dim the light is to dim the entire image. This has inherent issues which should be obvious, and it's why NO projector, even the most expensive, can deliver true HDR. Yes, some will look "OK", but I GUARANTEE if you actually compared them to a monitor/TV (at the same size, so that you're not "wowed" by the added immersion), you would clearly see even a pretty average TV does a better job, and in many cases a projector would quite likely look better in SDR mode (many users and reviews report this, with HDR actually making a projected image look worse).

I was going to say more, but a5cent has summed it up mostly.

Your whole argument seems to hinge around projectors delivering HDR, so why can't a monitor? The point is THEY CAN'T!! NO PROJECTOR is doing this, not to the true extent of what HDR is. Someone may look at it and say "oh, that looks good", but it won't be true HDR, and it will be blown out of the water by what high end LCD and OLED TVs can achieve.

400 nit monitors can't achieve HDR. Never have, never will. You need to stop suggesting otherwise, and you need to re-read that article a few more times.
 
My argument was that the new projectors look far better in HDR than without HDR. The same could go for this monitor. It might not be as good as the high end LCD and OLED TVs, but that doesn’t mean it cannot look better due to HDR, even if it’s not what you would call full HDR. Despite what you say, 400/450 nits can achieve some level of HDR and look better than a non-HDR display, even if it’s not as good as the full high end displays.

Some do, some don't... depends on the model you're talking about and the environment they're being viewed in. They also often need a lot of tinkering with settings, and from reviews/user feedback I have read, work better with some content than others. It can be very hit and miss. Certainly no one is suggesting that HDR on projectors is an amazing feature, because it isn't and is not close to being properly implemented yet due to the limitations of the technology.

Yes, a low level HDR monitor will be able to display some HDR content, but in many cases (as reviews have demonstrated) it can make some games look WORSE when activated. In those games that have good implementation of HDR, it will look noticeably better on a VESA 600 or VESA 1000 monitor than it will on a VESA 400. If you're suggesting otherwise, then I don't know what to say. It's like saying you like eating beans out of the trash vs the tin. Well OK, if that's your thing, you go for it!


In my experience the high nit screens are worse for gaming up close to the screen anyway. I wouldn't want anything over 600 nits on a gaming monitor. 450 is about the sweet spot.

This is a very strange statement. How is a high nit screen worse?? It suggests to me you have not experienced proper HDR implementation, because done right, it is VERY impressive. On a 450 nit screen it's not even a halfway house and isn't actually true HDR. It won't be on a projector either for the reasons already mentioned.

As a5cent says, HDR content viewed on a DisplayHDR 400 monitor in SDR and HDR mode will show very little difference... certainly nothing that you'd go "wow, HDR is amazing!". You seem to fundamentally fail to understand how HDR functions and what it requires to function properly. This isn't a matter of opinion, it's a technical fact. 450 nits is CATEGORICALLY NOT the sweet spot lol!!
 
Take The Division 2's HDR: you cannot just get that in SDR mode by adjusting your settings. There is not just a slight difference, and the difference is not because the SDR calibration is off. Why don’t you try that game on a decent, properly calibrated HDR monitor before saying the difference is slight.

Someone needs to do a comparison with photos to demonstrate this. I have seen many complaints about the HDR implementation in The Division 2, but I have also seen people praise it using expensive high end 1000-nit TVs.

There is only one 1000-nit PC monitor at the moment (well, two, but they're the same 27" panel) which is £1800... have you used this? Because until you have, you can't really say this would result in unpleasantness and squinting eyes. In fact, I've never heard anyone say that before about this monitor; on the contrary, people generally love it. If your experience of 1000-nit is sat a few feet away from a 55" TV, then I'm not surprised you found it unpleasant lol! There are many people (including professional reviewers) who have highly praised this monitor's HDR capabilities in games that support it well, and I have not seen one person ever suggest that they'd prefer a cheaper inferior 400-nit monitor over this!! That would make about as much sense as saying they'd prefer 60Hz after playing at 144Hz!!

It sounds like you're saying that HDR, in its true INTENDED form on a PC monitor, is ineffective, and you personally prefer a significantly weakened and watered down version of it. Although as mentioned above, if you're drawing comparison to other HDR PC monitors, then you need to have made that judgement based on viewing one of the only two 'proper' HDR monitors currently available: the Asus PG27UQ or Acer X27. If you haven't, then there is no validity to your point on this matter. If you have used these monitors, well, it raises other questions. Neither of these monitors is a perfect implementation of true HDR by any means, but it's as close as any PC monitor offers at present. You cannot fairly compare the experience of using a high end HDR TV to a PC monitor, if for no other reason than the distance you sit from it.
 
A true HDR experience requires that the monitor is able to display brilliantly bright shades as well as deep dark shades. That's not something that any VESA DisplayHDR 400 level display will offer. However, Pottsey is absolutely correct that there is still some benefit to running HDR on a monitor that implements it well, even if it's DisplayHDR 400. HDR isn't just about the extremes (deep dark shades, bright light shades) but also about the accuracy of tone mapping in between. 10-bit colour support is part of this, even if it's achieved at the GPU level (as with some HDR solutions). An appropriate colour gamut (~DCI-P3) for HDR content also helps, although that's a separate issue really.
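
To make the tone mapping point a bit more concrete: HDR10 content encodes luminance with the SMPTE ST 2084 (PQ) curve, and the display then has to map those encoded values into whatever brightness range it actually has. A minimal sketch of that curve below (standard PQ constants; the 450-nit hard clip is just an assumed, simplistic example, not the behaviour of any particular monitor):

```python
# SMPTE ST 2084 (PQ) EOTF: converts a normalised HDR10 signal value in [0, 1]
# to absolute luminance in cd/m2 (nits). Constants are the standard PQ values.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(n: float) -> float:
    """Luminance in nits for a PQ-encoded signal value n."""
    p = n ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# A 10-bit code value of ~769/1023 encodes roughly 1000 nits of mastered content.
# A display limited to ~450 nits has to tone map or clip anything above that
# (a simple hard clip is used here purely as an assumed example).
mastered = pq_eotf(769 / 1023)
shown = min(mastered, 450.0)
print(f"mastered {mastered:.0f} nits -> displayed {shown:.0f} nits")
```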

I've seen some absolutely awful HDR implementations at this level which just give a flooded image and are frankly ugly to look at. But there are some models where enabling HDR really lifts the image up. It gives a realism to lighting and shadow details and a nuanced variety of shades that is always lacking in SDR, regardless of the graphics settings used. I'd advise reading or watching my recent reviews of the Acer XB273K and AOC AG273QCX if you want some examples of that.


From what I've seen, I'd certainly prefer a monitor with HDR 400 than nothing at all, although I would certainly not want to pay a high premium for it. It clearly is going to be a lesser experience than what a more highly specified monitor has with FALD etc. But as you say, we are always going to be at the mercy of content itself.

It's where he says 450-nits is the "sweetspot" that he starts going off the rails though. And that he found high-nit screens to be "unpleasant with squinting eyes"... which seems odd, unless he's been sitting 2-feet away from a 55" TV on 100% brightness. I saw an Acer X27 in action a while ago and certainly didn't come away with that impression. The only problem with that monitor is the price, which certainly does make you feel unpleasant and squint. :D
 
I am inclined to agree with @Pottsey here: if the HDR is implemented the correct way in a game, and the monitor has full support for HDR400 (all the boxes ticked for what makes HDR400), I can see the benefit of having HDR400, if it's done right, compared to a monitor that doesn't have this support.
I can also see that higher levels of HDR (600-1000) could be very unpleasant for your eyes, but I haven't tried it yet, so can't really say.

But Legend and a5cent are also correct that HDR400 can make the game look even worse if implemented wrong.

We don't know yet what HDR classification we're going to get for this monitor/panel; it's either going to be HDR400 or HDR600. I would think that the G-version will get HDR400 and the F-version is going to get HDR600, if LG releases it.


The bottom line is that bad HDR will look bad regardless lol! It won't really matter what monitor you have. The only saving grace HDR 400 might have is that it could make that bad implementation less obvious (simply because you are looking at a diluted HDR experience), but it's not true to how great HDR can look... and in those situations when it IS done right, you want a 600-1000 nit panel, and ideally FALD.

Assuming you have properly functioning vision, and don't literally sit in darkened room all day, HDR 600-1000 is not unpleasant for your eyes if done correctly... it's FAR SUPERIOR if done right, especially with FALD. There isn't any doubt about this, in respect to achieving a true HDR experience. Will there be moments where you might be surprised at the brightness of an explosion, a flash of light, a sunrise etc? Sure... but THAT'S THE POINT! It won't blind you, and you won't need to visit the optician the next day, but it IS supposed to have an impact. It's like sound... you wouldn't suggest that someone use inferior speakers/headphones such that loud noises didn't give the gamer a bit of a fright or shock at appropriate moments. They aren't going to blow their eardrums out, but they should feel the visceral impact of sound when it's necessary and serves that given moment in the gaming experience. This is the same function HDR serves.

With HDR 400, if done right, then sure, it's better than nothing at all. But ultimately, if the game has excellent HDR implementation, you want the best HDR monitor possible, and that will NEVER be an HDR 400 one!

:)
 
To be certified under the VESA HDR400 spec a display only requires the following: 400 cd/m2 peak brightness, 0.40 cd/m2 black (therefore creating a 1000:1 contrast ratio), 95% BT.709 colour space (i.e. 95% sRGB), 10-bit image processing, and 8-bit panel colour depth. In real terms the only required difference beyond most normal displays is the 400 cd/m2 peak brightness. Most normal screens of 27" and above will offer 8-bit colour depth (including many TN Film panels nowadays), and all will offer at least 95% sRGB gamut as well. So the requirements of HDR400 are very lax. I'm not talking here about displays where manufacturers go above those requirements and have extended gamut backlights/coatings, 10-bit panels etc. I'm simply saying that the requirements for HDR400 are so loose that they are open to a lot of abuse and misleading marketing. You can quite easily have a display with all those "requirements" certified as HDR400 and offering no benefits beyond a normal screen without the badge.
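
As a quick way of illustrating how lax that is, here's a small sketch (my own hypothetical figures for a "typical" 27" SDR-class panel, not real measurements) that checks a spec sheet against the minimums listed above; only peak brightness falls short:

```python
# The DisplayHDR 400 minimums as listed above, plus a hypothetical spec sheet
# for an ordinary 27" SDR-class panel (made-up but representative numbers).
HDR400_MINIMUMS = {
    "peak_brightness_nits": 400,
    "black_level_nits_max": 0.40,   # i.e. 1000:1 contrast at peak white
    "srgb_coverage_pct": 95,        # 95% BT.709 / sRGB
    "panel_bit_depth": 8,           # 10-bit processing, but an 8-bit panel is allowed
}

typical_sdr_panel = {
    "peak_brightness_nits": 350,
    "black_level_nits_max": 0.35,
    "srgb_coverage_pct": 99,
    "panel_bit_depth": 8,
}

def hdr400_shortfalls(panel):
    """Return which of the listed HDR400 minimums a panel fails to meet."""
    missed = []
    if panel["peak_brightness_nits"] < HDR400_MINIMUMS["peak_brightness_nits"]:
        missed.append("peak brightness")
    if panel["black_level_nits_max"] > HDR400_MINIMUMS["black_level_nits_max"]:
        missed.append("black level")
    if panel["srgb_coverage_pct"] < HDR400_MINIMUMS["srgb_coverage_pct"]:
        missed.append("gamut")
    if panel["panel_bit_depth"] < HDR400_MINIMUMS["panel_bit_depth"]:
        missed.append("bit depth")
    return missed

print(hdr400_shortfalls(typical_sdr_panel))   # ['peak brightness'] -- the only miss
```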

I'm not saying that an HDR400 certified display can't be better than a non-HDR display, but it has nothing to do with the badge or certification; that's the point. Since the certification has no requirements for colour depth, gamut or contrast beyond a normal SDR display, anything which may or may not be added by the manufacturer is entirely independent and separate. They could just as easily add those features to a normal display and not bother with the HDR400 badge.

In fact, to play devil's advocate for a moment, you could potentially have a non-HDR certified display which is much better for viewing HDR content than an HDR400 certified display. You could have an HDR400 display with only sRGB gamut, 8-bit colour depth etc. but still have the badge, and then a screen which doesn't carry the badge but where the manufacturer has used a wide gamut backlight/coating or a 10-bit panel. The latter would provide benefits for HDR content when it comes to colour rendering and appearance.


Very informative post overall, but I think the point above is one of the biggest problems with HDR 400, if not THE problem... it's essentially meaningless as a classification, potentially deeply misleading and ultimately has ZERO bearing on the quality of the HDR you will experience from a monitor. The fact that any such monitor could be trounced by a non-HDR variant speaks volumes lol! So while you MAY see better visuals on an HDR 400 monitor over a non-HDR one, the classification does not in any way guarantee this. That is genuinely ridiculous, and it's why, as your article rightly stated, HDR 400 needs to go. Or perhaps changed such that it adheres to stricter specifications, but I don't know if that would be viable or even change things very much.

I hope Pottsey understands now lol! :D
 
That is the minimum; it does not mean an HDR400 panel is running that low. I am not interested in arguing over whether the HDR400 spec is too low or not. What matters is whether this panel will be any good for HDR and how far past the minimum it runs, and we know in many areas it will be past the minimum HDR400 spec. But some of the key aspects are unknown.


I don't understand why you aren't interested in the HDR 400 spec?! This goes to the VERY HEART of the issue lol! As pointed out above, the bar for entry into the 'HDR 400 club' is so ridiculously low that any monitor you buy with this spec isn't guaranteed to give you anything. Just being able to turn on HDR functionality on such a monitor doesn't ensure you a particular visual experience, precisely because the spec it must adhere to in order to achieve this standard is so low!

We have a pretty good idea what the HDR capability of the LG 38GL950G will be. There seems to be a lot of wishful thinking on your part here, but we can quite accurately deduce from the released info that it's going to be average. At 450 cd/m2 brightness this is obvious. There is zero indication it has FALD, and while I am quite sure it will be a good quality panel that offers something over a non-HDR variant, the price point it comes in at (which won't be cheap) will probably raise questions as to the value of that HDR experience. Of course, given there are no other 38" monitors on the horizon offering anything better, and the other solid specs this monitor offers, it may all be a moot point. I am confident this will be a great monitor, providing there are no serious bleed/glow issues... which is going to come down to the typical panel lottery.
 
Let’s say this monitor does meet the entire requirement for HDR400 but has an old G-SYNC module, so it is unable to turn on HDR and therefore isn't HDR certified. That would make it a non-HDR certified display which would technically be much better at HDR than the lowest end HDR400 certified displays. The question is: would you be able to enable HDR in games, or would the option remain grayed out because it’s not listed as HDR, even though it meets all the minimum HDR400 specs?


It won't have the old G-Sync module; that's impossible, as it wouldn't have the bandwidth needed to drive this panel. There is a question as to whether it will have the fan that the G-Sync v2 module has had in the Acer/Asus monitors, so that remains to be seen.

Bottom line, this monitor will almost certainly be HDR 400 certified and given the panel's characteristics (which we know), then it will be OK. Nothing special, nothing WOW, but just OK. This is going to be a VERY expensive monitor though, so that needs to be considered in all of this.
 
The XB273K has the full expensive FPGA DP 1.4 G-Sync module and it's selling for only $1300. No reason this LG wouldn't have the same thing.


Not really comparable though. That's a traditional 16:9 27" panel. This is 38" 21:9 lol! The 34" LG ultrawide is £1150, which uses the older G-Sync module. There is no way the 38GL950G is coming in more than a penny under £1500, but I won't be surprised if it's closer to £2K.
 
That is not at all what I was saying. My point was: if an "only" $1,300 monitor has the full expensive FPGA DP 1.4 G-Sync module, there would be no issue with even more expensive monitors using it. Such as this LG.


There is no question that it will be using it... assuming it's real G-Sync and not 'G-Sync compatible' (i.e Freesync). It would be impossible for it to use the v1 module as this panel exceeds the bandwidth that it is capable of.
 
Once again, not what I was saying. I never mentioned anything about the old v1 G-Sync module. The discussion was about whether there was a different DP 1.4 G-Sync module that doesn't require HDR and HDR1000 (there isn't), or whether all DP 1.4 G-Sync monitors have to be HDR1000 (they don't). The G-Sync module that controls G-Sync displays requiring DP 1.4 bandwidth is used in all of said displays, regardless of their backlight method or HDR standard.


Erm, I don't think there was ever a discussion about there being a different module? There's the original v1 module, and the v2, which as you say is the only other one used... albeit in just a few monitors at present. So as per your original post, there is indeed no reason why the 38GL950G won't be using it (unless it ends up Freesync), because it has to use it.
 
QC is a lottery... do a wide enough poll for any manufacturer and you'll find a pretty even distribution of horrendous faults that should never have left the factory, to near flawless examples that we all dream of.
 
What about the professional range of monitors built by EIZO or NEC? Also lottery? If not, that would suggest it's only a lottery because monitor OEMs have decided that's how it will be.

Well no, but they are in a class of their own, some being north of £4K, so I'd expect a slightly better QC system in place there. I meant more general consumer gaming/work monitors, not dedicated professional ones.
 
I have no idea if that's true, but if it is, that means these companies can reliably weed out sub-par monitors if they want to. For their consumer products, they just choose not to and ship regardless.

In contrast to the narrative put forward in all of these forums, that would mean none of this is actually about poor QA, which typically means that some products slip through their checks and unintentionally end up being shipped. Rather, their QA is working exactly as intended, i.e. they are shipping exactly what they intend to ship... panels with unacceptable levels of backlight uniformity, BLB and all. That's not a QA issue. That's a quality policy issue.

If all of that's true, then that would severely limit how much any one company's quality policy may differ. Any company that unilaterally weeds out a lot more of their products through QA, puts themselves at a huge disadvantage, because the price of the units they can now no longer sell must be factored into the price of those they can. That's probably not something any company can afford to do in the price sensitive consumer market.

The only companies that could influence much in this area are those that are vertically integrated. Given the choice, those companies would likely opt to maximize profits over maximizing quality.

I don't know if any of this is true... I'm just thinking out loud.


It's a combination of these two things I believe. It makes sense they'll have a set standard for passing monitors, with a list of acceptable issues before it would be rejected for sale, but even within that, things will get missed. The weak link is always going to be the human at the end of that chain, checking over the screen and making sure it passes. You'd expect them to catch the obvious stuff, 100 dead pixels, scratches etc. but with the small details, it's inevitable some bad examples will slip through, not least because of the sheer number of screens they're churning out, time pressures etc. I don't see a foolproof system ever existing, unless you perhaps have advanced AI robots checking them... even then, I'm sure some would get missed.

With the top end professional grade monitors, these are typically going to be subject to more rigorous review, I think precisely because they know their customer is more savvy and won't accept the slightest fault on a £4000 monitor. They will produce fewer of these screens also, with a heavier emphasis on ensuring quality through all stages of that process, and more time to do it. Consumer grade monitors simply will not get the same 5-star treatment.
 
Many monitors and TVs do have a final visual inspection, in a turned-on state. That's probably the extent of any 'testing of electronics'. There will be automation as well, but there is still likely to be a human element involved. That may just be a cursory glance though, hardly an analysis of every pixel of the screen. Do they test EVERY single screen? I don't know, quite possibly not. It might just be a random selection from any given batch.

Like I say, there will undoubtedly be an acceptable level of 'fault'. Indeed, many manufacturers have the policy that under a set number of pixels, you are not eligible for a return. Some manufacturers do have zero-pixel policies though. We'll never really know the full ins and outs of what panel and monitor manufacturers do regards this at their end.

They will count on most consumers not demanding perfection... which, if they had to adhere to it, would increase their production costs massively. They will accept the % of returns based on whatever cost metrics they've analysed and determined to be the most profitable... realising that absorbing the cost of returns won't exceed the cost of adhering to a far better QA process. Frustrating for us of course, but they're always going to be looking at their bottom lines first.
 
if I had to pick one, then yes

BUT

there is the potential there will be both

I have dropped my HQ contact an email asking him to clarify - once I get a response from him I will pass on the info I can


Interesting. I do wonder how convincing the argument would be for G-Sync over Freesync though... given the latter would in theory play perfectly well with an Nvidia GPU. And also in light of the price hike a full fat G-Sync version would get. I'm sure some die hards would be all over it, but the majority would surely plump for Freesync if it works just as well? I'm a bit hazy on the specific technical variations that might be experienced between the two though, and if there is indeed an argument either way that isn't based solely on price.
 
Hi All,

right

38GL950G - this will be a dedicated G-Sync model - we are expecting roughly September 2019 for launch here in the UK

there will be a Freesync/G-Sync Compatible model coming as well, but this we won't see till 2020

Currently I do not have access to any specs of either model - but I have asked if I can have some that I can release to you all so once I hear back I will advise

It would be good to know where it will land as regards HDR. I know early leaks suggested HDR 1000, then corrected to HDR 600 (but no confirmation), but you said yourself a while back that you didn't think, in your opinion, that it would have HDR. Certainly if it's anything below HDR 600 then it might as well not bother.
 
I think I remember Nvidia saying they only use this module if it is 1000?

As Baddass states, the Acer XB273K demonstrates HDR-1000 isn't a requirement for this module. And besides, the v1 G-Sync module wouldn't have the required bandwidth for this panel, so it has to be using the v2 (if it is indeed going to be full fat G-Sync).


Seems to be different levels of HDR400 interpretation too; my LG F absolutely buries the Aorus I bought to have a play with, yet they both still rate 400 (tbf though I think the LG is 450).

The problem is that it's such a low barrier for entry, and is largely meaningless. Yes, you could have a fairly decent monitor that has the HDR 400 standard, vs one that really doesn't... but the point is that it won't be adherence to the HDR 400 standard that makes the better monitor what it is... it will be other specifications that go beyond what HDR 400 requires. It's just a pointless standard and shouldn't even exist in its current state.
 