LG 38GL950G - 3840x1600/G-Sync/144Hz

Soldato
Joined
29 May 2006
Posts
5,353
Maybe there are some situations where an HDR400 display can look a bit better if it's implemented properly and works well. The key point though is that it's NOT HDR. There is no increase in the dynamic range/contrast (which, after all, is what "high dynamic range" is all about) without local dimming, and HDR400 displays don't require local dimming; I've never seen one that includes it.

So any possible improvements in perceived picture quality are not because there's any better dynamic range. They could come from slight improvements in peak brightness, potentially boosted colours (if a separate wider-gamut backlight option has been added), or might simply be because the screen's HDR preset mode is set up to look sharper, more vivid or brighter. The latter is likely very common, much like some game and movie preset modes. Anyway, the point is that there is nothing creating a higher dynamic range on those models, so it's in no way HDR.
But it is HDR, so why are you saying it is not? It can play HDR content and look better while doing it, and the minimum spec of HDR400 is much higher than the typical average monitor, meaning you get a higher dynamic range than non-HDR monitors. On average there is an increase in dynamic range/contrast and a wider colour gamut, along with the other improvements in the minimum spec like no fake 8-bit.

It is far better to have a monitor that hits or exceeds the HDR400 spec but falls short of HDR600 than to have a non-HDR monitor.

Saying it's not HDR is like saying the GeForce 2070 is not a graphics card because we have the GeForce 2080 Ti. Yes, HDR1000 looks better, but that doesn't mean the HDR400 spec doesn't do anything. You just need to turn HDR on and off in The Division 2 on an HDR400 display to see that HDR does make a difference. All of you saying it is not HDR have clearly not bothered playing around with it for a few hours.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,568
Location
UK
But it is HDR, so why are you saying it is not? It can play HDR content and look better while doing it, and the minimum spec of HDR400 is much higher than the typical average monitor, meaning you get a higher dynamic range than non-HDR monitors. On average there is an increase in dynamic range/contrast and a wider colour gamut, along with the other improvements in the minimum spec like no fake 8-bit.

It is far better to have a monitor that hits or exceeds the HDR400 spec but falls short of HDR600 than to have a non-HDR monitor.

Saying it's not HDR is like saying the GeForce 2070 is not a graphics card because we have the GeForce 2080 Ti. Yes, HDR1000 looks better, but that doesn't mean the HDR400 spec doesn't do anything. You just need to turn HDR on and off in The Division 2 on an HDR400 display to see that HDR does make a difference. All of you saying it is not HDR have clearly not bothered playing around with it for a few hours.

I don't think you really understand what HDR is, but I'll try and help explain as best I can. HDR is all about creating a higher dynamic range, that being a higher contrast between whites and blacks on an image. That is the absolute fundamental definition of HDR. Strictly speaking, HDR actually has nothing to do with colour space/gamut, colour depth, resolution or anything else; those are just additional enhancements that have been bundled into the general consumer term "HDR". When HDR is used nowadays it often includes certain requirements for those other areas, but those are additional measures added to the most common certification schemes, introduced by bodies like the Ultra HD Alliance and VESA to try and bring some kind of order to the HDR market. If you go back to the basics of what HDR is all about, though, it is about improving the contrast ratio on the screen at any one point in time.

On an LCD monitor the only way to achieve a higher contrast ratio than the panel is capable of natively is through the use of local dimming. That involves dimming some parts (or "zones") of the backlight where the content is darker, and raising the brightness of other zones. That creates an improvement in the "active contrast ratio" on the screen and creates the actual HDR effect: an improvement in the dynamic range of the content you are viewing. If you don't have local dimming to actually improve the dynamic range, you don't have an HDR screen.
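To put some purely illustrative numbers on that, here's a minimal sketch of how zone dimming raises the on-screen contrast. The panel figures are assumptions for the sake of the example, not the specs of any particular monitor:

```python
# Illustrative only: active (on-screen) contrast with and without local
# dimming. Panel figures are assumed example values, not a real monitor.

NATIVE_CONTRAST = 1000  # native panel contrast ratio (typical IPS)
PEAK_NITS = 400         # white luminance with the backlight at 100%

def active_contrast(dark_zone_backlight_pct):
    """Contrast between a full-brightness white zone and a dark zone
    whose backlight has been dimmed to the given percentage."""
    white = PEAK_NITS  # bright zone keeps its backlight at 100%
    black = (PEAK_NITS / NATIVE_CONTRAST) * dark_zone_backlight_pct / 100
    return white / black

# Single-zone (global) backlight: dark areas get 100% backlight too,
# so on-screen contrast is just the panel's native ratio.
print(active_contrast(100))  # -> 1000.0

# Local dimming: drop the backlight to 10% behind the dark zones and
# the active contrast jumps tenfold - that is the actual HDR effect.
print(active_contrast(10))   # -> 10000.0
```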

You can add other enhancements like an extended colour gamut, 10-bit colour depth, improved resolution etc., which will enhance the look of the image, but it won't offer you true HDR. There's no improvement to the dynamic range, which is absolutely key.

The issue with VESA DisplayHDR 400 certification is that its spec states that you don't need local dimming at all! I've yet to see an HDR400 screen where the manufacturer has bothered to include local dimming (why would they?), and so these screens cannot, by definition, produce an HDR image! It's impossible to improve the screen's contrast ratio beyond the panel's native capabilities when the backlight is shone uniformly through the panel at all times.

If we are being even more critical of the VESA DisplayHDR 400 certification (which I think is totally justified), it doesn't even require any improvement to colour gamut, has a crap peak brightness of only 400 cd/m2 (barely beyond a normal display anyway) and only needs 8-bit colour depth. A screen can "earn" this badge and have absolutely nothing even remotely close to HDR: it will never have any improvement to dynamic range, so shouldn't be called HDR at all without local dimming, and it may offer no improvement in anything else either.

You will likely find some screens with the HDR400 badge where the manufacturer has at least added a wider gamut backlight and maybe even 10-bit colour depth support, and those could therefore be better suited to HDR content than a traditional standard gamut/8-bit screen. However, the same could just as easily be achieved by adding those to a normal SDR screen. Even if they've done that, it's not producing a high dynamic range, and so it's still not HDR.

As I said before, turning "HDR mode" on or inputting an HDR source to an HDR400 monitor might make the image look better, or at least different. That has nothing to do with it offering an improved dynamic range though, as it can't achieve that! Other image enhancements might come into play, like moving to the wider gamut mode, increasing the brightness, or artificially boosting the vividness or sharpness of the image. That can certainly all make the image look better, but it's not much more than a glorified preset mode. Again, there's no improvement to the contrast ratio/dynamic range being displayed.

For further reading I'd recommend this HDR article and also the previously linked piece about why VESA HDR400 is bad.

I hope that helps a bit.
 
Soldato
Joined
29 May 2006
Posts
5,353
“I don't think you really understand what HDR is, but I'll try and help explain as best I can. HDR is all about creating a higher dynamic range, that being a higher contrast between whites and blacks on an image.”
I am getting a bit fed up with people saying that. I do know what HDR stands for, and the HDR400 spec increases the dynamic range by a noticeable amount over a typical average non-HDR monitor. The HDR400 spec has minimum colour, contrast, black level etc. requirements so that you get a decent HDR effect beyond what a typical non-HDR400 monitor can do. So while it's not as good as HDR1000, it is better than no HDR. The black levels, peak brightness and contrast alone for the HDR400 spec are much better than typical monitors, so when it's correctly implemented you get an improvement in HDR.




“The issue with VESA DisplayHDR 400 certification is that its spec states that you don't need local dimming at all! I've yet to see an HDR400 screen where the manufacturer has bothered to include local dimming (why would they?), and so these screens cannot, by definition, produce an HDR image!”
That is wrong. You do not need local dimming, although it is desirable. There are ways to improve contrast up to a point without local dimming, and HDR400 has better contrast, better black levels and better peak brightness than typical non-HDR displays. All three of those are key to HDR and improve it. How can you say better contrast, better black levels and better peak brightness have no impact on HDR just because there is no local dimming? I fully agree local dimming is desirable and offers a much improved image, but that doesn't mean that without it, it is not HDR.

Monitors that meet the HDR400 spec will for the most part be far better than those that do not. Yes, I do know some cheap low-end monitors have a poor implementation, but that doesn't mean all HDR400 monitors are rubbish.



“and only needs 8-bit colour depth”
Perhaps you missed the point. Not just 8-bit but true 8-bit, to get rid of all those fake 8-bit displays, and just because the minimum is 8-bit does not mean an HDR400 display is only 8-bit. A display can be HDR400 with 10-bit colour.

You also need to factor in that just because a display is only HDR400 spec, it doesn't mean it's at the bare minimum. Take the LG 38GL950G/27GL850G-B: what little we have on specs suggests it's past the HDR400 spec but short of the HDR600 spec. It's past the minimum 400 nits for HDR400 and looks like it will be 10-bit colour; these panels are not sitting at the bare minimum of HDR400. In theory you can be at 590 nits with local dimming and still only be classed as HDR400. You need to look at the full specs before writing off an HDR400 display.

Anyway, it's not like there are any suitable alternatives that I can find to the LG 38GL950G/27GL850G-B. I am not aware of any HDR600+ screens that meet all my requirements or match the 27GL850G-B in other specs.


“As I said before, turning "HDR mode" on or inputting an HDR source to an HDR400 monitor might make the image look better, or at least different. That has nothing to do with it offering an improved dynamic range though, as it can't achieve that!”
Yes it can, and I suggest you go and read up on it all again and test on a real monitor in a decent HDR game like The Division 2.

EDIT: I read that link, and a lot of what they do not like about the HDR400 spec is not a problem in relation to the panels this thread is about. They prefer HDR600+ because of things like 10-bit support and colour gamut requirements boosted to 90%+ DCI-P3 coverage, all of which this panel will support even if it's at HDR400. The unknown element is the full contrast ratio. What if that is beyond the minimum HDR400 spec, just like the other areas that are beyond the minimum? What happens if this panel has the local dimming part of the HDR600 spec but is still only HDR400? This panel looks to be in many ways close to or matching the HDR600 spec but just falls short in one or two areas.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,568
Location
UK
I do know what HDR stands for, and the HDR400 spec increases the dynamic range by a noticeable amount over a typical average non-HDR monitor... There are ways to improve contrast up to a point without local dimming

How are you suggesting this is achieved? They are using a standard single-zone backlight unit shining through a normal LCD panel. It's just not possible to improve the dynamic range (contrast ratio) beyond the panel's native capability.

The HDR400 spec has minimum colour, contrast, black level etc. requirements

To be certified under the VESA HDR400 spec a display only requires the following: 400 cd/m2 peak brightness, 0.40 cd/m2 black (therefore creating a 1000:1 contrast ratio), 95% BT.709 colour space (i.e. 95% sRGB), 10-bit image processing, 8-bit panel colour depth. In real terms the only required difference beyond most normal displays there is the 400 cd/m2 peak brightness. Most normal screens of 27" and above will offer 8-bit colour depth (including many TN Film panels nowadays), and all will offer at least 95% sRGB gamut as well. So the requirements of HDR400 are very lax. I'm not talking here about displays where manufacturers go above those requirements with extended gamut backlights/coatings, 10-bit panels etc.; I'm simply saying that the requirements for HDR400 are so loose that they are open to a lot of abuse and misleading marketing. You can quite easily have a display with all those "requirements" certified as HDR400 that offers no benefits beyond a normal screen without the badge.
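Just to make the arithmetic explicit, here's a trivial sketch using the spec minimums listed above:

```python
# Contrast ratio implied by the HDR400 minimums quoted above
hdr400_peak = 400    # cd/m2 minimum peak brightness
hdr400_black = 0.40  # cd/m2 maximum black level

print(f"{hdr400_peak / hdr400_black:.0f}:1")  # -> 1000:1

# ...which is no better than a typical SDR IPS panel (1000-1200:1 in
# practice) and well below a typical SDR VA panel (2000-3000:1).
```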

So while it's not as good as HDR1000, it is better than no HDR

I'm not saying that an HDR400 certified display can't be better than a non-HDR display, but that has nothing to do with the badge or certification; that's the point. Since the certification has no requirements for colour depth, gamut or contrast beyond a normal SDR display, anything which may or may not be added by the manufacturer is entirely independent and separate. They could just as easily add those features to a normal display and not bother with the HDR400 badge.

In fact, to play devil's advocate for a moment, you could potentially have a non-HDR certified display which is much better for viewing HDR content than an HDR400 certified display. You could have an HDR400 display with only sRGB gamut, 8-bit colour depth etc. but still have the badge, then a screen which doesn't carry the badge but where the manufacturer has used a wide gamut backlight/coating or a 10-bit panel. The latter would provide benefits for HDR content when it comes to colour rendering and appearance.

HDR400 has better contrast, better black levels and better peak brightness than typical non-HDR displays

No it doesn't; the only thing it guarantees is at least 400 cd/m2, which admittedly is a bit higher than most normal displays, although many of those are typically 350 cd/m2.

I fully agree local dimming is desirable and offers a much improved image, but that doesn't mean that without it, it is not HDR.

It's not HDR simply because the dynamic range (the contrast ratio) is not being improved. That's the crux of this debate.

Perhaps you missed the point. Not just 8-bit but true 8-bit, to get rid of all those fake 8-bit displays, and just because the minimum is 8-bit does not mean an HDR400 display is only 8-bit.

At the smaller end of the monitor market, from 19 to 24", some 6-bit+FRC panels are still used, including for some IPS panel options. Anything above that size, across all technologies including TN Film, is almost entirely 8-bit now, if not 8-bit+FRC. It's a moot point really.

You also need to factor in that just because a display is only HDR400 spec, it doesn't mean it's at the bare minimum. Take the LG 38GL950G/27GL850G-B: what little we have on specs suggests it's past the HDR400 spec but short of the HDR600 spec. It's past the minimum 400 nits for HDR400 and looks like it will be 10-bit colour; these panels are not sitting at the bare minimum of HDR400. In theory you can be at 590 nits with local dimming and still only be classed as HDR400. You need to look at the full specs before writing off an HDR400 display.

I agree it's possible to have an HDR400 display which actually delivers some HDR benefits (in terms of contrast ratio specifically), but I've yet to see an HDR400 display use any form of local dimming to actually make this happen. My issue with HDR400 is that it doesn't require any local dimming, and so manufacturers don't bother with it as it's not needed.

EDIT: I read that link, and a lot of what they do not like about the HDR400 spec is not a problem in relation to the panels this thread is about. They prefer HDR600+ because of things like 10-bit support and colour gamut requirements boosted to 90%+ DCI-P3 coverage, all of which this panel will support even if it's at HDR400

For transparency, that is my content I linked to. I agree this LG 38GL950G will deliver some benefits in what is commonly referred to as "HDR" in the consumer market, thanks to its improved colour space and colour depth. We don't yet have confirmation as to whether the screen will have local dimming of any sort, and so while it might deliver improved colours, if it can't actually deliver an improved dynamic range (an improved contrast ratio) then it isn't an HDR display, and again, in my opinion, shouldn't be labelled as one at all.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,568
Location
UK
Oh, boy. It's good nobody recalls DisplayHDR 500.

DisplayHDR 500 is just the same as 600 but with a lower peak brightness requirement, so it's not too bad overall as a certification scheme. It still requires local dimming, 10-bit colour depth and a DCI-P3 colour gamut, so it's a big step up from HDR400, which doesn't require any of those :)
 
Associate
Joined
24 Aug 2018
Posts
61
DisplayHDR 500 is just the same as 600 but with a lower peak brightness requirement, so it's not too bad overall as a certification scheme. It still requires local dimming, 10-bit colour depth and a DCI-P3 colour gamut, so it's a big step up from HDR400, which doesn't require any of those :)

How big an area can an independently controlled square backlight zone in a FALD system be and still retain a meaningful HDR effect?

Also, there are a handful of edge-lit DisplayHDR 600 monitors on the market now. You have even reviewed some of them. Have you had any meaningful HDR experience with edge-lit systems?
 
Man of Honour
Joined
12 Jan 2003
Posts
20,568
Location
UK
How big an area can an independently controlled square backlight zone in a FALD system be and still retain a meaningful HDR effect?

Also, there are a handful of edge-lit DisplayHDR 600 monitors on the market now. You have even reviewed some of them. Have you had any meaningful HDR experience with edge-lit systems?

Even some edge-lit local dimming screens with smaller numbers of zones can produce some improvements in active contrast ratio and actually improve the dynamic range. It can vary, and there isn't really any certification or criteria that dictates the success of those techniques currently. HDR500, 600 and 1000 do at least dictate that you need local dimming, which is a necessity for increasing the dynamic range/contrast. I've seen some edge-lit local dimming that works quite well (e.g. the Philips 43" Momentum). Generally the more zones the better, so FALD should in theory be better than edge-lit dimming zones, mini LED should be better than FALD, and future Micro LED better still. It can still be a bit hit and miss, as just because a screen is FALD doesn't mean it's necessarily working well; for instance, the Dell UP2718Q had FALD but it was very slow to respond, making it not very good for gaming.
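On the zone size question, there's no fixed threshold I can point to, but some rough arithmetic shows why more zones help. The screen size and zone counts below are assumptions purely for illustration:

```python
# Rough illustration of dimming-zone sizes; screen dimensions and zone
# counts are assumed for the example, not specs of any monitor here.
import math

def zone_side_mm(width_mm, height_mm, zones):
    """Approximate side length of one square zone in a uniform grid."""
    return math.sqrt((width_mm * height_mm) / zones)

# A 27" 16:9 screen has roughly 598 x 336 mm of active area.
for zones in (8, 16, 384, 1152):
    print(f"{zones:5d} zones -> ~{zone_side_mm(598, 336, zones):.0f} mm per zone")
# 8 edge-lit strips work out to ~158 mm of area each, a 384-zone FALD
# grid to ~23 mm, a 1152-zone mini LED grid to ~13 mm. Smaller zones
# mean finer control and less blooming around small bright objects.
```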

Future Micro LED, or OLED, would provide the most fine-grained control, at a pixel level, and so is probably the best method for HDR. That helps avoid blooming, for instance, although with OLED at least you have a more limited max brightness to contend with.
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
To be certified under the VESA HDR400 spec a display only requires the following: 400 cd/m2 peak brightness, 0.40 cd/m2 black (therefore creating a 1000:1 contrast ratio), 95% BT.709 colour space (i.e. 95% sRGB), 10-bit image processing, 8-bit panel colour depth. In real terms the only required difference beyond most normal displays there is the 400 cd/m2 peak brightness. Most normal screens of 27" and above will offer 8-bit colour depth (including many TN Film panels nowadays), and all will offer at least 95% sRGB gamut as well. So the requirements of HDR400 are very lax. I'm not talking here about displays where manufacturers go above those requirements with extended gamut backlights/coatings, 10-bit panels etc.; I'm simply saying that the requirements for HDR400 are so loose that they are open to a lot of abuse and misleading marketing. You can quite easily have a display with all those "requirements" certified as HDR400 that offers no benefits beyond a normal screen without the badge.

I'm not saying that an HDR400 certified display can't be better than a non-HDR display, but that has nothing to do with the badge or certification; that's the point. Since the certification has no requirements for colour depth, gamut or contrast beyond a normal SDR display, anything which may or may not be added by the manufacturer is entirely independent and separate. They could just as easily add those features to a normal display and not bother with the HDR400 badge.

In fact, to play devil's advocate for a moment, you could potentially have a non-HDR certified display which is much better for viewing HDR content than an HDR400 certified display. You could have an HDR400 display with only sRGB gamut, 8-bit colour depth etc. but still have the badge, then a screen which doesn't carry the badge but where the manufacturer has used a wide gamut backlight/coating or a 10-bit panel. The latter would provide benefits for HDR content when it comes to colour rendering and appearance.


Very informative post overall, but I think the point above is one of the biggest problems with HDR 400, if not THE problem... it's essentially meaningless as a classification, potentially deeply misleading and ultimately has ZERO bearing on the quality of the HDR you will experience from a monitor. The fact that any such monitor could be trounced by a non-HDR variant speaks volumes lol! So while you MAY see better visuals on an HDR 400 monitor over a non-HDR one, the classification does not in any way guarantee this. That is genuinely ridiculous, and it's why, as your article rightly stated, HDR 400 needs to go. Or perhaps it should be changed so that it adheres to stricter specifications, but I don't know if that would be viable or even change things very much.

I hope Pottsey understands now lol! :D
 
Soldato
Joined
29 May 2006
Posts
5,353
“400 cd/m2 peak brightness, 0.40 cd/m2 black (therefore creating a 1000:1 contrast ratio), 95% BT.709 colour space (i.e. 95% sRGB), 10-bit image processing, 8-bit panel colour depth.”
That is the minimum; it does not mean an HDR400 panel is running that low. I am not interested in arguing over whether the HDR400 spec is too low or not. What matters is whether this panel will be any good for HDR and how far past the minimum it runs, and we know in many areas it will be past the minimum HDR400 spec. But some of the key aspects are unknown.



“You can quite easily have a display with all those "requirements" certified as HDR400 that offers no benefits beyond a normal screen without the badge.”
That's not correct, to my understanding. Even if you have a display with all those specs, if it's not labelled as HDR/doesn't meet all the minimum specs, I don't believe you can turn on HDR and map to the higher peak nits/higher dynamic range. You are stuck at SDR levels even though technically you could run HDR. To use HDR's wider range of settings you have to have a monitor that is seen as HDR.

EDIT: I think there is a small range below HDR400 that is still seen as HDR by Windows. So I guess technically you could be at the HDR400 spec, but that just means you are hitting HDR400 on a device that isn't officially tested yet meets all the requirements.



“In fact, to play devil's advocate for a moment, you could potentially have a non-HDR certified display which is much better for viewing HDR content than an HDR400 certified display.”
Technically correct, but how would you turn on HDR content and map the wider HDR range? As far as I can tell you would be stuck at SDR levels even though your specs could run HDR. EDIT: I realize after typing this that if you are not HDR400 certified but meet all the HDR specs, you should be able to turn on HDR and see the same results as HDR400. EDIT end.

So even though that screen should be as good as HDR400, only the HDR400 screen would be able to turn on and run the HDR game/film. If there is a way to override Windows, please tell me. I have an old WCG monitor that I use for photo editing, but Windows 10 doesn't recognize it as WCG while Windows 7/8 does. The monitor I am in front of at the moment all but meets the very minimum HDR spec, but I see no way to turn on HDR in game with this screen, as the screen itself predates the official HDR spec, even though technically it should gain a small benefit if I could turn on HDR.




“No it doesn't; the only thing it guarantees is at least 400 cd/m2, which admittedly is a bit higher than most normal displays, although many of those are typically 350 cd/m2”

As I understand it, and please correct me if I have misread the spec, HDR400 requires a corner black level max of 0.40 cd/m2 and a tunnel max of 0.10 cd/m2, compared to an average of 0.50/0.50 for SDR displays. HDR400 also requires a true black level of 0.0005.

This panel is expected to exceed 400 cd/m2. Won't that mean it has to have a better contrast ratio than 1000:1?



“It's not HDR simply because the dynamic range (the contrast ratio) is not being improved. That's the crux of this debate.”
Which is also the unknown factor for this panel :(
 
Man of Honour
Joined
12 Jan 2003
Posts
20,568
Location
UK
OK Pottsey, so we are talking now specifically about the LG 38GL950G? Previously I was only really commenting on the specifics around what constitutes an HDR display and why the HDR400 spec is largely pointless.

If we are talking about this specific screen, my take on it is as follows. At the moment we know that it will feature a DCI-P3 colour gamut, so colour-wise it should offer a boost for HDR. The panel being used was originally planned to be a VESA HDR600 certified panel (in fact at one time LG.Display had it listed as HDR1000), which implies to me that it could well have some form of local dimming and a peak brightness of at least 600 cd/m2. If it does indeed have local dimming and an HDR600 certification, then it should be capable of providing some level of HDR experience, at least being technically capable of producing an improved dynamic range. It remains to be seen whether this local dimming will be implemented though; we've not yet had confirmation.

If it turns out to only have an HDR 400 badge, then I would expect no local dimming and only the standard 450 cd/m2 brightness we already know from the spec, so its only real benefits for HDR would be the improved colour space and maybe a small boost in brightness capability.


That's not correct, to my understanding. Even if you have a display with all those specs, if it's not labelled as HDR/doesn't meet all the minimum specs, I don't believe you can turn on HDR and map to the higher peak nits/higher dynamic range. You are stuck at SDR levels even though technically you could run HDR. To use HDR's wider range of settings you have to have a monitor that is seen as HDR. EDIT: I think there is a small range below HDR400 that is still seen as HDR by Windows. So I guess technically you could be at the HDR400 spec, but that just means you are hitting HDR400 on a device that isn't officially tested yet meets all the requirements.

Think of it this way, considering an HDR400 display with, for argument's sake, a 400 cd/m2 peak brightness. Imagine you have an input source like an Ultra HD Blu-ray HDR player and plug that into an HDR monitor which can accept an HDR input signal. If an image in the HDR content was mastered at 400 cd/m2 brightness, the monitor would dynamically control the backlight and push it up to basically 100% to produce that peak brightness. But in doing so, because you've turned the backlight up to full, the black depth is now much higher. The resulting contrast ratio is still bound by the native contrast ratio of the panel: because the backlight isn't capable of keeping some areas of the screen at <100% brightness without local dimming, you get no improvement to the dynamic range/contrast. If you had an SDR display which didn't accept the HDR input signal automatically, you could increase the brightness of the backlight to 100% as well, achieve the same 400 cd/m2 peak brightness and end up with the same native contrast ratio of the panel.
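A quick sketch of that point, reusing the same illustrative 400 cd/m2 / 1000:1 figures: with a single global backlight, the black level rises in lockstep with the white level, so the ratio never moves no matter what the HDR signal asks for:

```python
# With one global backlight zone, black depth scales linearly with the
# backlight level, so the white:black ratio is fixed. Example figures.
NATIVE_CONTRAST = 1000  # assumed native panel contrast
MAX_NITS = 400          # assumed peak white at 100% backlight

def white_and_black(backlight_pct):
    white = MAX_NITS * backlight_pct / 100
    black = white / NATIVE_CONTRAST  # black rises with the backlight
    return white, black

for pct in (50, 75, 100):  # pushing the backlight up for "HDR"
    w, b = white_and_black(pct)
    print(f"{pct:3d}% backlight: {w:5.1f} / {b:.2f} cd/m2 = {w / b:.0f}:1")
# 50%: 200.0 / 0.20 = 1000:1 ... 100%: 400.0 / 0.40 = 1000:1
# Brighter, yes - but the dynamic range never changes.
```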

Maybe there are some benefits in accepting an HDR input signal and the mapping of different brightness levels from content to screen, but my fundamental argument here is that at no point does it improve the dynamic range of the screen. If you don't have local dimming, you're always going to be bound to the native panel contrast ratio.

As I understand it, and please correct me if I have misread the spec, HDR400 requires a corner black level max of 0.40 cd/m2 and a tunnel max of 0.10 cd/m2, compared to an average of 0.50/0.50 for SDR displays. HDR400 also requires a true black level of 0.0005.

The "average SDR 0.5 - 0.6 cd/m2" black depth is ficticious and used purely to make it look like HDR400 is better, when really it isn't. If we assumed an SDR display had the same 400 cd/m2 max brightness (which is what they're implying in this comparison on their site), then a 0.5 cd/m2 black depth would be 800:1 contrast ratio. 0.6 cd/m2 black depth would be 667:1 contrast ratio. neither of those figures are realistic nowadays, unless it's a very poor panel. Most TN Film panels are 900 - 1000:1, most IPS panels are 1000 - 1200:1 and most VA panels are 2000 - 3000:1 in real tests.

If you consider the HDR400 spec figure of 0.40 cd/m2 black depth with a 400 cd/m2 brightness, that still only gives you a 1000:1 contrast ratio. So again, the standard doesn't require anything beyond what is already achieved by an SDR display. It all boils down to the native contrast ratio of the panel that's used, unless local dimming has been added.

Conversely, you can see from the HDR500/600/1000 specs that the contrast ratio has to be much higher (5000:1+), and this is only achieved through local dimming, which is a listed requirement for HDR500 and above.
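Flipping that sum around shows why local dimming is mandatory (same illustrative arithmetic):

```python
# Black level needed for the higher tiers' contrast, vs what a native
# 1000:1 panel can actually do at 600 cd/m2. Illustrative figures.
peak = 600                    # cd/m2, HDR600-class peak white
required_black = peak / 5000  # 0.12 cd/m2 for a 5000:1 ratio
native_black = peak / 1000    # 0.60 cd/m2 from a 1000:1 panel alone

print(required_black, native_black)  # 0.12 vs 0.6 cd/m2
# The panel alone is stuck 5x too bright in the blacks; the only way
# down to 0.12 cd/m2 is dimming the backlight behind dark regions,
# which is exactly why local dimming is mandatory for HDR500 and up.
```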

PS: the last spec of 0.0005 cd/m2 black depth is for the "DisplayHDR 400 True Black" certification, which is totally different and is for OLED-type displays. Not to be confused with HDR400 :)
 
Associate
Joined
24 Aug 2018
Posts
61
As I understand it, and please correct me if I have misread the spec, HDR400 requires a corner black level max of 0.40 cd/m2 and a tunnel max of 0.10 cd/m2, compared to an average of 0.50/0.50 for SDR displays. HDR400 also requires a true black level of 0.0005.

You are a fine troll, mate :)
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
That is the minimum; it does not mean an HDR400 panel is running that low. I am not interested in arguing over whether the HDR400 spec is too low or not. What matters is whether this panel will be any good for HDR and how far past the minimum it runs, and we know in many areas it will be past the minimum HDR400 spec. But some of the key aspects are unknown.


I don't understand why you aren't interested in the HDR 400 spec?! This goes to the VERY HEART of the issue lol! As pointed out above, the bar for entry into the 'HDR 400 club' is so ridiculously low that any monitor you buy with this spec isn't guaranteed to give you anything. Just being able to turn on HDR functionality on such a monitor doesn't ensure a particular visual experience, precisely because the spec it must adhere to in order to achieve this standard is so low!

We have a pretty good idea what the HDR capability of the LG 38GL950G will be. There seems to be a lot of wishful thinking on your part here, but we can quite accurately deduce from the released info that it's going to be average. At 450 cd/m2 brightness this is obvious. There is zero indication it has FALD, and while I am quite sure it will be a good quality panel that offers something over a non-HDR variant, the price point it comes in at (which won't be cheap) will probably raise questions as to the value of that HDR experience. Of course, given there are no other 38" monitors on the horizon offering anything better, and the other solid specs this monitor offers, it may all be a moot point. I am confident this will be a great monitor, providing there are no serious bleed/glow issues... which is going to come down to the typical panel lottery.
 
Soldato
Joined
29 May 2006
Posts
5,353
“OK Pottsey, so we are talking now specifically about the LG 38GL950G? Previously I was only really commenting on the specifics around what constitutes an HDR display and why the HDR400 spec is largely pointless.”
I have been talking from the point of view of the LG 38GL950G and LG 27GL850G panels, which is why I was saying to the others that it's too early to write them off even if they're only at the HDR400 spec, as they're not at the bare minimum of HDR400. Thank you for the clarification; I did mix up HDR400 True Black and HDR400.

Going back to your other post


“In fact, to play devil's advocate for a moment, you could potentially have a non-HDR certified display which is much better for viewing HDR content than an HDR400 certified display.”
Let's say this monitor does meet the entire requirement for HDR400 but has an old G-Sync module, so it is unable to turn on HDR and is not HDR certified. That would make it a non-HDR certified display which would technically be much better at HDR than the lowest-end HDR400 certified displays. The question is: would you be able to enable HDR in games, or would the option remain greyed out because it's not listed as HDR, even though it meets all the minimum HDR400 specs?
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
Let's say this monitor does meet the entire requirement for HDR400 but has an old G-Sync module, so it is unable to turn on HDR and is not HDR certified. That would make it a non-HDR certified display which would technically be much better at HDR than the lowest-end HDR400 certified displays. The question is: would you be able to enable HDR in games, or would the option remain greyed out because it's not listed as HDR, even though it meets all the minimum HDR400 specs?


It won't have the old G-Sync module; that's impossible, as it wouldn't have the bandwidth needed to drive this panel. There is a question as to whether it will have the fan that the G-Sync v2 module has had in the Acer/Asus monitors, so that remains to be seen.
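The bandwidth point is easy to sanity-check with rough sums (raw pixel data only; blanking intervals push the real figure higher, so this is conservative):

```python
# Back-of-the-envelope check of why the old DP 1.2 G-Sync module can't
# drive 3840x1600 at 144 Hz (ignores blanking, so reality is worse).
width, height, hz = 3840, 1600, 144
bpp = 24  # 8-bit RGB

gbps_needed = width * height * hz * bpp / 1e9
print(f"{gbps_needed:.1f} Gbit/s of pixel data")  # ~21.2 Gbit/s

DP12_PAYLOAD = 17.28  # Gbit/s usable on DisplayPort 1.2 (HBR2)
DP14_PAYLOAD = 25.92  # Gbit/s usable on DisplayPort 1.4 (HBR3)
print("fits DP 1.2:", gbps_needed <= DP12_PAYLOAD)  # False
print("fits DP 1.4:", gbps_needed <= DP14_PAYLOAD)  # True
```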

Bottom line: this monitor will almost certainly be HDR 400 certified, and given the panel's characteristics (which we know), it will be OK. Nothing special, nothing WOW, but just OK. This is going to be a VERY expensive monitor though, so that needs to be considered in all of this.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,568
Location
UK
Agreed, it must be using the v2 G-Sync HDR module given the bandwidth requirements of the resolution and refresh rate. I know this might start a further debate about whether it's a true G-Sync screen (with module) or G-Sync Compatible, but I still believe this is a G-Sync module screen for reasons I explained earlier in the thread :)
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
The XB273K has the full expensive FPGA DP 1.4 G-Sync module and it's selling for only $1300. No reason this LG wouldn't have the same thing.


Not really comparable though. That's a traditional 16:9 27" panel. This is 38" 21:9 lol! The 34" LG ultrawide is £1150, which uses the older G-Sync module. There is no way the 38GL950G is coming in even a penny under £1500, and I won't be surprised if it's closer to £2K.
 