LG 38GL950G - 3840x1600/G-Sync/144Hz

Soldato
OP
Joined
31 Dec 2006
Posts
7,224
Latency and the other elements don't really matter for this discussion. This isn't a thread about projectors v monitors. I was using the new generation of projectors as an example of how low nits and an absence of local dimming don't mean HDR cannot work and will be bad. The new technology in the new generation of projectors has made HDR worthwhile on low-nit screens without local dimming, and there is no reason why a new generation of monitors cannot follow the same path.

It's not a good idea to just look at old monitors and assume all future ones will be rubbish at low nits just because old ones were. It's too early to write off this monitor for HDR. 450 nits can be more than enough for good gaming HDR as long as the rest of the specs are good enough.


No, it's not a thread about projectors vs monitors, so don't mention them lol! THEY ARE A COMPLETELY DIFFERENT TECHNOLOGY lol!!

For a monitor to deliver satisfactory HDR, it needs to achieve a 600 cd/m2 peak level with local dimming, and then you've got something that's approaching OK. This isn't going to be the best HDR experience possible of course, but it's probably satisfactory for most. At lower levels with no local dimming, it will not be satisfactory. This is just a fact.

From the TFT Central article: "What is important though, without question, is the screens ability to improve the active contrast ratio. To do that, you NEED to have some kind of local dimming support from the backlight. Local dimming is vital to create any real improvement in the dynamic range of an image, and it is the support (or lack of in many cases) local dimming that is the main issue here."

This panel is not using future tech... it's pretty standard tech that is well understood. It is not too early to write off this monitor for HDR at all, and I'll put money on it right now not offering anything special at all in this regard. There are no other specs it has (we know the specs by and large) that will change this.
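To put some rough numbers on the local dimming point, here is a quick back-of-the-envelope sketch in Python; the panel figures are illustrative assumptions, not measurements of the 38GL950G or any other monitor:

```python
# Illustrative contrast arithmetic for a panel without vs. with local dimming.
# All figures are assumed, typical-ish values, not measured specs of any monitor.

def black_level(backlight_nits, static_contrast):
    """Black level the panel reaches for a given backlight output."""
    return backlight_nits / static_contrast

peak = 450.0             # assumed peak luminance in nits
static_contrast = 1000   # assumed native panel contrast (1000:1)

# No local dimming: the whole backlight stays at peak whenever any part of
# the frame needs to be bright, so the black level is fixed by the native ratio.
black = black_level(peak, static_contrast)
print(f"No dimming:   black {black:.3f} nits -> {peak / black:.0f}:1 within one frame")

# Local dimming: a dark zone's backlight drops (say to 5% of peak) while a
# bright zone stays at full output, so the same frame spans a far wider range.
zone_black = black_level(peak * 0.05, static_contrast)
print(f"Zone dimming: black {zone_black:.4f} nits -> {peak / zone_black:.0f}:1 between zones")
```

Without some form of local dimming, the simultaneous contrast is stuck at the panel's native ratio no matter how many nits the peak hits, which is exactly the point the TFT Central quote is making.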
 
Last edited:
Soldato
Joined
29 May 2006
Posts
5,351
No, it's not a thread about projectors vs monitors, so don't mention them lol! THEY ARE A COMPLETELY DIFFERENT TECHNOLOGY lol!!

For a monitor to deliver satisfactory HDR, it needs to achieve a 600 cd/m2 peak level with local dimming, and then you've got something that's approaching OK. This isn't going to be the best HDR experience possible of course, but it's probably satisfactory for most. At lower levels with no local dimming, it will not be satisfactory. This is just a fact.

From the TFT Central article: "What is important though, without question, is the screens ability to improve the active contrast ratio. To do that, you NEED to have some kind of local dimming support from the backlight. Local dimming is vital to create any real improvement in the dynamic range of an image, and it is the support (or lack of in many cases) local dimming that is the main issue here."

This panel is not using future tech... it's pretty standard tech that is well understood. It is not too early to write off this monitor for HDR at all, and I'll put money on it right now not offering anything special at all in this regard. There are no other specs it has (we know the specs by and large) that will change this.
Being different technology doesn't matter. The HDR element, both the problem with low nits and how they made it work, is the same for monitors and projectors, which is why I brought it up, as you said it was impossible when clearly it's not.

The reasons you said this monitor cannot do HDR are the same reasons people shouted projectors could never do HDR, and it was proven wrong. That's why it's too early to write off this screen. You cannot just say it's 450 nits so it can never do HDR because old 400-nit screens cannot do HDR. It doesn't work like that.

Perhaps it cannot do good HDR, but we cannot know based on the current specs we have. A lot of it will come down to which G-SYNC module the 27GL850G-B and this panel have, as if it's the v2 module then it could benefit from HDR.

"At lower levels with no local dimming, it will not. This is just a fact."
You can still benefit from HDR even with no local dimming and lower levels.
 
Associate
Joined
29 May 2018
Posts
146
I don't understand. A projector is effectively a low-nit screen, yet the new generation of projectors shows a benefit with HDR without local dimming. Why couldn't this monitor be like that?

You're confused because you're making an apples to oranges comparison.

The WHOLE POINT of HDR (high dynamic range) is for blacks to be darker while high intensity whites (and only them) become brighter. Supporting a wider range of colors (100% DCI-P3) is theoretically unrelated to HDR, but as support for additional color spaces was integrated into the HDR protocols (HDR10, Dolby Vision), the mainstream associates that with HDR as well.

The "massive" benefit you're attributing to HDR projectors is primarily a result of the extended color space. For projectors that's a huge achievement and nothing to sneeze at. People comparing that to monitors wouldn't call that "massiv" however. Professional monitors have long been able to achieve DCI-P3, even without HDR support.

The specular highlights caused by the sun reflecting off waves or chrome can reach 10,000 nits. Those are the sorts of things HDR is intended to more faithfully reproduce! That's the WHOLE POINT! However, that is only possible with display technologies that can achieve very high local brightness. A DisplayHDR 1000 TV/monitor definitely goes a long way towards that goal. A DisplayHDR 600 monitor makes it noticeable. DisplayHDR 400 monitors are ridiculed because the effect is barely noticeable. On a 2m wide screen, the HT5550 achieves a peak luminance of 250 nits (maybe less depending on the screen surface). In the TV and monitor space, that's the equivalent of an SDR (standard dynamic range) display, and well short of what the much ridiculed DisplayHDR 400 certification mandates.
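To put those peak levels into perspective against the signal HDR content is actually mastered in, here is a small sketch using the standard SMPTE ST 2084 (PQ) curve; the display peaks listed are just example values:

```python
# How much of the PQ (SMPTE ST 2084) signal range - the curve HDR10 content is
# graded against - a display can show before it must clip or tone-map highlights.
def pq_signal(nits):
    """Map absolute luminance (0..10000 nits) to a PQ signal value in 0..1."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = nits / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

for peak in (250, 400, 600, 1000, 10000):   # example display peaks in nits
    print(f"{peak:>5} nit peak -> reaches {pq_signal(peak):.0%} of the PQ signal range")
```

A 400-450 nit display covers roughly the bottom two thirds of the PQ code range; everything above that, up to 10,000 nits, is the highlight region that has to be clipped or tone-mapped away.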

On top of that, every projector I've ever seen attempt DCI-P3 took a huge hit to brightness. If the HT5550 can achieve DCI-P3 while maintaining brightness, rather than cutting it in half, then that would be spectacular. However, that's only a spectacular accomplishment amongst projectors. Compare the resulting HDR image to high end televisions and it's obvious you're not getting anything resembling real HDR. I've yet to look at a laser projector. I imagine they might be better suited to providing a real HDR experience.

We don't know anything about the 38GL950G's level of HDR support. If it's not at least DisplayHDR 600 then Legend is right. There is nothing to debate. It will not provide anything resembling a real HDR experience. That doesn't mean it will be a bad monitor.

BTW:
If you've read TFTCentral's article about DisplayHDR 400, and your takeaway was that higher peak brightness is not that important, then you're not being honest. The WHOLE POINT of the article was that DisplayHDR 400 needs to go because it can't provide an HDR experience, despite insinuating it can... just like most projectors.
 
Last edited:
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
Being different technology doesn't matter. The HDR element, both the problem with low nits and how they made it work, is the same for monitors and projectors, which is why I brought it up, as you said it was impossible when clearly it's not.

The reasons you said this monitor cannot do HDR are the same reasons people shouted projectors could never do HDR, and it was proven wrong. That's why it's too early to write off this screen. You cannot just say it's 450 nits so it can never do HDR because old 400-nit screens cannot do HDR. It doesn't work like that.

Perhaps it cannot do good HDR, but we cannot know based on the current specs we have. A lot of it will come down to which G-SYNC module the 27GL850G-B and this panel have, as if it's the v2 module then it could benefit from HDR.

"At lower levels with no local dimming, it will not. This is just a fact."
You can still benefit from HDR even with no local dimming and lower levels.


HDR needs local dimming. The definition of HDR is High Dynamic Range, and I could go into great depth here about contrast ratios etc., but a projector's light source (whether that be a lamp or a laser) projects light over the entirety of the tiny image chip(s). The only way to dim the light is to dim the entire image. This has inherent issues which should be obvious, and it's why NO projector, even the most expensive, can deliver true HDR. Yes, some will look "OK", but I GUARANTEE if you actually compared them to a monitor/TV (at the same size, so that you're not "wowed" by the added immersion), you would clearly see even a pretty average TV does a better job, and in many cases a projector would quite likely look better in SDR mode (many users and reviews report this, with HDR actually making a projected image look worse).
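As a crude illustration of the difference between dimming the whole image and dimming zones of it; the numbers below are assumptions, not taken from any specific projector or monitor:

```python
# Crude comparison: global (whole-image) dimming vs. per-zone dimming for one
# frame containing both a bright highlight and a deep shadow. Assumed figures.
NATIVE_CONTRAST = 2000   # assumed native contrast of the imaging panel/chip
PEAK = 450.0             # assumed full-output luminance in nits

def simultaneous_contrast(highlight_light, shadow_light):
    """Contrast between the brightest and darkest area of the same frame,
    given the relative light-source level behind each area (0..1)."""
    highlight = PEAK * highlight_light
    shadow = (PEAK * shadow_light) / NATIVE_CONTRAST
    return highlight / shadow

# Global dimming: one light source for the whole image, so dimming the shadows
# dims the highlight by exactly the same factor and the ratio never changes.
print(f"Global dimming: {simultaneous_contrast(0.3, 0.3):.0f}:1")

# Local dimming: the highlight zone stays at full output while the shadow zone drops.
print(f"Local dimming:  {simultaneous_contrast(1.0, 0.05):.0f}:1")
```

Dimming the lamp can lift the contrast from one scene to the next, but within any single frame the image is stuck at the native ratio of the panel or chip.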

I was going to say more, but a5cent has summed it up mostly.

Your whole argument seems to hinge around projectors delivering HDR, so why can't a monitor? The point is THEY CAN'T!! NO PROJECTOR is doing this, not to the true extent of what HDR is. Someone may look at it and say "oh, that looks good", but it won't be true HDR, and it will be blown out of the water by what high-end LCD and OLED TVs can achieve.

400 nit monitors can't achieve HDR. Never have, never will. You need to stop suggesting otherwise, and you need to re-read that article a few more times.
 
Soldato
Joined
29 May 2006
Posts
5,351
HDR needs local dimming. The definition of HDR is High Dynamic Range, and I could go into great depth here about contrast ratios etc., but a projector's light source (whether that be a lamp or a laser) projects light over the entirety of the tiny image chip(s). The only way to dim the light is to dim the entire image. This has inherent issues which should be obvious, and it's why NO projector, even the most expensive, can deliver true HDR. Yes, some will look "OK", but I GUARANTEE if you actually compared them to a monitor/TV (at the same size, so that you're not "wowed" by the added immersion), you would clearly see even a pretty average TV does a better job, and in many cases a projector would quite likely look better in SDR mode (many users and reviews report this, with HDR actually making a projected image look worse).

I was going to say more, but a5cent has summed it up mostly.

Your whole argument seems to hinge around projectors delivering HDR, so why can't a monitor? The point is THEY CAN'T!! NO PROJECTOR is doing this, not to the true extent of what HDR is. Someone may look at it and say "oh, that looks good", but it won't be true HDR, and it will be blown out of the water by what high-end LCD and OLED TVs can achieve.

400 nit monitors can't achieve HDR. Never have, never will. You need to stop suggesting otherwise, and you need to re-read that article a few more times.
My argument was that the new projectors look far better in HDR than without HDR. The same could go for this monitor. It might not be as good as the high-end LCD and OLED TVs, but that doesn't mean it cannot look better due to HDR, even if it's not what you would call full HDR. Despite what you say, 400/450 nits can achieve some level of HDR and look better than a non-HDR display, even if it's not as good as the full high-end displays. From my experience the high-nit screens are worse for gaming up close to the screen anyway. I wouldn't want anything over 600 nits on a gaming monitor. 450 is about the sweet spot.
 
Associate
Joined
29 May 2018
Posts
146
My argument was that the new projectors look far better in HDR than without HDR.

Not sure what you're finding so hard to grasp.

New DisplayHDR 400 monitors are no better than good, newer SDR monitors. Many higher-end 10-bit SDR monitors are better. The difference between a DisplayHDR 400 monitor in SDR and HDR mode, assuming the monitor is using the same color space for both, is imperceptible. Certainly not anything worth caring about. It was explained why. That's the difference between monitors and projectors, which you apparently don't want to accept. We can't force wisdom upon you, so I'll leave it at that.
 
Soldato
Joined
29 May 2006
Posts
5,351
Not sure what you're finding so hard to grasp.

New DisplayHDR 400 monitors are no better than good, newer SDR monitors. Many higher-end 10-bit SDR monitors are better. The difference between a DisplayHDR 400 monitor in SDR and HDR mode, assuming the monitor is using the same color space for both, is imperceptible. Certainly not anything worth caring about. It was explained why. That's the difference between monitors and projectors, which you apparently don't want to accept. We can't force wisdom upon you, so I'll leave it at that.
It's not that I do not grasp what you are saying. It's that I think you are wrong and I do not agree with you. Take games like The Division 2: you cannot turn HDR on unless you have an HDR monitor, and even on a 300 or 400 nit monitor, HDR on can make a large difference over HDR off in that game. I do not know how you can say it's imperceptible.
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
My argument was that the new projectors look far better in HDR than without HDR. The same could go for this monitor. It might not be as good as the high-end LCD and OLED TVs, but that doesn't mean it cannot look better due to HDR, even if it's not what you would call full HDR. Despite what you say, 400/450 nits can achieve some level of HDR and look better than a non-HDR display, even if it's not as good as the full high-end displays.

Some do, some don't... depends on the model you're talking about and the environment they're being viewed in. They also often need a lot of tinkering with settings, and from reviews/user feedback I have read, work better with some content than others. It can be very hit and miss. Certainly no one is suggesting that HDR on projectors is an amazing feature, because it isn't and is not close to being properly implemented yet due to the limitations of the technology.

Yes, a low level HDR monitor will be able to display some HDR content, but in many cases (as reviews have demonstrated) it can make some games look WORSE when activated. In those games that have good implementation of HDR, it will look noticeably better on a VESA 600 or VESA 1000 monitor than it will on a VESA 400. If you're suggesting otherwise, then I don't know what to say. It's like saying you like eating beans out of the trash vs the tin. Well OK, if that's your thing, you go for it!


From my experience the high-nit screens are worse for gaming up close to the screen anyway. I wouldn't want anything over 600 nits on a gaming monitor. 450 is about the sweet spot.

This is a very strange statement. How is a high nit screen worse?? It suggests to me you have not experienced proper HDR implementation, because done right, it is VERY impressive. On a 450 nit screen it's not even a halfway house and isn't actually true HDR. It won't be on a projector either for the reasons already mentioned.

As a5cent says, HDR content viewed on a DisplayHDR 400 monitor in SDR and HDR mode will show very little difference... certainly nothing that you'd go "wow, HDR is amazing!". You seem to fundamentally fail to understand how HDR functions and what it requires to function properly. This isn't a matter of opinion, it's a technical fact. 450 nits is CATEGORICALLY NOT the sweet spot lol!!
 
Last edited:
Soldato
Joined
29 May 2006
Posts
5,351
“In those games that have good implementation of HDR, it will look noticeably better on a VESA 600 or VESA 1000 monitor than it will on a VESA 400.”
I am not suggesting otherwise. What I am saying is that in games with good HDR like The Division 2, VESA 400 on a decent HDR screen will be noticeably better than a non-HDR screen. I know a lot of games have poor implementations of HDR and a lot of monitors do as well, but that doesn't mean all future 400-nit displays will be useless at HDR.

As for nits, when playing games you don't need or want everything to be too realistic. You don't want all the explosions, gun flashes and things to be too realistic or too bright. When it's badly implemented, it is not uncommon on the higher-nit screens for the experience to become unpleasant, with squinting eyes, which subtracts from the gaming experience. 800-1000 nits is nice when you are 10 feet away, but at typical monitor distances it can soon become unpleasant in some situations, which is not a problem you have at 400 nits.



“Certainly no one is suggesting that HDR on projectors is an amazing feature, because it isn't and is not close to being properly implemented yet due to the limitations of the technology.”
In the newest generation it's much better than no HDR and much better than the previous generation. It's still behind top-end displays, but it is a useful, good feature. It's better to have it than not. But this thread isn't about projectors. I am trying to get across that even on 400-nit displays there can be a noticeable improvement over no HDR.


"HDR content viewed on a DisplayHDR 400 monitor in SDR and HDR mode will show very little difference... certainly nothing that you'd go "wow, HDR is amazing!"."
That is not something I agree with. With a good implementation it's very noticeable. Playing The Division 2 with HDR on, as low as 350 or 450 nits is pretty good and beautiful compared to no HDR. I do not know how anyone can say it's very little difference unless you have one of the monitors with rubbish HDR.
 
Associate
Joined
29 May 2018
Posts
146
It's not that I do not grasp what you are saying. It's that I think you are wrong and I do not agree with you. Take games like The Division 2: you cannot turn HDR on unless you have an HDR monitor, and even on a 300 or 400 nit monitor, HDR on can make a large difference over HDR off in that game. I do not know how you can say it's imperceptible.

No, you really don't grasp it, or at least you're not providing any arguments that suggest you do. You might understand the words, but certainly none of the technical underpinnings.

Any DisplayHDR 400 monitor MIGHT slightly adjust the image it displays when HDR is activated, for reasons I won't go into here. However, in general, it won't achieve much you couldn't have done by just adjusting your monitor settings in SDR mode. If that's what you're seeing and you think HDR is notably better, then that's only because your SDR calibration is off. I'm pretty sure you've never actually looked at a DisplayHDR 400 monitor yourself. It sounds like you're just taking projectors as your reference and baselessly claiming monitors will behave similarly, while not addressing any of the technical explanations we've given as to why they won't. I'm honestly not sure if you're trolling or just ignorant. Either way, I'm out.
 
Last edited:
Soldato
Joined
29 May 2006
Posts
5,351
No, you really don't grasp it, or at least you're not providing any arguments that suggest you do. You might understand the words, but certainly none of the technical underpinnings.

Any DisplayHDR 400 monitor MIGHT slightly adjust the image it displays when HDR is activated, for reasons I won't go into here. However, in general, it won't achieve much you couldn't have done by just adjusting your monitor settings in SDR mode. If that's what you're seeing and you think HDR is notably better, then that's only because your SDR calibration is off. I'm pretty sure you've never actually looked at a DisplayHDR 400 monitor yourself. It sounds like you're just taking projectors as your reference and baselessly claiming monitors will behave similarly, while not addressing any of the technical explanations we've given as to why they won't. I'm honestly not sure if you're trolling or just ignorant. Either way, I'm out.
That's funny, as I was just thinking the same thing about you. Take The Division 2's HDR: you cannot just get that in SDR mode by adjusting your settings. There is not just a slight difference, and the difference is not because SDR calibration is off. Why don't you try that game on a decent, HDR-calibrated monitor before saying the difference is slight.
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
Take The Division 2's HDR: you cannot just get that in SDR mode by adjusting your settings. There is not just a slight difference, and the difference is not because SDR calibration is off. Why don't you try that game on a decent, HDR-calibrated monitor before saying the difference is slight.

Someone needs to do a comparison with photos to demonstrate this. I have seen many complaints about HDR implementation in The Division 2, but I have also seen people praise it using expensive high-end 1000-nit TVs.

There is only one 1000-nit PC monitor at the moment (well, two, but they're the same 27" panel) which is £1800... have you used this? Because until you have, you can't really say it would result in unpleasantness and squinting eyes. In fact, I've never heard anyone say that before about this monitor; on the contrary, people generally love it. If your experience of 1000 nits is sat a few feet away from a 55" TV, then I'm not surprised you found it unpleasant lol! There are many people (including professional reviewers) who have highly praised this monitor's HDR capabilities in games that support it well, and I have not seen one person ever suggest that they'd prefer a cheaper, inferior 400-nit monitor over this!! That would make about as much sense as saying they'd prefer 60Hz after playing at 144Hz!!

It sounds like you're saying that HDR, in its true INTENDED form on a PC monitor, is ineffective, and you personally prefer a significantly weakened and watered-down version of it. Although as mentioned above, if you're drawing comparisons to other HDR PC monitors, then you need to have made that judgement based on viewing one of the only two 'proper' HDR monitors currently available: the Asus PG27UQ or Acer X27. If you aren't, then there is no validity to your point on this matter. If you have used these monitors, well, it raises other questions. Neither of these monitors is a perfect implementation of true HDR by any means, but it's as close as any PC monitor offers at present. You cannot fairly compare the experience of using a high-end HDR TV to a PC monitor, if for no other reason than the distance you sit from it.
 
Last edited:
Soldato
Joined
18 Feb 2010
Posts
6,810
Location
Newcastle-upon-Tyne
A true HDR experience requires that the monitor is able to display brilliant bright shades as well as deep dark shades. That's not something that any VESA DisplayHDR 400 level display will offer. However, Pottsey is absolutely correct that there is still some benefit to running HDR on a monitor that implements it well, even if it's DisplayHDR 400. HDR isn't just about the extremes (deep dark shades, bright light shades) but also about the accuracy of tone mapping between them. 10-bit colour support is part of this, even if it's achieved at the GPU level (as with some HDR solutions). An appropriate colour gamut (~DCI-P3) for HDR content also helps, although that's a separate issue really.

I've seen some absolutely awful HDR implementations at this level which just give a flooded image and are frankly ugly to look at. But there are some models where enabling HDR really lifts the image up. It gives a realism to lighting and shadow details and a nuanced variety of shades that is always lacking in SDR, regardless of the graphics settings used. I'd advise reading or watching my recent reviews of the Acer XB273K and AOC AG273QCX if you want some examples of that.
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
A true HDR experience requires that the monitor is able to display brilliant bright shades as well as deep dark shades. That's not something that any VESA DisplayHDR 400 level display will offer. However, Pottsey is absolutely correct that there is still some benefit to running HDR on a monitor that implements it well, even if it's DisplayHDR 400. HDR isn't just about the extremes (deep dark shades, bright light shades) but also about the accuracy of tone mapping between them. 10-bit colour support is part of this, even if it's achieved at the GPU level (as with some HDR solutions). An appropriate colour gamut (~DCI-P3) for HDR content also helps, although that's a separate issue really.

I've seen some absolutely awful HDR implementations at this level which just give a flooded image and are frankly ugly to look at. But there are some models where enabling HDR really lifts the image up. It gives a realism to lighting and shadow details and a nuanced variety of shades that is always lacking in SDR, regardless of the graphics settings used. I'd advise reading or watching my recent reviews of the Acer XB273K and AOC AG273QCX if you want some examples of that.


From what I've seen, I'd certainly prefer a monitor with HDR 400 over nothing at all, although I would certainly not want to pay a high premium for it. It clearly is going to be a lesser experience than what a more highly specified monitor with FALD etc. offers. But as you say, we are always going to be at the mercy of the content itself.

It's where he says 450 nits is the "sweet spot" that he starts going off the rails though. And that he found high-nit screens to be "unpleasant with squinting eyes"... which seems odd, unless he's been sitting 2 feet away from a 55" TV on 100% brightness. I saw an Acer X27 in action a while ago and certainly didn't come away with that impression. The only problem with that monitor is the price, which certainly does make you feel unpleasant and squint. :D
 
Soldato
Joined
18 Feb 2010
Posts
6,810
Location
Newcastle-upon-Tyne
From what I've seen, I'd certainly prefer a monitor with HDR 400 over nothing at all, although I would certainly not want to pay a high premium for it. It clearly is going to be a lesser experience than what a more highly specified monitor with FALD etc. offers. But as you say, we are always going to be at the mercy of the content itself.

It's where he says 450 nits is the "sweet spot" that he starts going off the rails though. And that he found high-nit screens to be "unpleasant with squinting eyes"... which seems odd, unless he's been sitting 2 feet away from a 55" TV on 100% brightness. I saw an Acer X27 in action a while ago and certainly didn't come away with that impression. The only problem with that monitor is the price, which certainly does make you feel unpleasant and squint. :D

Yeah, I wouldn't pay a premium for HDR at that level either. It's a nice feature to have, a little bonus, but not something to pay extra for. For me, I really like much higher peak luminance and an effective local dimming solution; it's an important part of the full-fat experience! :)
 
Soldato
Joined
29 May 2006
Posts
5,351
From what I've seen, I'd certainly prefer a monitor with HDR 400 over nothing at all, although I would certainly not want to pay a high premium for it. It clearly is going to be a lesser experience than what a more highly specified monitor with FALD etc. offers. But as you say, we are always going to be at the mercy of the content itself.

It's where he says 450 nits is the "sweet spot" that he starts going off the rails though. And that he found high-nit screens to be "unpleasant with squinting eyes"... which seems odd, unless he's been sitting 2 feet away from a 55" TV on 100% brightness. I saw an Acer X27 in action a while ago and certainly didn't come away with that impression. The only problem with that monitor is the price, which certainly does make you feel unpleasant and squint. :D
Most games with HDR have a slider to cap the maximum luminance. It's common for gamers to put the slider down to 400-700 nits, either for the entire picture or for certain elements of it, because at high levels it can get unpleasant. It depends on the game you are playing, but some of them are too bright an experience, more so when you're in a dark room and a bright flash appears. In The Division, with the UI, it's common for gamers to have it around 400 for the peak brightness for that element of HDR.

This screenshot is a good example https://i.imgur.com/VUJV9lx.jpg - it's all dark and all of a sudden you get this massively bright light. It's fine on an SDR or low-nit HDR screen, but on a high-nit HDR screen it can be eye-watering unless you cap the peak brightness.

Most decent games like Assassin's Creed Origins, Star Wars: Battlefront 2, Rise of the Tomb Raider and The Division all have a peak nit slider. The problem is when a game has a poor implementation of HDR without a slider and you are stuck at 1000 nits. Most of the time 1000 nits is better, but there are times when it's worse than 400 nits.

Some of it comes down to personal preference; take The Division, some players have the UI nit level at 1000, others prefer it lower at 400. The sweet spot isn't the same for everyone and can even change based on the environment being played in and the game. Even on a 350 or 450 nit screen, HDR is noticeably better in The Division than no HDR. Which is why I am hoping this panel has HDR even if the nit level is only mid-range.
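For anyone wondering what that in-game peak-brightness slider is actually doing, it is essentially re-mapping the highlights so they top out at whatever level you set. Games each use their own curve; the sketch below is only a generic soft-clip illustration, not how The Division or any particular title implements it:

```python
# A generic highlight roll-off of the sort an in-game "peak brightness" slider
# drives. Real games use their own tone-mapping curves; this is only a sketch.
def roll_off(scene_nits, peak_nits, knee=0.75):
    """Leave luminance untouched below the knee, then compress smoothly so
    nothing exceeds the peak the user selected on the slider."""
    knee_nits = knee * peak_nits
    if scene_nits <= knee_nits:
        return scene_nits
    headroom = peak_nits - knee_nits
    overshoot = scene_nits - knee_nits
    return knee_nits + headroom * (1 - 1 / (1 + overshoot / headroom))

# The same 2000-nit muzzle flash with the slider at 1000 vs. 400 nits:
for peak in (1000, 400):
    print(f"slider at {peak:>4} nits -> flash displayed at about {roll_off(2000, peak):.0f} nits")
```

Either way the flash still reads as bright relative to the rest of the scene; the slider just decides how hard it hits your eyes.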
 
Associate
Joined
2 Mar 2019
Posts
85
I am not suggesting otherwise. What I am saying is that in games with good HDR like The Division 2, VESA 400 on a decent HDR screen will be noticeably better than a non-HDR screen. I know a lot of games have poor implementations of HDR and a lot of monitors do as well, but that doesn't mean all future 400-nit displays will be useless at HDR.

As for nits, when playing games you don't need or want everything to be too realistic. You don't want all the explosions, gun flashes and things to be too realistic or too bright. When it's badly implemented, it is not uncommon on the higher-nit screens for the experience to become unpleasant, with squinting eyes, which subtracts from the gaming experience. 800-1000 nits is nice when you are 10 feet away, but at typical monitor distances it can soon become unpleasant in some situations, which is not a problem you have at 400 nits.




In the newest generation it's much better than no HDR and much better than the previous generation. It's still behind top-end displays, but it is a useful, good feature. It's better to have it than not. But this thread isn't about projectors. I am trying to get across that even on 400-nit displays there can be a noticeable improvement over no HDR.


"HDR content viewed on a DisplayHDR 400 monitor in SDR and HDR mode will show very little difference... certainly nothing that you'd go "wow, HDR is amazing!"."
That is not something I agree with. With a good implementation it's very noticeable. Playing The Division 2 with HDR on, as low as 350 or 450 nits is pretty good and beautiful compared to no HDR. I do not know how anyone can say it's very little difference unless you have one of the monitors with rubbish HDR.

I'm going to agree with @Pottsey here: if the HDR is implemented the correct way in a game, and the monitor has full support of HDR400 (all boxes ticked for what makes HDR400), I can see the benefit of having HDR400, if it's done right, compared to a monitor that doesn't have this support.
I can also see that higher levels of HDR (600-1000) could be very unpleasant for your eyes, but I haven't tried it yet, so can't really say.

But Legend and a5cent are also correct that HDR400 can make the game look even worse if implemented wrong.

We don't know yet what HDR classification we're going to get for this monitor/panel; it's either going to be HDR400 or HDR600. I would think that the G-version will get HDR400 and the F-version will get HDR600, if LG releases it.
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
I'm going to agree with @Pottsey here: if the HDR is implemented the correct way in a game, and the monitor has full support of HDR400 (all boxes ticked for what makes HDR400), I can see the benefit of having HDR400, if it's done right, compared to a monitor that doesn't have this support.
I can also see that higher levels of HDR (600-1000) could be very unpleasant for your eyes, but I haven't tried it yet, so can't really say.

But Legend and a5cent are also correct that HDR400 can make the game look even worse if implemented wrong.

We don't know yet what HDR classification we're going to get for this monitor/panel; it's either going to be HDR400 or HDR600. I would think that the G-version will get HDR400 and the F-version will get HDR600, if LG releases it.


The bottom line is that bad HDR will look bad regardless lol! It won't really matter what monitor you have. The only saving grace HDR 400 might have is that it could make that bad implementation less obvious (simply because you are looking at a diluted HDR experience), but it's not true to how great HDR can look... and in those situations when it IS done right, you want a 600-1000 nit panel, and ideally FALD.

Assuming you have properly functioning vision, and don't literally sit in a darkened room all day, HDR 600-1000 is not unpleasant for your eyes if done correctly... it's FAR SUPERIOR if done right, especially with FALD. There isn't any doubt about this, in respect to achieving a true HDR experience. Will there be moments where you might be surprised at the brightness of an explosion, a flash of light, a sunrise etc? Sure... but THAT'S THE POINT! It won't blind you, and you won't need to visit the optician the next day, but it IS supposed to have an impact. It's like sound... you wouldn't suggest that someone use inferior speakers/headphones such that loud noises didn't give the gamer a bit of a fright or shock at appropriate moments. They aren't going to blow their eardrums out, but they should feel the visceral impact of sound when it's necessary and serves that given moment in the gaming experience. This is the same function HDR serves.

With HDR 400, if done right, then sure, it's better than nothing at all. But ultimately, if the game has excellent HDR implementation, you want the best HDR monitor possible, and that will NEVER be an HDR 400 one!

:)
 
Last edited:
Man of Honour
Joined
12 Jan 2003
Posts
20,567
Location
UK
Maybe there are some situations where an HDR400 display can look a bit better if it's implemented properly and works well. The key point though is that it's NOT HDR. There is no increase to the dynamic range/contrast (which after all, is what "high dynamic range" is all about) without local dimming included. HDR400 displays don't require local dimming and I've never seen one that includes it.

So any possible improvements in perceived picture quality are not because there's any better dynamic range. It could come from slight improvements in peak brightness, potentially boosted colours (if they've added a separate wider-gamut backlight option), or might simply be because the screen's HDR preset mode is set up to look more sharp, vivid or bright. The latter is likely very common, much like some game and movie preset modes. Anyway, the point is, there is nothing creating a higher dynamic range on those models, so it's in no way HDR.
 
Soldato
Joined
22 Jun 2012
Posts
3,732
Location
UK
To make it as simple as possible:

400 and 600 = Mediocre at best, quite possibly worse than SDR
1000 = Good
 