LG 38GL950G - 3840x1600/G-Sync/144Hz

That website isn't a "source"... it's a guy writing stuff on his website, which anyone can do lol! Where's he getting his info from? As a5cent mentions, this could still just be G-Sync 'compatible' (i.e. Freesync), but there are arguments (and evidence) for both possibilities. We just won't know for sure until closer to launch. Even if full-fat G-Sync is the intention though, LG could change their mind on this if they discover the monitor is going to come in crazy expensive. A Freesync version will undoubtedly be cheaper, and they may see more business sense in going with that alone... we shall see. There's no denying that now that Nvidia supports Freesync, there is a much more level playing field between a G-Sync and a Freesync monitor... yet one is still going to be far more expensive.

As to HDR, Daniel from LG has confirmed in this thread that the panel is 450 nits... he's a more reliable source than the website you've quoted. If that's the case, this monitor is effectively useless for HDR content and should be disregarded by anyone looking for that functionality.

+1 (100% agree). That part about a single model making more business sense is also super important.
 
This is what displayninja actually said about a hypothetical FreeSync + DisplayHDR 600 variant:

[displayninja quote missing from the page capture]

In regard to HDR:

Note that they don't actually say it will support DisplayHDR 600. They simply point to another panel with a DisplayHDR 600 rating and then mistakenly surmise that the 38GL950G's panel will have similar enough specs to achieve the same. That's speculation, and very poor speculation at that. Why? For instance, in the paragraph preceding it, displayninja states the following:


[displayninja quote missing from the page capture]

As you correctly implied, two monitors being variants of each other means they use the same panel. The panel determines peak brightness. However, a panel with 450 nits peak brightness can't achieve a VESA DisplayHDR 600 rating. Displayninja contradicts itself within two paragraphs. Either the FreeSync variant isn't a real variant because it uses a different panel (highly unlikely), or neither variant will support DisplayHDR 600 due to an insufficiently bright backlight (far more likely).
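
To make the arithmetic explicit, here's a minimal sketch (reducing each VESA tier to just its commonly cited peak-luminance minimum; the real certification also tests black level, gamut, bit depth and more) of why a 450 nit panel tops out at DisplayHDR 400:

```python
# Minimal sketch: highest DisplayHDR tier a panel's peak brightness allows.
# Only the peak-luminance minimums are modelled here; the actual VESA
# certification involves many more criteria (contrast, gamut, bit depth...).

DISPLAYHDR_PEAK_NITS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
}

def best_tier(panel_peak_nits: float) -> str:
    """Return the highest tier whose peak-luminance minimum the panel meets."""
    qualified = [tier for tier, required in DISPLAYHDR_PEAK_NITS.items()
                 if panel_peak_nits >= required]
    return max(qualified, key=DISPLAYHDR_PEAK_NITS.get) if qualified else "none"

print(best_tier(450))   # -> DisplayHDR 400; the 600 tier is out of reach
```

Whatever else a FreeSync variant might change, a 450 nit backlight caps it at the 400 tier.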

In regard to the FreeSync variant:

I'd say you've misquoted. They speculated that "there will likely be a FreeSync model", which is entirely different from you saying "there won't be only one version of this panel".

I didn't really misquote, as I said "Wont be only one version of this panel, if LG havent changed their mind." You see, I am not saying monitor; I am just talking about panels.
It is confirmed that there will be another 38" panel, if my source isn't lying, but I think you can find other sources with the same information if you don't trust the one I used.

Also, you are reading way too much into every word, just to try to find something wrong in the person's text. Sure, my text is probably not the best way to phrase things, but I am also not native to the English language.
 
I didn't really misquote, as I said "Wont be only one version of this panel, if LG havent changed their mind." You see, I am not saying monitor; I am just talking about panels.
It is confirmed that there will be another 38" panel, if my source isn't lying, but I think you can find other sources with the same information if you don't trust the one I used.

Also, you are reading way too much into every word, just to try to find something wrong in the person's text. Sure, my text is probably not the best way to phrase things, but I am also not native to the English language.


What is your "source" for another 38" panel? From all I've seen, there is only one 38" panel... the LM375QW2... which is the panel going into this monitor. It's the only one listed on TFT Central's panel database (a reliable source), and I've not seen mention of another one elsewhere. The other 38" panels listed on there are the older 60Hz/75Hz ones from a couple of years ago.
 
I didn't really misquote, as I said "Wont be only one version of this panel, if LG havent changed their mind." You see, I am not saying monitor; I am just talking about panels.

Okay. Thanks for clarifying. In that case you didn't misquote, but now I don't see how anything displayninja wrote backs up your position, because they are talking about monitor variants while you are talking about panel variants. In that case there wasn't much point in linking to the displayninja website in the first place. Unfortunately, with that position I think you're even less likely to be correct, because as far as I can tell, nobody anywhere claims LG is working on a second version of the LM375QW2 (the panel used in the 38GL950G). Source?

We'll have to disagree about me reading way too much into every word. I'd say the exact opposite is true. For example, even with all the goodwill and semantic fudging in the world, I'm still unable to re-interpret displayninja's sentence about nVidia's higher HDR standards in a way that would make it correct. I'm pretty sure you can't re-interpret it into correctness either. There is no need to nitpick any of that to find a mistake; it's just flat-out wrong. I provided those details because just dismissing your source out of hand felt unfair if I didn't also explain why.

PS: Your English is great for a non-native speaker.

Edit: dammit, @Legend beat me to it...
 
What is your "source" for another 38" panel? From all I've seen, there is only one 38" panel... the LM375QW2... which is the panel going into this monitor. It's the only one listed on TFT Central's panel database (a reliable source), and I've not seen mention of another one elsewhere. The other 38" panels listed on there are the older 60Hz/75Hz ones from a couple of years ago.

True, they only list one. I guess there are only going to be slight changes to the Freesync panel that is most likely coming as well, but not enough for it to show up on TFT Central's list of upcoming panels. That list doesn't exclude the possibility that we could be getting a panel with some minor adjustments, though I don't know if you could really go from a 450 nit panel to a 600 nit HDR panel with just minor changes, as I am no expert on what is possible.

I know the 34" inches F model offers HDR400, but the G version does not. I am pretty sure both of those monitors are using the "same" panel, but with just some minor adjustments to the panels inside.

So my point is that you could actually say the G and F versions are not using the exact same panel, and I am expecting it will be the same case with this upcoming 38" model.
 
Okay. Thanks for clarifying. In that case you didn't misquote, but now I don't see how anything displayninja wrote backs up your position, because they are talking about monitor variants while you are talking about panel variants. In that case there wasn't much point in linking to the displayninja website in the first place. Unfortunately, with that position I think you're even less likely to be correct, because as far as I can tell, nobody anywhere claims LG is working on a second version of the LM375QW2 (the panel used in the 38GL950G). Source?

We'll have to disagree about me reading way too much into every word. I'd say the exact opposite is true. For example, even with all the goodwill and semantic fudging in the world, I'm still unable to re-interpret displayninja's sentence about nVidia's higher HDR standards in a way that would make it correct. I'm pretty sure you can't re-interpret it into correctness either. There is no need to nitpick any of that to find a mistake; it's just flat-out wrong. I provided those details because just dismissing your source out of hand felt unfair if I didn't also explain why.

PS: Your English is great for a non-native speaker.

Edit: dammit, @Legend beat me to it...

Actually, they are talking about both, or to phrase it 100% correctly, mentioning both, so I am not wrong. And thanks, but I think my English is terrible :p.
 
True, they only list one. I guess there are only going to be slight changes to the Freesync panel that is most likely coming as well, but not enough for it to show up on TFT Central's list of upcoming panels. That list doesn't exclude the possibility that we could be getting a panel with some minor adjustments, though I don't know if you could really go from a 450 nit panel to a 600 nit HDR panel with just minor changes, as I am no expert on what is possible.

I know the 34" inches F model offers HDR400, but the G version does not. I am pretty sure both of those monitors are using the "same" panel, but with just some minor adjustments to the panels inside.

So my point is that you could actually say the G and F versions are not using the exact same panel, and I am expecting it will be the same case with this upcoming 38" model.


Well, with respect to HDR, it really won't make much difference what rating they give it. It's a 450 nit panel and therefore nowhere near bright enough for HDR. It's nothing to get excited over with this monitor, and in fact only something to be discouraged by, as they will no doubt market it as HDR and bump the price up accordingly, when it won't be fit for purpose.
 
Actually, they are talking about both, or to phrase it 100% correctly, mentioning both, so I am not wrong.
Then it would be nice if you'd actually provide the quote. Expecting us to divine what you're referring to, or hoping we can pick the right passage out of such a large article by chance, isn't going to work. Either way, I don't see them talking about both anywhere. Until you provide the exact quote that supports your claim, I'll stick with the position that you're misunderstanding something.

I know the 34" inches F model offers HDR400, but the G version does not. I am pretty sure both of those monitors are using the "same" panel, but with just some minor adjustments to the panels inside.
This is exactly where I suspected your assumptions actually stem from. Those assumptions are wrong. The panels are identical; there are no slight adjustments. The 34GK950G and F also achieve the exact same peak luminance. If there had been any slight adjustment, LG would very likely also have changed the panel name/number, because that almost always necessitates different firmware. Over the years I've seen quite a few people claim that slightly different versions of the same panel exist. So far that has never turned out to be true.

The reason the 34GK950F has a DisplayHDR 400 rating while the 34GK950G does NOT has nothing to do with the panel. Zero. Zilch. Nada. The difference is due to the controllers: nVidia's v1 G-SYNC controller doesn't support HDR10, and monitors that can't accept an HDR10 signal can't support any form of HDR (real, fake, or otherwise). Most likely, LG chose the v1 over the v2 module due to cost, and sacrificed HDR support and refresh rate in the process.

None of that translates to the 38GL950G, however, because IF the 38GL950G supports real G-SYNC, then it must use the v2 G-SYNC module, as the v1 module doesn't have the bandwidth to support 3840x1600@175Hz. The v2 G-SYNC module also supports HDR10.
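
If anyone wants to sanity-check the bandwidth argument, here's a rough back-of-the-envelope sketch. The DP 1.2/DP 1.4 link rates are standard figures, but pairing them with the v1/v2 G-SYNC modules is the commonly reported configuration rather than anything nVidia publishes, and I'm ignoring blanking overhead (real timings add several percent on top):

```python
# Back-of-the-envelope: can a DisplayPort link carry 3840x1600 @ 175 Hz?
# Assumes 8 bit per channel RGB (24 bpp) and ignores blanking overhead,
# which adds several percent on top in real timings.

def required_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s, active pixels only."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

EFFECTIVE_LINK_GBPS = {
    "DP 1.2 / HBR2 (v1 G-SYNC module)": 17.28,  # 4 lanes x 5.4 Gbit/s after 8b/10b
    "DP 1.4 / HBR3 (v2 G-SYNC module)": 25.92,  # 4 lanes x 8.1 Gbit/s after 8b/10b
}

need = required_gbps(3840, 1600, 175)
print(f"required: {need:.2f} Gbit/s (before blanking)")
for link, have in EFFECTIVE_LINK_GBPS.items():
    verdict = "fits" if have >= need else "too slow"
    print(f"{link}: {have:.2f} Gbit/s -> {verdict}")
```

The v1 module's DP 1.2 link falls far short (~17.3 vs ~25.8 Gbit/s), while DP 1.4 only just covers the raw pixel data, which is presumably where reduced-blanking timings come in.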
 
It'll take me that long to get the boss (wife) to approve the expenditure! :p

My wife never approves; I just condition her to the idea by mentioning it a few thousand times till she screams at me one day... "Will you just buy the bloody thing and shut the hell up".
Whatever you say luv... you da boss. ;)

But there is zero chance in hell I'm getting the 38" past her after buying the 34" (I am not even going to try, it would be a suicide mission lol).
 
Well, with respect to HDR, it really won't make much difference what rating they give it. It's a 450 nit panel and therefore nowhere near bright enough for HDR. It's nothing to get excited over with this monitor, and in fact only something to be discouraged by, as they will no doubt market it as HDR and bump the price up accordingly, when it won't be fit for purpose.
A lot of projectors are under 450 nits but gain a massive benefit from HDR. Why would this monitor be any different? I am not saying it will be as good as a high-nit panel, but surely there will still be a decent benefit from HDR even on a sub-450-nit panel?
 
A lot of projectors are under 450 nits but gain a massive benefit from HDR. Why would this monitor be any different? I am not saying it will be as good as a high-nit panel, but surely there will still be a decent benefit from HDR even on a sub-450-nit panel?

If it's 450 nits, cannot hit a cd/m2 peak much higher than that, and lacks any kind of local dimming (which it will), it will be completely ineffective as regards HDR, simple as that. Read this... http://www.tftcentral.co.uk/blog/wh...dr-at-all-and-why-displayhdr-400-needs-to-go/

If, and only if, this monitor is capable of 600 cd/m2 peak output and can achieve the DisplayHDR 600 standard, we might have something worth talking about. Otherwise, forget it.
 
A lot of projectors are under 450 nits but gain a massive benefit from HDR. Why would this monitor be any different? I am not saying it will be as good as a high-nit panel, but surely there will still be a decent benefit from HDR even on a sub-450-nit panel?

Projectors and HDR make me want to pull my hair out.

Can you tell us which make/model of sub-450-nit projector gains a "massive" benefit from HDR? I've yet to see any projector do HDR without completely washing out black levels.

Here are some things you may want to consider:

TFTCentral (why DisplayHDR 400 is a joke)
CNET (why projectors have difficulty with HDR)
 
Projectors and HDR make me want to pull my hair out.

Can you tell us which make/model of sub-450-nit projector gains a "massive" benefit from HDR? I've yet to see any projector do HDR without completely washing out black levels.

Here are some things you may want to consider:

TFTCentral (why DisplayHDR 400 is a joke)
CNET (why projectors have difficulty with HDR)
The new generation of projectors looks better in HDR than without it... like the new BenQ W2700/HT3550 or the BenQ W5700/HT5550 (for some reason they change the name based on region). If those work, I don't understand why monitors would not be the same. Surely there would be some benefit on a monitor, even if it's not as good as a high-nit panel.
 
If it's 450 nits, cannot hit a cd/m2 peak much higher than that, and lacks any kind of local dimming (which it will), it will be completely ineffective as regards HDR, simple as that. Read this... http://www.tftcentral.co.uk/blog/wh...dr-at-all-and-why-displayhdr-400-needs-to-go/

If, and only if, this monitor is capable of 600 cd/m2 peak output and can achieve the DisplayHDR 600 standard, we might have something worth talking about. Otherwise, forget it.
I don't understand... a projector is effectively a low-nit screen, yet the new generation of projectors shows a benefit with HDR without local dimming. Why couldn't this monitor be like that?
 
I don't understand... a projector is effectively a low-nit screen, yet the new generation of projectors shows a benefit with HDR without local dimming. Why couldn't this monitor be like that?


A projector is not a monitor... they are completely different technologies (not to mention utilised in very different ways), and I have yet to see a review of a reasonably priced projector that does HDR justice. Even top-end ones can struggle, and require a bat-cave environment. It's self-evident from the abundance of DisplayHDR 400 monitors out there currently that the standard isn't fit for purpose. Only at the DisplayHDR 600 point does it become more worthwhile. Read the article posted above if you aren't clear on why.
 
A projector is not a monitor... they are completely different technologies (not to mention utilised in very different ways), and I have yet to see a review of a reasonably priced projector that does HDR justice. Even top-end ones can struggle, and require a bat-cave environment. It's self-evident from the abundance of DisplayHDR 400 monitors out there currently that the standard isn't fit for purpose. Only at the DisplayHDR 600 point does it become more worthwhile. Read the article posted above if you aren't clear on why.
Different technologies in how they produce light, but the principle is the same. If they can benefit, then there is no reason I can see why this screen cannot benefit in the same way. I have read that link, and it seems to suggest that higher nits are not that important and in fact can be bad for a gamer who sits close to the screen. Lower-nit panels can still benefit and look better with HDR than without. You don't need the full DisplayHDR 600 spec; failing to hit it because your peak nits are too low, while hitting all the other requirements, should still make for a good gaming HDR panel.
 
Different technologies in how they produce light, but the principle is the same. If they can benefit, then there is no reason I can see why this screen cannot benefit in the same way. I have read that link, and it seems to suggest that higher nits are not that important and in fact can be bad for a gamer who sits close to the screen. Lower-nit panels can still benefit and look better with HDR than without. You don't need the full DisplayHDR 600 spec; failing to hit it because your peak nits are too low, while hitting all the other requirements, should still make for a good gaming HDR panel.


Yet here we are with no decent implementation of HDR in a monitor with such low nit levels. And HDR implementation in projectors is still a way behind where it needs to be... I don't know where you're getting your info from, as I have done a lot of projector research, and while they have improved over the years and some do a decent job (more expensive ones, not cheap models), they still don't even get close to what top-end HDR TVs can produce... and again, you need a BAT CAVE environment anyway. No way can a projector produce satisfactory HDR in your average ambient-lit room.
 
Yet here we are with no decent implementation of HDR in a monitor with such low nit levels. And HDR implementation in projectors is still a way behind where it needs to be... I don't know where you're getting your info from, as I have done a lot of projector research, and while they have improved over the years and some do a decent job (more expensive ones, not cheap models), they still don't even get close to what top-end HDR TVs can produce... and again, you need a BAT CAVE environment anyway. No way can a projector produce satisfactory HDR in your average ambient-lit room.
Like I said, the new generation of projectors just coming out, like the BenQ W2700/HT3550 or the BenQ W5700/HT5550, seems to have changed all that. The same could apply to new monitors. Just because there have been no decent low-nit HDR monitors in the past doesn't mean all future low-nit monitors will not benefit from HDR. For gaming you don't need, and in fact most likely want to avoid, high nits. It's more the other elements of the HDR spec that make the difference. As long as this screen hits the other specs, 450 nits should be just about in the sweet spot for gaming.

I wouldn't write off the new LG panels for HDR yet, though I wouldn't rush out and buy one for HDR either until more testing has been done.
 
Like I said, the new generation of projectors just coming out, like the BenQ W2700/HT3550 or the BenQ W5700/HT5550, seems to have changed all that. The same could apply to new monitors. Just because there have been no decent low-nit HDR monitors in the past doesn't mean all future low-nit monitors will not benefit from HDR. For gaming you don't need, and in fact most likely want to avoid, high nits. It's more the other elements of the HDR spec that make the difference. As long as this screen hits the other specs, 450 nits should be just about in the sweet spot for gaming.

I wouldn't write off the new LG panels for HDR yet, though I wouldn't rush out and buy one for HDR either until more testing has been done.


Again though, you're comparing two disparate technologies that neither function nor are used in the same way. I really don't see the value or point in comparing... it would be like comparing the comfort of a gaming chair to a sofa! Latency is also a big issue with projectors... they generally aren't great to game on, more so if you're used to a low-latency monitor. Some do a decent job, but again, if you're used to a high-refresh gaming monitor, it's far from ideal, and no projector has A-Sync technology either. Projectors are primarily going to be utilised for film/TV viewing, and that's what they are generally geared up for. This LG monitor is clearly aimed at gamers.

There's nothing in this LG panel which suggests it will implement HDR any better than anything else. HDR has a ways to go yet, and the shady marketing practices surrounding it certainly don't help. You DO need high-nit panels for specular highlights, though. HDR is never going to see a great implementation with low nits, not as things currently stand anyway. You only need to see how amazing HDR looks with properly created content on a top-end high-nit TV to realise this. No projector looks that good. A projector does immersion far more successfully though, given how large an area it can project onto. That can actually be preferable to any TV, depending on the content being viewed. But we're talking about the 'experience' there, which is something else.
 
Again though, you're comparing two disparate technologies that neither function nor are used in the same way. I really don't see the value or point in comparing... it would be like comparing the comfort of a gaming chair to a sofa! Latency is also a big issue with projectors... they generally aren't great to game on, more so if you're used to a low-latency monitor. Some do a decent job, but again, if you're used to a high-refresh gaming monitor, it's far from ideal, and no projector has A-Sync technology either. Projectors are primarily going to be utilised for film/TV viewing, and that's what they are generally geared up for. This LG monitor is clearly aimed at gamers.

There's nothing in this LG panel which suggests it will implement HDR any better than anything else. HDR has a ways to go yet, and the shady marketing practices surrounding it certainly don't help. You DO need high-nit panels for specular highlights, though. HDR is never going to see a great implementation with low nits, not as things currently stand anyway. You only need to see how amazing HDR looks with properly created content on a top-end high-nit TV to realise this. No projector looks that good. A projector does immersion far more successfully though, given how large an area it can project onto. That can actually be preferable to any TV, depending on the content being viewed. But we're talking about the 'experience' there, which is something else.
Latency and the other elements don't really matter for this discussion; this isn't a thread about projectors vs monitors. I was using the new generation of projectors as an example of how low nits and the absence of local dimming don't mean HDR cannot work and will be bad. The new technology in the latest projectors has made HDR worthwhile on low-nit screens without local dimming, and there is no reason why a new generation of monitors cannot follow the same path.

It's not a good idea to just look at old monitors and conclude all future ones will be rubbish at low nits just because old ones were. It's too early to write off this monitor for HDR. 450 nits can be more than enough for good gaming HDR as long as the rest of the specs are good enough.
 