LG 38GL950G - 3840x1600/G-Sync/144Hz

Given the 34GK950G is £1200, circa £1500 seems plausible, but who knows... if it's nearer £2K that's going to be painful, but it'll have the market to itself.
 
Also, realistically this will require a 2080/2080Ti or higher, which is an unfortunate circumstance...

A 1080Ti would do the job nicely. Even a 1080 wouldn't be too shabby, but if you're wanting to push the 144Hz, yeah, 2080Ti ideally. At the price this is going to be though, it's not for the budget gamer.
 
So apologies if this repeats any info, but here are some specs I can release for this model:

38" Ultrawide 3840x1600 Res
450 nits
DCI-P3 98% coverage from Nano-IPS display (I think)
HDMI, DP & USB
G-Sync
144Hz
Sphere Lighting (like on 32GK850G and 34GK950G)
Dynamic Action Sync, Black Stabilizer

Is this native 144Hz? It would be nice to have no overclocking nonsense confusing everyone.
 
In regard to HDR:

nVidia's older v1 G-SYNC module is limited to 3440x1440@120Hz. Even if a 3440x1440 G-SYNC monitor includes a panel that can reach beyond 120 Hz, the v1 G-SYNC module will limit the monitor to 120 Hz regardless (see 34GK950G).

It follows that 3840x1600@144Hz (as supported by the 38GL950G) is beyond the capabilities of the v1 G-SYNC module, meaning this monitor will ship with the newer v2 G-SYNC module. The v2 G-SYNC module supports HDR (or more precisely, the v2 G-SYNC module supports the HDR10 protocol).
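
For anyone wanting numbers, a rough bandwidth sanity check backs this up (assuming 8 bpc RGB and ignoring blanking, so real-world figures are slightly higher):

3840 x 1600 x 144 Hz x 24 bit ≈ 21.2 Gbit/s of pixel data
DP1.2 (HBR2) payload: 4 lanes x 5.4 Gbit/s x 0.8 (8b/10b encoding) = 17.28 Gbit/s
DP1.4 (HBR3) payload: 4 lanes x 8.1 Gbit/s x 0.8 (8b/10b encoding) = 25.92 Gbit/s

So at full RGB the signal doesn't even fit into DP1.2, never mind the v1 module's own limits, whereas a DP1.4-based module has headroom to spare.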

If a monitor combines a newer panel with an HDR10-capable controller, then nothing prevents the manufacturer from at least slapping a DisplayHDR 400 badge on it. Combining this panel with the v2 G-SYNC module already gives LG everything necessary to achieve that HDR certification level, and it does so without LG having to invest a penny more in engineering.

Two assumptions:
  • nVidia doesn't release an entirely new revision of their v1 DP1.2 G-SYNC module that supports 3840x1600@144Hz (DP1.2 only has enough bandwidth for that resolution and refresh rate with chroma subsampling, and in any case DP1.2 doesn't support HDR10).
  • LG won't, for no good reason, pass up the marketing opportunity to slap an HDR badge on any monitor that qualifies for one.
Without a FALD backlight a DisplayHDR 1000 certification is off the table. LG might go with a DisplayHDR 600 or 400 certification, or forgo the official VESA certifications entirely and just slap their own "HDR capable" badge on it. If the above two assumptions are true, then LG is practically guaranteed to go with one of those options. That this monitor, with a v2 G-SYNC module, isn't at least marketed as "HDR capable" is almost unthinkable.

Of course DisplayHDR 400 shouldn't be taken seriously. Even DisplayHDR 600 barely deserves to be called HDR, but that is a different topic.


I think we'll start to see microwaves marketed as HDR soon. The whole thing is a complete joke to be honest. Most people on this forum are wise to it of course, but the majority of consumers are not, and it only gives manufacturers the green light to whack a premium on every 'HDR' product they sell.

I find it laughable that VESA claim to be a non-profit organisation... there is little doubt in my mind that some fat envelopes were passed under the table in order to get these so-called 'standards' pushed through. The only ones that benefit are the manufacturers themselves. It's a sad state of affairs really. :(
 
So far we've only seen G-Sync with HDR where FALD is used, and like I said before, the current v2 module is, as far as I know, designed for FALD. If FALD isn't going to be used here, which is pretty certain, then that implies to me that something new and different would be needed to pair it with an edge-lit local dimming backlight.


The upcoming Acer XB273K is obviously using the G-Sync v2 module, but isn't FALD, so clearly it can be used with a non-FALD monitor. The XB273K is HDR 400 (which as you say is meaningless and pointless in an HDR sense). I would expect the LG 38GL950G to come in at HDR 400 also, if the panel is already known to only be 450 cd/m2.
 
I know the v2 module can be used with other backlighting, and am very aware of the Acer XB273K as a prime example. But that model is not offering any local dimming, only standard backlight operation. It's the local dimming that I was suggesting would be omitted here on the 38GL950G, leaving it without any meaningful HDR as a result.


Just out of curiosity, is HDR without FALD always going to be largely pointless, regardless of panel brightness? Are there any examples of monitors (or TVs) without FALD that have a meaningful HDR impact?
 
If more adaptive sync screens can offer a decent and reliable VRR experience, maybe longer term there will be less need for G-Sync. But given NVIDIA have only deemed 15 of 400-odd FreeSync screens suitable for the new G-Sync Compatible certification, it gives you a feel for how many are not up to scratch for use with an NVIDIA card, or rather just not ideal for VRR overall. There's more to it than just being able to support "some" VRR. A G-Sync model will still offer certified performance with a wide Hz range, and often (in fact, in my experience, normally) they have better overdrive control, fewer bugs with overdrive settings, and a pretty much guaranteed next-to-no-lag experience. That's not something you get on many FreeSync screens.

I don't doubt that not every FreeSync monitor will be plug and play with an Nvidia card... plenty won't offer the same experience as a proper G-Sync module-equipped monitor. I think it's good if Nvidia are going to be stringent with their certification though... I guess we'll have to see how the monitors they've passed cope, but hopefully, in respect of the factors you mention, they do perform on a par with what a G-Sync enabled monitor would. If not, I do wonder what the point of this whole process is. Everyone is just going to be miffed at Nvidia for certifying a monitor which is barely up to the task. Who wins in that equation?

Are you planning on doing some testing as and when the drivers drop and (if) you can get your hands on some of the certified monitors? I certainly wouldn't take Nvidia's word for it, so I'm sure the community would be very appreciative of an unbiased third party looking at how they shape up.
 
To be fair, it wouldn't be the first time LG have messed up monitor specs. From past experience, they have their intern's cousin's sister's dog write their marketing blurbs for them. :p
 
Oh yeah, whoops, I mean the LG will have the DP 1.4 G-Sync chip, but not the HDR part. IMO it will be the G-Sync "Ultimate" FPGA but without the FALD backlight to support real HDR.

They are separate things though... the newer G-Sync module can support HDR-1000, but only if the panel has that tech built in... i.e. FALD lighting zones, the required brightness etc. In the absence of those, it will just be a regular G-Sync monitor, but clearly, due to the bandwidth requirements, this will be the newer module, as the older one will not do 175Hz @ 3840x1600.

It will be interesting to see how this stacks up against the XG438Q... although price-wise there may be a massive gulf. In terms of PPI they will be virtually the same (110 on the LG vs 104 on the Asus), but the LG will be slightly easier to run and, thanks to that 175Hz OC, obviously able to run much faster. However, IPS means less vibrancy vs the VA panel on the XG438Q, which also has HDR-600 to bolster that. Factor in IPS glow/bleed as well, and the Asus could end up looking quite a bit nicer... however, VA often suffers from ghosting/smearing, so that could ruin it.
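
For anyone wanting to check the pixel density maths, it's just the diagonal pixel count divided by the diagonal size in inches:

PPI = sqrt(3840^2 + 1600^2) / 38 = 4160 / 38 ≈ 110 for the 38GL950G

Running the same sum for a 43" 3840x2160 panel lands in the low 100s, hence "virtually the same".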

We shall just have to wait and see...

:rolleyes:
 
Yes, I am aware of that. My point is the expensive $500 Intel Altera Arria 10 GX480 FPGA DP 1.4 G-Sync module has to be in this display, increasing the cost. I highly doubt NVIDIA has made yet another G-Sync module. A cost that is somewhat wasted seeing as the display has no meaningful HDR.


Given the complete absence of HDR, it does raise the question of why this is G-Sync at all, in light of FreeSync now being supported by Nvidia GPUs. By the time this monitor is released (Q4 2019 seems likely), and at the price point it will no doubt be at, I am not sure it will be seen as offering particularly good value at all. Quite the opposite, possibly. :rolleyes:
 
This won't be Q2 or even Q3 I suspect, definitely Q4 and may even slip into 2020, based on past form from LG and every other major monitor release of the past few years. I'd like to be wrong, but everyone should expect the worst here.
 
My source: https://www.displayninja.com/new-monitors-in-2019/
Won't be only one version of this panel, if LG haven't changed their mind.

That website isn't a "source"... it's a guy writing stuff on his website, which anyone can do lol! Where's he getting his info from? As a5cent mentions, this could still just be G-Sync 'Compatible' (i.e. FreeSync), but there are arguments (and evidence) for both possibilities. We just won't know for sure until closer to launch. Even if full-fat G-Sync is the intention though, LG could change their mind if they discover the monitor is going to come in crazy expensive. A FreeSync version will undoubtedly be cheaper and they may see more business sense in going with that alone... we shall see. There is no denying that now Nvidia supports FreeSync, there is a much more level playing field between a G-Sync and a FreeSync monitor... yet one is still going to be far more expensive.

As to HDR, Daniel from LG has confirmed in this thread that the panel is 450 nits... he's a more reliable source than that website you've quoted. If that's the case, this monitor is effectively useless for HDR content, and should be disregarded by anyone looking for that functionality.
 
I didn't really misquote, as I said "Won't be only one version of this panel, if LG haven't changed their mind." You see, I am not saying monitor, I am just talking about panels.
It is confirmed that there will be another panel that will be 38", if my source isn't lying, but I think you can find other sources with the same information if you don't trust the source I used.

Also, you are reading way too much into every word, just to try to find something wrong in the person's text. Sure, my text is probably not the best way to phrase it, but I am also not a native English speaker.


What is your "source" for another 38" panel? From all I've seen, there is only one 38" panel... the LM375QW2... which is the panel going into this monitor. It's the only one listed on TFT Central's panel database (a reliable source), and I've not seen mention of another one elsewhere. The other 38" panels listed on there are the older 60Hz/75Hz ones from a couple of years ago.
 
True, they only list one. I guess there are only going to be slight changes to the FreeSync panel that is most likely coming as well, but not enough for it to show up on TFT Central's list of upcoming panels. That list doesn't exclude the possibility of a panel with some minor adjustments, though I don't know if you could really go from a 450 nit panel to a 600 nit HDR panel with just minor changes, as I am no expert in what is possible.

I know the 34" F model offers HDR 400, but the G version does not. I am pretty sure both of those monitors are using the "same" panel, but with some minor adjustments inside.

So my point is that you could actually say the G and F versions are not using the exact same panel, and I expect it will be the same case with this upcoming 38" model.


Well, with respect to HDR, it really won't make much difference what they 'rate' it. It's a 450 nit panel and therefore nowhere near bright enough for HDR. It's nothing to get excited over with this monitor, and in fact only something to be discouraged by, as they will no doubt market it as such and bump the price up accordingly, when it won't be fit for purpose.
 
A lot of projectors are under 450 nits but gain a massive benefit from HDR. Why would this monitor be any different? I am not saying it will be as good as a high-nit panel, but surely there will still be a decent benefit from HDR even on a sub-450 nit panel?

If it's 450 nits, cannot hit a cd/m2 peak much higher than that, and lacks any kind of local dimming (which it will), it will be completely ineffective as regards HDR, simple as that. Read this... http://www.tftcentral.co.uk/blog/wh...dr-at-all-and-why-displayhdr-400-needs-to-go/

If, and only if, this monitor is capable of 600 cd/m2 peak output, and can achieve the DisplayHDR 600 standard, we might have something worth talking about. But otherwise, forget it.
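
To put a rough number on why peak brightness alone doesn't get you there: a typical IPS panel has a native contrast of around 1000:1, so a 450 cd/m2 peak implies a black level of roughly 450 / 1000 = 0.45 cd/m2. Without local dimming, driving the backlight harder for highlights lifts the black floor by the same factor, so the on-screen dynamic range stays pinned at that same ~1000:1 however bright the panel gets. Local dimming (ideally FALD) is what lets a display hold a high peak while pushing blacks far lower in dark areas of the image.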
 
I don't understand: a projector is effectively a low-nit screen, yet the new generation of projectors shows a benefit with HDR without local dimming. Why couldn't this monitor be like that?


A projector is not a monitor... they are completely different technologies (not to mention utilised in very different ways), and I have yet to see a review of a reasonably priced projector that does HDR justice. Even top-end ones can struggle, and require a bat-cave environment. It's self-evident from the abundance of DisplayHDR 400 monitors out there currently that it isn't fit for purpose. Only at the DisplayHDR 600 point does it become more worthwhile. Read the article posted above if you aren't clear on why.
 
Different technologies in how they produce light, but the principle is the same. If they can benefit, then there is no reason I can see why this screen cannot benefit in the same way. I have read that link, and it seems to suggest higher nits are not that important and in fact can be bad for a gamer who sits close to the screen. Lower-nit panels can still benefit and look better with HDR than without. You don't need the full DisplayHDR 600 point. Failing to hit the DisplayHDR 600 spec because your peak nits are too low, but hitting all the other specs, should still make for a good gaming HDR panel.


Yet here we are with no decent implementation of HDR in a monitor with such low nit levels. And HDR implementation in projectors is still a way behind where it needs to be... I don't know where you're getting your info from, as I have done a lot of projector research, and while they have improved over the years and some do a decent job (more expensive ones, not cheap models), they still don't get close to what top-end HDR TVs can produce... and again, you need a BAT CAVE environment anyway. No way can a projector produce satisfactory HDR in your average ambient-lit room.
 
Like I said, the new generation of projectors that is just coming out, like the BenQ W2700/HT3550 or the BenQ W5700/HT5550, seems to have changed all that. The same could apply to new monitors. Just because there have been no decent low-nit HDR monitors in the past, it doesn't mean all future low-nit monitors will not benefit from HDR. For gaming you don't need, and in fact most likely want to avoid, high nits. It's more the other elements of the HDR spec that make the difference. As long as this screen hits the other specs, 450 nits should be just about in the sweet spot for gaming.

I wouldn't write off the new LG panels for HDR yet. Though I wouldn't rush out and buy it for HDR either, until more testing has been done.


Again though, you're comparing two disparate technologies which neither function nor are used in the same way. I really don't see the value or point in comparing. It would be like comparing the comfort of a gaming chair to a sofa! Latency is also a big issue with projectors... they generally aren't great to game on, more so if you're used to a low-latency monitor. Some do a decent job, but again, if you're used to a high-refresh gaming monitor, it's far from ideal, and no projector has A-Sync technology either. Projectors are primarily going to be utilised for film/TV viewing, and that's what they are generally geared up for. This LG monitor is more for gamers, clearly.

There's nothing in this LG panel which suggests it will implement HDR any better than anything else. HDR has a ways to go yet, and the shady marketing practices surrounding it certainly don't help. You DO need high-nit panels for specular highlights though. It's never going to see a great implementation with low nits, not as things currently stand anyway, it just isn't. You only need to see how amazing HDR looks with properly created content on a top-end, high-nit TV to realise this. No projector looks that good. A projector does immersion far more successfully though, given how large an area the image can be projected onto. This can actually be preferable to any TV, depending on the content being viewed. But we're talking about the 'experience' there, which is something else.
 