LG 38GL950G - 3840x1600/G-Sync/144Hz

Man of Honour
Joined
12 Jan 2003
Posts
20,567
Location
UK
Agreed. There's no indication that it will feature HDR 1000, so I don't believe it will be G-Sync Ultimate certified. Just regular G-Sync.
 
Associate
Joined
12 Dec 2010
Posts
1,837
Location
Washington D.C.
Oh, whoops, I meant the LG will have the DP 1.4 G-Sync chip, but not the HDR part. IMO it will use the G-Sync "Ultimate" FPGA, but without the FALD backlight needed to support real HDR.
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
Oh, whoops, I meant the LG will have the DP 1.4 G-Sync chip, but not the HDR part. IMO it will use the G-Sync "Ultimate" FPGA, but without the FALD backlight needed to support real HDR.

They are separate things though... the newer G-Sync module can support HDR-1000, but only if the panel has that tech built in, i.e. FALD lighting zones, the required brightness, etc. In the absence of those, it will just be a regular G-Sync monitor. Given the bandwidth requirements, though, this will clearly be the newer module, as the older one cannot do 175Hz @ 3840x1600.
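
To put rough numbers on that bandwidth point, here's a quick back-of-envelope Python sketch. The payload figures are the standard HBR2/HBR3 numbers; the ~12% blanking overhead is my own assumption, so treat the output as a rough guide. Even 144Hz is well beyond DP 1.2, and 175Hz at 8-bit RGB sits right at the edge of DP 1.4, so the OC presumably relies on tighter-than-standard timings.

```python
def required_gbps(w, h, hz, bpp=24, blank=1.12):
    """Approximate uncompressed video data rate in Gbit/s.

    blank=1.12 assumes ~12% blanking overhead (reduced-blanking
    territory); real monitor timings vary, so this is a rough guide.
    """
    return w * h * hz * bpp * blank / 1e9

DP12_PAYLOAD = 17.28  # Gbit/s usable over DP 1.2 (HBR2, after 8b/10b coding)
DP14_PAYLOAD = 25.92  # Gbit/s usable over DP 1.4 (HBR3, after 8b/10b coding)

for hz in (144, 175):
    need = required_gbps(3840, 1600, hz)
    print(f"{hz} Hz needs ~{need:.1f} Gbit/s | "
          f"DP 1.2: {'fits' if need <= DP12_PAYLOAD else 'over'} | "
          f"DP 1.4: {'fits' if need <= DP14_PAYLOAD else 'over'}")
```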

It will be interesting to see how this stacks up against the XG438Q... although price-wise there may be a massive gulf. In terms of PPI they will be virtually the same (110 on the LG vs 104 on the Asus), but the LG will be slightly easier to run and, thanks to that 175Hz OC, able to run much faster. However, IPS means less vibrancy than the VA panel on the XG438Q, which also has HDR-600 to bolster it. Factor in IPS glow/bleed as well, and the Asus could end up looking quite a bit nicer... though VA often suffers from ghosting/smearing, so that could ruin it.
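
For what it's worth, those PPI figures are just Pythagoras; a quick sketch (spec-sheet sizes, no curvature correction) lands within a couple of PPI of the numbers above, depending on the exact diagonal you plug in:

```python
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diag_in

print(f'38GL950G (38", 3840x1600): {ppi(3840, 1600, 38):.0f} PPI')
print(f'XG438Q   (43", 3840x2160): {ppi(3840, 2160, 43):.0f} PPI')
```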

We shall just have to wait and see...

:rolleyes:
 
Associate
Joined
12 Dec 2010
Posts
1,837
Location
Washington D.C.
Yes, I am aware of that. My point is that the expensive $500 Intel (formerly Altera) Arria 10 GX480 FPGA DP 1.4 G-Sync module has to be in this display, adding to its cost. That cost is somewhat wasted, seeing as the display has no meaningful HDR. I highly doubt NVIDIA has made yet another G-Sync module.
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
Yes, I am aware of that. My point is that the expensive $500 Intel (formerly Altera) Arria 10 GX480 FPGA DP 1.4 G-Sync module has to be in this display, adding to its cost. That cost is somewhat wasted, seeing as the display has no meaningful HDR. I highly doubt NVIDIA has made yet another G-Sync module.


Given the complete absence of HDR, it does raise the question of why this is G-Sync at all, now that FreeSync is supported by Nvidia GPUs. By the time this monitor is released (Q4 2019 seems likely), and at the price point it will no doubt command, I am not sure it will be seen as offering particularly good value at all. Quite the opposite, possibly. :rolleyes:
 
Associate
Joined
29 May 2018
Posts
146
Yes, I am aware of that. My point is that the expensive $500 Intel (formerly Altera) Arria 10 GX480 FPGA DP 1.4 G-Sync module has to be in this display.

Yup, but I think you missed the counterpoint being made:

Some monitors will include the Arria 10 GX480 FPGA because they need DP1.4 bandwidth. They may even achieve a DisplayHDR 600 rating, but they will still "only" be G-SYNC monitors (not Ultimate) because they won't achieve 1000 nits peak luminance.

If we're talking about the G-SYNC hardware specifically, nVidia's monitor branding is now more confusing than helpful.

I guess it's probably best to refer to these modules as the v1 and v2 G-SYNC modules (or similar) and not use nVidia's marketing terminology.

That's the point I was trying to get across.
 
Associate
Joined
29 May 2018
Posts
146
Given the complete absence of HDR, it does raise the question of why this is G-Sync at all, now that FreeSync is supported by Nvidia GPUs. By the time this monitor is released (Q4 2019 seems likely), and at the price point it will no doubt command, I am not sure it will be seen as offering particularly good value at all. Quite the opposite, possibly. :rolleyes:

Agreed, although I'd say it raises that question in the absence of FALD (FALD is the only big panel feature FreeSync controllers currently can't handle; non-FALD HDR is something FreeSync 2 does just fine).

From a pricing perspective, it seems wasteful to add $500 to a monitor's cost for non-FALD G-SYNC VRR when you can get a similar (if not always quite as good) VRR experience by adding only $2 for a FreeSync controller.
 
Associate
Joined
29 May 2018
Posts
146
@Daniel - LG

These are the questions raised in the thread so far:
  1. Can you confirm the July/August release date for the UK?
  2. On LG's web page for the 38GL950G, one of the key features mentioned is NVIDIA G-SYNC Compatible. LG's plaque at CES 2019 mentioned only G-SYNC. As we've since learned, those are two different things: a G-SYNC Compatible certified monitor lacks the G-SYNC hardware, meaning it's actually a VESA Adaptive-Sync (a.k.a. FreeSync) monitor. Which of the two is correct? Is it real G-SYNC, or will it be G-SYNC Compatible certified?
  3. IF the 38GL950G is a real G-SYNC monitor (NOT G-SYNC Compatible), will LG eventually release an "F" version?
  4. IF the 38GL950G is a real G-SYNC monitor (NOT G-SYNC Compatible), will it require a fan/blower to cool the v2 G-SYNC module (like Acer's X27 or Asus' PG27UQ)?
  5. Any new info on eventual HDR support?
 
Associate
Joined
15 Feb 2015
Posts
1,064
I'm going to assume that it's actually a real G-SYNC monitor, which is a bit of a shame really given this still locks you into Nvidia; hopefully it's not a big job for LG to also release a FreeSync version that is G-SYNC Compatible.

Really hoping that it doesn't have or require active cooling :rolleyes:

I also wouldn't be that surprised if other manufacturers ship monitors based on this panel, possibly before LG manages to ship this :p
 
Associate
Joined
29 May 2018
Posts
146
G-Sync is still better and NVIDIA are the only ones who have a GPU fast enough to do this resolution/refresh rate justice anyway.

True. However, with the exception of FALD (AFAIK no FreeSync controller supports FALD), I think it's important to understand that there is no technical reason for FreeSync to be inferior.
  • A monitor OEM can provide an equally good implementation of pixel overdrive (reduces blur) for a FreeSync monitor as they can for a G-SYNC monitor.
  • A monitor OEM can purchase a fast FreeSync controller and optimize their firmware to reduce input lag to levels similar to those typical of G-SYNC monitors.
  • A monitor OEM can purchase a capable FreeSync controller and optimize their firmware to support LFC (low framerate compensation), thereby matching the VRR range of G-SYNC monitors (see the sketch below).
My point is that the quality differences don't stem from technical limitations inherent to FreeSync. What it comes down to is nVidia's strict certification program and the fact that a FreeSync monitor requires a larger engineering effort from the OEM (which they often skimp on). With FreeSync 2, AMD has finally introduced its own certification requirements for monitors. While I think it has yet to be proven in practice, at least technically, "FreeSync 2" monitors could perform just as well as their G-SYNC counterparts.
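
To illustrate the LFC point from the list above, here's a minimal Python sketch of the idea; the 48-144Hz range below is a made-up panel, not any particular monitor. When the frame rate drops below the panel's minimum VRR refresh, the scaler repeats each frame enough times to land back inside the supported range:

```python
def lfc_refresh(fps, vrr_min=48, vrr_max=144):
    """Return (repeat_count, effective_hz) a scaler might choose.

    vrr_min/vrr_max are a hypothetical panel's VRR limits, not any
    specific monitor's.
    """
    if fps >= vrr_min:
        return 1, min(fps, vrr_max)  # inside the native range: 1:1
    mult = 1
    while fps * mult < vrr_min and fps * (mult + 1) <= vrr_max:
        mult += 1
    return mult, fps * mult  # each frame is displayed `mult` times

for fps in (30, 40, 60, 120):
    mult, hz = lfc_refresh(fps)
    print(f"{fps} fps -> each frame shown x{mult}, panel runs at {hz} Hz")
```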

I don't know if a single monitor can be certified as both "FreeSync 2" and "G-SYNC Compatible". If that is possible, and most monitors end up carrying both certifications, we might not be able to tell the difference between a G-SYNC Compatible monitor and a real G-SYNC monitor, because the FreeSync 2 certification could mandate the same level of quality nVidia reserves for real G-SYNC.

This is all speculative because for now there is very little to go on. I'm just saying the differences between FreeSync and G-SYNC aren't set in stone and could change quickly.
 
Associate
Joined
21 Nov 2018
Posts
5
Can I get some opinions on the 34GK950F vs this monitor? The 38GL950G is better in every way (as of this moment, since we don't know the input lag info), but I can get the 34GK950F for $799 and don't have to pay tax. Let's assume the 38GL950G will be $1500 USD; will that $700 price difference be worth it? I'll be using the monitor mainly for gaming, and I have a 1080 Ti. TIA
 
Associate
Joined
13 Aug 2012
Posts
253
You cleverly put ultimate in quotes ;)

As far as I can tell nVidia's monitor branding has almost no direct correlation to hardware:

G-SYNC Compatible > Despite the name, need not use either G-SYNC module. Must pass nVidia's basic VRR tests. Need not support HDR.
G-SYNC > Can use either the v1 or v2 G-SYNC module. Must pass nVidia's quality tests. Need not support HDR.
G-SYNC Ultimate > Must use the v2 G-SYNC module. Must pass nVidia's quality tests. Must support HDR at a peak brightness of 1000 nits.

Because G-SYNC can imply either the v1 or v2 G-SYNC module, the 38GL950G is really just G-SYNC (assuming it's not actually G-SYNC Compatible). At least that's what everything I've read implies.

Which monitors use module v2?
 
Associate
Joined
17 Aug 2018
Posts
209
Location
Edmonton, Alberta, Canada
Can I get some opinions on the 34GK950F vs this monitor? The 38GL950G is better in every way (as of this moment, since we don't know the input lag info), but I can get the 34GK950F for $799 and don't have to pay tax. Let's assume the 38GL950G will be $1500 USD; will that $700 price difference be worth it? I'll be using the monitor mainly for gaming, and I have a 1080 Ti. TIA

Ultimately the main differences between it and the 34GK950F will be that it's more expensive (unknown by how much), 4" bigger, and harder to drive. IMO 1080 Tis are already at their limits with 3440x1440 if 100fps gaming is your target. If you're OK with closer to 50-80 fps on your 38" 144Hz monitor then your 1080 Ti would be OK, but I know if it were me I'd be looking at a 2080 Ti minimum for any resolution above 3440x1440...
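
The "harder to drive" part is just pixel-count arithmetic; a rough sketch, assuming GPU load scales roughly with pixels rendered per frame (which real games only approximate):

```python
uw34 = 3440 * 1440  # 34GK950F
uw38 = 3840 * 1600  # 38GL950G

print(f'34" UW: {uw34 / 1e6:.2f} MPix, 38" UW: {uw38 / 1e6:.2f} MPix')
print(f'the 38" pushes ~{100 * (uw38 / uw34 - 1):.0f}% more pixels per frame')
```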

Also, I would encourage you to see a 38" monitor in person first. I was absolutely convinced I wanted this same monitor when I heard the announcement, but then I went to my local Best Buy to see a 38" ultrawide in person (an older LG model) and they are BIIIIIIIG... almost too big, IMO, for a monitor sitting 20-24 inches from your face. I am still considering it, but it's no longer a home-run purchase; I need to see if I have the room to set it back an appropriate distance on my desk (probably about 36 inches away, IMO).
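
If it helps anyone else gauge the size, here's the geometry behind that feeling: a quick sketch of how much of your horizontal view a 38" 24:10 panel fills at different distances (flat-screen approximation, ignoring the curve):

```python
import math

# a 38" 24:10 (3840x1600) panel is roughly 35.1" wide
panel_w = 38 * 3840 / math.hypot(3840, 1600)

def hfov_deg(width_in, dist_in):
    """Horizontal angle the screen subtends at a given viewing distance."""
    return math.degrees(2 * math.atan(width_in / (2 * dist_in)))

for d in (20, 24, 36):
    print(f'at {d}" away the screen spans ~{hfov_deg(panel_w, d):.0f}° of view')
```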
 
Associate
Joined
29 May 2018
Posts
146
Which monitors use module v2?
The ones I can think of off the top of my head are:
- X27
- PG27UQ
- XB273K
The upcoming:
- X35
- PG35VQ
will also use the v2 module. Some believe the 38GL950G will as well.

Any time you see real G-SYNC (not "G-SYNC Compatible") paired with any sort of HDR capability or DP 1.4, it currently means the monitor is using the v2 module, as the v1 supports neither (only SDR and DP 1.2).
 
Associate
Joined
28 Feb 2019
Posts
3
This is definitely **the** monitor I've been waiting for, but if it's going to cost >= $1,500, I might as well get 2x 27'' now (one 144Hz+ and the other just 60Hz) and then wait for these to come down to 38UC99 (or Dell's U3818DW) pricing (now under $900 most of the time)… That's actually one of the reasons I'd rather get a FreeSync version of this monitor and save the G-Sync tax; I know G-Sync is better, but I wouldn't really take advantage of it anyway. I just want the 144Hz+ @ 3840x1600 more than I want adaptive sync… ;)
 