LG 38GL950G - 3840x1600/G-Sync/144Hz

Man of Honour
Joined
12 Jan 2003
Posts
20,567
Location
UK
Ok thanks, hadn't seen that before. That menu software matches the 34GK950G (G-sync model) and is a different style to the menu found on the 34GK950F (FreeSync model). Another indication, in my mind, that the 38GL950G is a true G-sync screen.

In my opinion the evidence for and against this being a normal G-sync screen is:

FOR it being a traditional G-sync screen with a G-sync module

  • G naming convention (38GL950G)
  • overclockable refresh rate of 175Hz - never seen any non-G-sync-module screen offer anything like this
  • OSD menu design and appearance matches the 34GK950G and not the F model (recent displays from LG)
  • only "NVIDIA G-sync" listed on the product spec at the CES demos (no mention of "G-sync compatible")

FOR it being a FreeSync/Adaptive Sync screen just with "G-sync compatible" instead

  • LG's coming-soon spec page lists it as "G-sync compatible" - although this was, I am 90% sure, already advertised in this way before NVIDIA's announcement about supporting Adaptive Sync on some displays
 
Associate
Joined
25 Apr 2017
Posts
1,118
It could also be they decided to opt out of G-Sync the moment NVIDIA announced FreeSync support and they switched to G-Sync compatible. They may have been caught off guard.
 
Caporegime
Joined
18 Oct 2002
Posts
29,803
It could also be they decided to opt out of G-Sync the moment NVIDIA announced FreeSync support and they switched to G-Sync compatible.

You do know that companies just can't make changes on the fly instantly, don't you? Like magic? These things are planned months and months in advance and set in stone long before the show.
 
Associate
Joined
29 May 2018
Posts
146
Tbh, I'm in awe of that 'nVidia co-op Freesync' move you detailed above! That would be pretty genius (and admirable in a dirty sort of way) ;)
Please keep it to yourself that I have a dirty mind. Thank you ;)

It could also be they decided to opt out of G-Sync the moment NVIDIA announced FreeSync support and they switched to G-Sync compatible. They may have been caught off guard.

In terms of product development, configuring the electronics to control the panel as optimally as possible is really the only (noteworthy) thing a monitor OEM does. Exchanging the electronics basically means starting from scratch. LG would be throwing out at least nine months of an engineering team's efforts. While nothing is impossible, I also think that is highly unlikely.
 
Associate
Joined
29 May 2018
Posts
146
@Baddass
The v1 G-SYNC module isn't powerful enough to drive this monitor. If the 38GL950G is in fact G-SYNC, then it must be using the v2 G-SYNC module. Now consider this:

The x27 and PG27UQ require a fan. Both Asus and Acer have confirmed that the fan is an active cooling requirement for the v2 G-SYNC module (not for the FALD backlight). From everything we've seen, the 38GL950G has no fan. That means the 38GL950G is not using the v2 G-SYNC module.

Caveats:
  1. The resolution and refresh rates LG has published are correct
  2. nVidia will not surprise us with brand new/yet unreleased/unknown G-SYNC hardware
If both caveats hold true, then the 38GL950G can only be FreeSync. If it is FreeSync, it very likely will be FreeSync 2 certified, meaning it will be using a very new controller which I'd fully expect to manage a 175 Hz overclock.
 
Soldato
Joined
18 Feb 2010
Posts
6,810
Location
Newcastle-upon-Tyne
@a5cent

You seem to be jumping to conclusions here. Have you not entertained the thought that perhaps Nvidia have more than just 2 types of G-SYNC module? This is a brand new monitor model and it offers capabilities that others do not. In fact it's not even due out until Q3/Q4.

I agree with Baddass that the weight of evidence suggests this is a G-SYNC monitor. There is no way LG would go 'all out' with the G-SYNC compatible marketing without making a big thing about the fact that the monitor also supports FreeSync. They would not have a 'G' for the model suffix, plus the other points Baddass raised, which are very good ones. The subtitle for the official product page is also very clear. No mention of FreeSync nor 'G-SYNC compatible'.

"38" Class 21:9 UltraGear™ QHD+ Nano IPS LED Gaming Monitor w/ NVIDIA G-SYNC"

Of course we're all aware these product pages can't be trusted to be entirely complete (this one isn't) or accurate. But you're basing your assumptions on a single line on this product page which doesn't really have a clear meaning anyway and is seemingly contradicted by other evidence.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,567
Location
UK
The v1 G-SYNC module isn't powerful enough to drive this monitor. If the 38GL950G is in fact G-SYNC, then it must be using the v2 G-SYNC module. Now consider this:
I agree with this point, the v1 module is not suitable here for this resolution and refresh rate.
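
Just to put some rough numbers on that (my own back-of-the-envelope sums, assuming typical reduced blanking and 8-bit colour, nothing LG or NVIDIA have published), the bandwidth alone rules the DP 1.2-based v1 module out:

# Rough link-bandwidth estimate for 3840x1600, assuming roughly CVT-R2-style
# reduced blanking (~80 px horizontal, ~60 lines vertical) and 8-bit RGB.
# These blanking figures are my assumption - LG's actual timings aren't public.
H_ACTIVE, V_ACTIVE = 3840, 1600
H_BLANK, V_BLANK = 80, 60
BPP = 24  # 8 bits per channel, RGB

DP12_GBPS = 17.28  # effective video bandwidth of DP 1.2 / HBR2 (v1 G-sync module)
DP14_GBPS = 25.92  # effective video bandwidth of DP 1.4 / HBR3

def required_gbps(refresh_hz):
    pixel_clock = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * refresh_hz
    return pixel_clock * BPP / 1e9

for hz in (144, 175):
    print(f"{hz} Hz: ~{required_gbps(hz):.1f} Gbit/s needed "
          f"(DP 1.2 carries ~{DP12_GBPS}, DP 1.4 ~{DP14_GBPS})")

# 144 Hz already needs ~22.5 Gbit/s, well beyond the v1 module's DP 1.2 link;
# whether 175 Hz squeezes into DP 1.4 depends on the exact blanking and bit
# depth LG uses, which is why only the v2 module (or something newer) is even
# in the conversation.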

The x27 and PG27UQ require a fan. Both Asus and Acer have confirmed that the fan is an active cooling requirement for the v2 G-SYNC module (not for the FALD backlight)
I don't believe this to be true. I believe the reason for the active cooling is more due to the FALD backlight; you can tell how hot the overall screen gets quite quickly when FALD is enabled for gaming, and I don't believe that is only heat coming from the G-sync module. Where have Asus and Acer specifically said it's cooling for the G-sync v2 module and not related to the FALD?

We have not seen any screens with the v2 module yet without FALD, so it will be interesting to see whether the forthcoming Acer Predator XB273K has a fan or not (it uses the v2 module as far as we know, if not potentially a new, different module) given it has no FALD. I am expecting that model not to have a fan, to be honest.

From everything we've seen, the 38GL950G has no fan.
I think this is jumping to conclusions a bit. No one has had a chance to examine this screen in that level of detail as far as I know, to open it up or check whether it's got a fan. No one has had a chance to push the screen in extended periods of gaming and identify whether a fan kicks in or not to cool it. There's a chance it may have a cooling fan. There's also a good chance that even if the G-sync v2 module needed additional cooling, without the FALD on the 38GL950G they could quite easily come up with a passive cooling solution, I expect. So to say that because we're guessing it doesn't have a fan, it must not be using a G-sync module is a bit rash.

One additional piece of evidence for the use of a G-sync module here is that the spec announced so far talks only about a single DP and a single HDMI interface. That's a G-sync limitation, and you would expect to see a second HDMI, maybe a USB Type-C or other connections featured if it was not using a G-sync module and was in fact a normal Adaptive Sync screen.
 
Associate
Joined
29 May 2018
Posts
146
You seem to be jumping to conclusions here. Have you not entertained the thought that perhaps Nvidia have more than just 2 types of G-SYNC module?

No, I'm not jumping to conclusions. I explicitly mentioned that exact caveat in my very last post for exactly that reason. Obviously, and as already stated, if there are more types of G-SYNC modules than the ones we are currently aware of, then my last hypothesis is almost certainly wrong. I'd love it if nVidia did have a new G-SYNC DP 1.4 module up their sleeve, I just think the chances of that being true are very slim. Assuming such a G-SYNC module exists would actually be jumping to conclusions.

But you're basing your assumptions on a single line on this product page which doesn't really have a clear meaning anyway and is seemingly contradicted by other evidence.

No.

a) I and others are just collecting evidence for the things we think might point towards G-SYNC or FreeSync, and then we debate them. I'm leaning in one direction, but I'm not even close to assuming anything.
b) My very last post was not related to that "G-SYNC Compatible" line at all, and it is a reason to call G-SYNC into question.
 
Associate
Joined
29 May 2018
Posts
146
I don't believe this to be true. I believe the reason for the active cooling is more due to the FALD backlight

I remember reading a publication where the author claimed to have been told by both Acer and Asus that the fan was for the G-SYNC module. Unfortunately I can't remember where and my Google-Fu has failed me. However, PCPer did a teardown of the PG27UQ and stated the same thing:

Now that we have a better view of the PCB, we can see exactly what the aforementioned blower fan and heatsink assembly are responsible for— the all-new G-SYNC module.

source

This is rather central to my leaning towards FreeSync, so if you're right that the v2 G-SYNC module doesn't require a fan, then that would sway me towards it being G-SYNC. The specs for the FPGA nVidia uses also suggest to me that it would require a fan. That's what I'm going with for now.

No one has had a chance to examine this screen in that level of detail as far as I know, to open it up or check whether it's got a fan. No one has had a chance to push the screen in extended periods of gaming and identify whether a fan kicks in or not to cool it. There's a chance it may have a cooling fan. There's also a good chance that even if the G-sync v2 module needed additional cooling, without the FALD on the 38GL950G they could quite easily come up with a passive cooling solution, I expect. So to say that because we're guessing it doesn't have a fan, it must not be using a G-sync module is a bit rash.

I'm going off the pictures of the product and I don't think it's rash to assume that a fan would require ventilation slots. On the X27 and PG27UQ they are easily visible. I don't see any similarly obvious ventilation slots on the 38GL950G. Uncertain? Yes. Even speculative? Absolutely. Rash? I don't think so.

One additional piece of evidence for the use of a G-sync module here is that the spec announced so far talks only about a single DP and a single HDMI interface. That's a G-sync limitation, and you would expect to see a second HDMI, maybe a USB Type-C or other connections featured if it was not using a G-sync module and was in fact a normal Adaptive Sync screen.

A very good point indeed.
 
Soldato
Joined
18 Feb 2010
Posts
6,810
Location
Newcastle-upon-Tyne
How can you say you're not jumping to conclusions? You're assuming that this particular module is used. You're assuming it has some limitations tied to it that may or may not even apply to this monitor re. a cooling solution. And you're assuming, against a larger weight of contrary evidence, that this monitor is actually a FreeSync model that is being marketed in a completely unique and counter-intuitive way. We'll know one way or another soon enough, but to suggest you're not jumping to conclusions is flat out nonsense.
 
Associate
Joined
25 Apr 2017
Posts
1,118
The comical aspect of this is that if this is a G-Sync compatible monitor, NVIDIA have managed to wipe out every trace of AMD from the marketing lol. Not a hint of FreeSync to be seen. Is this how things go from here?
 
Associate
Joined
29 May 2018
Posts
146
I've never seen such walls of text over such a small detail on a monitor, it's mind-boggling!!!

I don't know about you, but for me whether it's FreeSync or G-SYNC is a make or break issue. That's not a minor detail. Up until I read about G-SYNC-Compatible, a monitor being FreeSync meant I couldn't buy it.

Either way, I think the topic is probably exhausted at this point.
 
Soldato
Joined
4 Jul 2012
Posts
16,911
The comical aspect of this is that if this is a G-Sync compatible monitor, NVIDIA have managed to wipe out every trace of AMD from the marketing lol. Not a hint of FreeSync to be seen. Is this how things go from here?
As long as it works for everyone, it literally doesn't matter. I doubt AMD even care, to be honest. Just as long as it doesn't result in repercussions for AMD cards, it's fine.
 
Associate
Joined
17 Nov 2018
Posts
11
Considering that LG has managed to turn the 34GK950G into a paper launch, with poor availability that rivals the 2080 Ti's, it will be interesting to follow this panel's road to market - how much of a vaporware launch will this be in comparison :D
 