LG 38GL950G - 3840x1600/G-Sync/144Hz

Soldato
OP
Joined
31 Dec 2006
Posts
7,224
If more adaptive sync screens can offer a decent and reliable VRR experience, maybe longer term there will be less need for Gsync. But given NVIDIA have only deemed 15 of 400-odd FreeSync screens suitable for the new Gsync Compatible certification, it gives you a feel for how many are not up to scratch for use with an NVIDIA card, or rather just not ideal for VRR overall. There’s more to it than just being able to support “some” VRR. A Gsync model will still offer certified performance with a wide Hz range, and often (in fact, in my experience, normally) better overdrive control, fewer bugs with overdrive settings, and a pretty much guaranteed next-to-no-lag experience. That’s not something you get on many FreeSync screens.

I don't doubt that not all FreeSync monitors will be plug and play with an Nvidia card... plenty won't offer the same experience as a proper G-Sync module-equipped monitor. I think it's good if Nvidia are going to be stringent with their certification though... I guess we'll have to see how the monitors they've passed cope, but hopefully, with respect to the factors you mention, they perform on a par with what a G-Sync enabled monitor would. If not, I do wonder what the point of this whole process is? Everyone is just going to be miffed at Nvidia for certifying a monitor which is barely up to the task. Who wins in that equation?

Are you planning on doing some testing as and when the drivers drop and (if) you can get your hands on some of the certified monitors? I certainly wouldn't take Nvidia's word for it, so I'm sure the community would be very appreciative of an unbiased third party looking at how they shape up.
 
Man of Honour
Joined
12 Jan 2003
Posts
20,564
Location
UK
I've read a fair bit about this whole nVidia Freesync thing and can't remember seeing that. Hearsay?
Yeah not seen that anywhere. Potentially getting it confused with the fact that if it’s a certified compatible monitor then VRR gets enabled automatically in drivers. If it’s not certified you can manually enable it but with no performance guarantees
 
Soldato
Joined
14 Sep 2008
Posts
2,616
Location
Lincoln
Yeah not seen that anywhere. Potentially getting it confused with the fact that if it’s a certified compatible monitor then VRR gets enabled automatically in drivers. If it’s not certified you can manually enable it but with no performance guarantees

That ^^

I haven't seen the full tests, but I've definitely seen that the testing requires the refresh range ratio (maximum:minimum) to be over 2.4:1. I suspect there's more to it but *shrug*
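
As a rough illustration of what that ratio requirement means in practice (just a sketch in Python; the function name and example ranges are hypothetical, only the 2.4:1 figure comes from the reports I've seen):

    def clears_ratio_requirement(vrr_min_hz, vrr_max_hz, threshold=2.4):
        """Check whether a monitor's advertised VRR range exceeds the max:min ratio."""
        return vrr_max_hz / vrr_min_hz > threshold

    # Hypothetical example ranges, not taken from any actual certification list:
    print(clears_ratio_requirement(48, 144))  # 144/48 = 3.0  -> True, clears 2.4:1
    print(clears_ratio_requirement(48, 75))   # 75/48 ~ 1.56  -> False, falls short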
 
Associate
Joined
29 May 2018
Posts
146
Don't put much stock in nVidia's testing for now. They deemed a monitor to be a failure if it didn't automatically enable VRR.
That statement makes zero sense to me. Source?
The statement makes 100% sense. Are you confusing "sense" with "I don't believe it"?

No.

Adaptive Sync is never supposed to activate automatically. Ever. Not even when connected to an AMD GPU. That is done explicitly by the driver if it is configured to do so.

What you're saying is that nVidia deemed a monitor to be a failure for working correctly. At least that's how I understand it. That makes no sense.
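
To illustrate the split in responsibilities I mean, here is a purely hypothetical Python sketch (the names are mine; this is not any vendor's actual driver code): the panel only advertises a supported range, and the driver decides whether to actually use it, whether by default for certified models or via an explicit user setting for everything else.

    # Hypothetical sketch of the driver-side decision, not NVIDIA's or AMD's actual logic.
    def vrr_active(advertised_range_hz, certified_compatible, user_enabled_manually):
        """The panel never 'turns on' VRR by itself; it only advertises a range.
        The GPU driver decides whether to drive it with variable refresh."""
        if advertised_range_hz is None:
            return False                  # no Adaptive-Sync capability advertised at all
        if certified_compatible:
            return True                   # certified: the driver enables VRR by default
        return user_enabled_manually      # otherwise: only if the user opts in,
                                          # and with no performance guarantees

    # Example: an uncertified FreeSync panel stays at fixed refresh until the user opts in.
    print(vrr_active((48, 144), certified_compatible=False, user_enabled_manually=False))  # False
    print(vrr_active((48, 144), certified_compatible=False, user_enabled_manually=True))   # True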
 
Last edited:
Associate
Joined
29 May 2018
Posts
146
I guess we'll have to see how the monitors they've passed cope, but hopefully, with respect to the factors you mention, they perform on a par with what a G-Sync enabled monitor would. If not, I do wonder what the point of this whole process is? Everyone is just going to be miffed at Nvidia for certifying a monitor which is barely up to the task. Who wins in that equation?

I assume that depends very much on who you are.

  • For nVidia, the main goal is surely to charge OEMs for their certification service. That is very likely to be far more profitable than building/selling their G-SYNC module. It also allows their hardware to provide VRR capabilities for an ever growing number of entertainment products (e.g. televisions), where FreeSync is becoming the standard. There's more but I'll leave it at that for now...
  • For OEMs the main goal will be to improve profitability by shrinking their product portfolio (eliminating real G-SYNC monitors), while selling more of the products that remain (G-SYNC Compatible FreeSync monitors).
  • The overwhelming majority of consumers will appreciate the ability to acquire a VRR monitor for their nVidia GPU at a FreeSync price (and not care that much about "details" like VRR range).
That represents a win for most people.

Whether it also represents a win for us remains to be seen, but for now I see no reason to believe it won't. FreeSync is not inherently limited. FreeSync can, in theory, achieve the exact same results as G-SYNC, provided the OEM is willing to make the engineering effort required. For now I'm willing to believe the G-SYNC-Compatible and FreeSync-2 certification programs will motivate the OEMs to make that effort.
 
Last edited:
Associate
Joined
29 May 2018
Posts
146
Hey guys, guess what...

Just looked at LG's page for this monitor again and it now says:

NVIDIA G-SYNC Compatible

Looks like this was always planned as a FreeSync monitor and LG already knew what nVidia had planned. It will be super interesting to see how this performs in tests.

I would however expect a G-SYNC-Compatible monitor to also list FreeSync in the specs. :-/
 
Last edited:
Man of Honour
Joined
12 Jan 2003
Posts
20,564
Location
UK
Hey guys, guess what...

Just looked at LG's page for this monitor again and it now says:

NVIDIA G-SYNC Compatible

Looks like this was always planned as a FreeSync monitor and LG already knew what nVidia had planned. It will be super interesting to see how this performs in tests.

I would however expect a G-SYNC-Compatible monitor to also list FreeSync in the specs. :-/
Hmm. That is an interesting spec, although given the overclocking ability, that implies to me that it will still be a traditional Gsync screen with Gsync module, as I’ve not seen any FreeSync screen support any meaningful overclock yet. I’m also pretty sure that it always said “Gsync compatible” before the NVIDIA announcement and hasn’t changed
 
Associate
Joined
29 May 2018
Posts
146
@JediFragger
@Baddass

So you both think the summary mentioning "G-SYNC Compatible" is a coincidence and that it will actually still have a G-SYNC module? That seems unlikely, considering every other LG G-SYNC monitor summary mentions "NVIDIA G-SYNC™ Technology", with no mention of "Compatible".
 
Soldato
OP
Joined
31 Dec 2006
Posts
7,224
To be fair, it wouldn't be the first time LG have messed up monitor specs. From past experience, they have their intern's cousin's sister's dog write their marketing blurbs for them. :p
 
Caporegime
Joined
18 Oct 2002
Posts
29,679
@a5cent Chillax dude, what does it matter really in the scheme of things (other than possibly a little added latency). A little time will reveal all :)

(but also note that it has a G at the end of its model name as a suffix, which all of the other LG Gsync monitors also have)
 
Man of Honour
Joined
12 Jan 2003
Posts
20,564
Location
UK
@JediFragger
@Baddass

So you both think the summary mentioning "G-SYNC Compatible" is a coincidence and that it will actually still have a G-SYNC module? That seems unlikely, considering every other LG G-SYNC monitor summary mentions "NVIDIA G-SYNC™ Technology", with no mention of "Compatible".
Well I believe that was there on the spec page before the NVIDIA announcement but I guess we will know once the final spec page is published. Right now I think there’s more evidence that it will be a normal Gsync screen to be honest, and only that one mention on the page to suggest anything otherwise. We will see though, looks a very interesting screen either way!
 
Associate
Joined
29 May 2018
Posts
146
@a5cent Chillax dude, what does it matter really in the scheme of things (other than possibly a little added latency). A little time will reveal all :)

(but also note that it has a G at the end of its model name as a suffix, which all of the other LG Gsync monitors also have)
Completely chill over here, buddy. I'm not big on diplomacy or etiquette. If I think your post is nonsense I'll say so directly. That's certainly not the case here (or with any of your posts, as far as I recall).

Ultimately, I'm not as interested in the specifics of this monitor (although I suspect I will buy it, but as you say, the specs will be revealed in good time) as I am in learning about the monitor market and people's views of it, in particular where those views differ from mine.
 
Associate
Joined
25 Apr 2017
Posts
1,095
I get what you're both saying and I agree. IMHO AMD pretty much screwed up with FreeSync for the reasons you both mentioned. I don't expect that to change tomorrow. One year from now however?

It's precisely because so few of the 400 FreeSync monitors made the cut that I think G-SYNC will become obsolete (except in the highest-end FALD market, where G-SYNC has no competition). The G-SYNC-Compatible logo will identify those FreeSync monitors with a non-crappy VRR implementation. At that point G-SYNC may still be viewed by most people as the premium VRR experience, but in many cases it will actually be FreeSync that provides it. AMD has also started fixing this problem: for FreeSync 2 they now mandate a decent VRR range and some quality controls. I suspect most FreeSync 2 monitors will be natural candidates for G-SYNC-Compatible certification. The gap is already closing and will continue to close.

More importantly, nVidia's G-SYNC-Compatible certification allows monitor OEMs to serve both AMD and nVidia customers with a single monitor, without anyone having to sacrifice VRR. That's too big a profitability/product improvement to ignore. I'd be surprised if almost every FreeSync monitor currently in the planning phase weren't aiming for that certification, and if the viability/necessity of a separate G-SYNC model weren't being questioned.

For G-SYNC not to become obsolete, a very substantial number of nVidia GPU owners (far more than just us enthusiasts) would have to reject the notion that G-SYNC-Compatible VRR implementations are good enough. The difference would have to remain obvious enough that, for most people, the additional $200 - $400 G-SYNC tax continues to sound like a reasonable proposition. Given how seriously nVidia apparently takes the G-SYNC-Compatible certification, I have a hard time imagining that's how this plays out.
You are assuming prices of the certified FreeSync monitors are not going to increase, which is likely not going to be the case.
 
Associate
Joined
29 May 2018
Posts
146
You are assuming prices of the certified FreeSync monitors are not going to increase, which is likely not going to be the case.
I actually do think the price of certified FreeSync monitors will increase. I just don't think it will increase by even a tenth of the premium a v1 G-SYNC module typically adds, so I assume it is safe to ignore.
 
Last edited:
Soldato
Joined
4 Jul 2012
Posts
16,911
No.

Adaptive Sync is never supposed to activate automatically. Ever. Not even when connected to an AMD GPU. That is done explicitly by the driver if it is configured to do so.

What you're saying is that nVidia deemed a monitor to be a failure for working correctly. At least that's how I understand it. That makes no sense.
You say no, then confirm that is exactly what you meant. So yes, you did mean that you don't believe it. My statement makes sense; you just don't think the action I've described makes sense.
 