LG 34GK950G, 3440x1440, G-Sync, 120Hz

I don't like this November release date. I'm not really concerned about price, because a display is something you buy once every 2-3 years and it's the most important part of the setup. The 950G also looks to be exactly what I need, especially looking at how poor all of these new HDR1000 displays are and how little chance there is of something meaningfully better coming out in the near future. I will keep the 950G for 2 years, and even after that I will probably have a hard time finding a good display to upgrade to. The next real upgrade is basically true HDR and 5120x2160 OLED, which will take a long time; FALD is simply not suitable for a desktop environment, way too many issues.

The only thing I need is for the 950G to freaking come out already :p RTX cards are also taking very long: not only was there a huge gap between the Pascal and Turing releases, but there's a long wait for reviews and shipments after launch, not to mention that the performance is a big question mark, especially for the price. There are also no water-cooled options yet, and zero hope for alternatives from the competition.

I understand high prices, limited options, no competition, etc., but if on top of all of that I also have to wait hell knows how long, then I get really annoyed. And right now both the display and the GPU(s) are doing exactly that.
 
And what happens if you overclock the controller? What happens to the connection over the DisplayPort port and the DisplayPort cables? Do they run at a higher frequency than the standard allows?

No. They don't. Absolutely nothing happens to the DP cables. Absolutely nothing happens to the DP connection either.

Even if overclocking the monitor's controller would enable the receiver to run at a higher frequency (it doesn't), it would NOT benefit you at all, because the GPU could NOT transmit the video stream any faster than it did before. Most likely the monitor and GPU would entirely fail to communicate.

DisplayPort runs at one of a few predetermined, fixed bit rates: RBR, HBR1, HBR2 and HBR3 (introduced in DP 1.3). If the link provides more bandwidth than the video signal sent across it requires, the GPU fills the unused bandwidth with "filler" (stuffing) symbols.

Transmitting, or attempting to receive, at a speed that doesn't conform to one of those fixed rates would leave each participant unable to understand the other.
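
To put some numbers on that, here's a minimal back-of-the-envelope sketch (my own figures, not from this thread) comparing the data rate a 3440x1440 @ 120 Hz stream needs against what each DP link rate can usably carry. The 3520x1525 total timing is an assumption based on CVT-R2 reduced blanking; the monitor's actual blanking may differ.

```python
# Back-of-the-envelope check: does 3440x1440 @ 120 Hz fit each DP link rate?
# The 3520x1525 total timing below assumes CVT-R2 reduced blanking.

def required_gbps(h_total, v_total, refresh_hz, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

def dp_usable_gbps(lane_rate_gbps, lanes=4):
    """Usable DP bandwidth: raw lane rate minus 8b/10b encoding overhead."""
    return lane_rate_gbps * lanes * 8 / 10

need = required_gbps(3520, 1525, 120, 24)  # 8-bit RGB -> ~15.5 Gbit/s
for name, rate in [("HBR1", 2.7), ("HBR2", 5.4), ("HBR3", 8.1)]:
    have = dp_usable_gbps(rate)
    print(f"{name}: {have:.2f} Gbit/s usable -> "
          f"{'fits' if have >= need else 'too slow'} for {need:.1f} Gbit/s needed")
```

The takeaway: the link either runs at one of these fixed rates, with any surplus padded out by stuffing symbols, or it doesn't run at all.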

Could it be that you're confusing DP with HDMI?

I don't know much about HDMI, but in contrast to DP it does support variable bit rates. One of the wire pairs carries a clock signal, so overclocking it would very likely cause both sides to transmit/receive the video signal faster.
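
As a rough illustration of that point (my own sketch, under the assumption described above): in HDMI 1.x, each TMDS data channel sends one 10-bit character per tick of the pixel clock carried on the dedicated clock pair, so the whole link scales with that clock.

```python
# Why HDMI (TMDS) scales with the clock wire while DP doesn't:
# in HDMI 1.x, each TMDS data channel carries 10 bits (one 8b/10b
# character) per tick of the pixel clock on the clock pair.

def tmds_channel_gbps(pixel_clock_mhz):
    """Per-channel TMDS bit rate: 10 bits per pixel clock tick."""
    return pixel_clock_mhz * 10 / 1000

for clk in (297, 320, 340):  # hypothetical pixel clocks in MHz
    print(f"{clk} MHz pixel clock -> {tmds_channel_gbps(clk):.2f} Gbit/s per channel")
```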

For DP, overclocking the controller will influence how quickly a received video signal is processed, but not how quickly the video signal is transmitted/received.
 
Ahhhhhhh....

Sorry. That's the only reasonable response to getting yet another contradictory definition of "monitor overclocking".

Anyway, contrary to what appears to be popular opinion, we still know next to nothing about what "overclocking" this monitor actually means.

We know it provides the option to enable/disable overclocking in the OSD menu. We know this somehow affects the g-sync module. That's it.

Some suspect the monitor could just as well be set to run at 120Hz at all times and the option in the OSD menu removed entirely, since, unlike a panel, the g-sync module has no downside to being "overclocked". In this case, the overclockability "feature" exists only to provide a box wannabe-overclockers can tick, but has no technical merit.

If the article SeriousThing linked to is correct, overclocking could introduce chroma subsampling or limit color depth to 6 bit. In that case the ability to limit the monitor to 100Hz is very welcome, because who wants a 6 bit panel? That would turn this $1400 monitor into the equivalent of a $300 piece of junk whenever it's run at 120 Hz. However, I also want more than a 100 Hz panel, so in that case I wouldn't touch this monitor with a 10-foot pole.
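
To make that bandwidth trade-off concrete, here's a small sketch (my own arithmetic, using the same assumed CVT-R2 timing as above, not a confirmed spec of this monitor) of how chroma subsampling or a lower bit depth shrinks the stream:

```python
# How chroma subsampling or reduced bit depth shrink the required data
# rate. The 3520x1525 total @ 120 Hz timing is an assumed CVT-R2 figure.

PIXELS_PER_SEC = 3520 * 1525 * 120      # assumed total pixel clock
HBR2_USABLE = 4 * 5.4 * 8 / 10          # 17.28 Gbit/s after 8b/10b overhead

formats = {
    "10-bit RGB 4:4:4":  30,  # 3 components x 10 bit
    "8-bit RGB 4:4:4":   24,  # 3 components x 8 bit
    "8-bit YCbCr 4:2:2": 16,  # chroma shared across pixel pairs
    "6-bit RGB 4:4:4":   18,  # 3 components x 6 bit
}
for name, bpp in formats.items():
    need = PIXELS_PER_SEC * bpp / 1e9
    verdict = "fits" if need <= HBR2_USABLE else "exceeds"
    print(f"{name}: {need:.1f} Gbit/s ({verdict} HBR2)")
```

Notably, with these assumed timings even 8-bit 4:4:4 already fits HBR2, so whether any of these compromises are actually needed depends on the module's real timing limits, which is exactly the open question.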

IMHO we're still in wait and see mode.
 
Some suspect the monitor could just as well be set to run at 120Hz at all times and the option in the OSD menu removed entirely, since, unlike a panel, the g-sync module has no downside to being "overclocked". In this case, the overclockability "feature" exists only to provide a box wannabe-overclockers can tick, but has no technical merit.

I (personally) think this is the crux. Driving the legacy G-Sync module at 120 Hz (for 3440x1440 resolution) is simply out of spec for its design. It might be true that an overwhelming percentage, approaching 100%, of the modules are capable of 120 Hz, but if it's out of spec, it's out of spec. It's not guaranteed by the supplier (Nvidia). There have been reports of, for example, some small number of the Alienware or Acer models having flicker issues, and it's always been assumed in these cases that it was a panel fault, but really that's just the best guess. In theory it could just as easily be the G-Sync module, but there is no way of knowing for sure. It will actually be this new LG panel that will probably provide clarity: if some small % of monitors based on this new panel still have issues running @ 120 Hz, then the most obvious culprit would be the G-Sync module.
 
But with the new RTX cards you will not be getting anywhere near those kinds of FPS, even at 1080p... ultrawide is going to struggle with ray tracing, and that's where G-Sync will prove incredibly beneficial. Of course, if you have no intention of buying a 20xx card and playing games with ray tracing, then you have a point. Even outside of that, though, there will always come a time when most gamers find their GPU struggling with the latest titles... and they won't want to downscale the graphics to the point they look like a potato... so again, that's where G-Sync comes into play when the frame rate dips. So yes, you are right in certain instances, assuming one can hit extremely high FPS, but as you also state, the average/minimum frame rates will dictate how possible that actually is.

For some Nvidia owners I'm sure the Freesync version of this monitor may be preferable and absolutely suit their needs perfectly... depending on what games they play. But for others it really won't.

The thing is... if I'm playing a game and that game doesn't want to run faster than 60-ish fps with dips into the 40s... I'll stop playing it. Simple. Because the experience is not all that fantastic IMHO, even with G-Sync, due to image persistence. Now of course, if you are not sensitive to this issue then by all means, G-Sync will make a sub-60fps gaming experience feel better. But for me personally, I cannot stand it and I get a headache within minutes, which is why I either stop playing the game or switch to a fixed maximum refresh rate if the experience isn't complete s***.

On a side note: I have absolutely no intention of buying a card with features that will likely take a 2nd- or 3rd-gen card to run properly, and considering the same features were demoed 2 years ago on a cheap little GPU from a much smaller company, I think this has a very good chance of ending up the same way as PhysX: a few good-looking titles that require more horsepower than logic would dictate, then left pretty much for dead. Again, my personal opinion :), and I would for sure get the 144Hz version without a doubt. Wouldn't be surprised if colour performance were better on it as well.
 
The Freesync version of this monitor looks like a better deal, and that's coming from someone who preordered a 2080.

I wish we could get a release date/price for the Freesync version already. People have been waiting for a native 144Hz 1440p ultrawide since forever.
 
Do we know when? This side of Christmas? Thanks!

We don't know; the only info is that it's coming after the G version. Could be weeks, could be months.

I'll wait for the G version until the end of November, and depending on the reviews, its performance and its price, I'll go for the AW3418DW/X34P on Black Friday.
 
We don't know; the only info is that it's coming after the G version. Could be weeks, could be months.

I'll wait for the G version until the end of November, and depending on the reviews, its performance and its price, I'll go for the AW3418DW/X34P on Black Friday.

Need that review soon so I can determine whether I should get the Alienware instead during a Black Friday sale.
 
It's typical for a manufacturer to state that a refresh rate is "overclocked" if the user needs to enable an "overclock" setting in the OSD to reach that refresh rate. I'm assuming that is the case here; it is on other models I've tested with this G-SYNC module (such as the AOC AG352UCG6). If the overclock was just done completely passively and was always enabled, that would be different.

If the panel isn't being overclocked and the G-Sync module has no issues at 120Hz (assumption), then why doesn't LG just overclock the monitor for us and remove the option to overclock entirely?

How is it even an overclock if the G-Sync module can run at 120Hz with no chroma compression? At that point it's just a setting that you choose. Any idea why the G-Sync module is being 'overclocked'?

I'm so confused, and I'm having doubts that a 120Hz overclock will be guaranteed. If it were, LG wouldn't give us the option to 'overclock' and would just set it at 120Hz natively. Something's up with the G-Sync module that is preventing this from being 120Hz native.
 
I'm so confused, and I'm having doubts that a 120Hz overclock will be guaranteed. If it were, LG wouldn't give us the option to 'overclock' and would just set it at 120Hz natively. Something's up with the G-Sync module that is preventing this from being 120Hz native.

If the monitor can't handle 120Hz stable and, despite the 144Hz panel, the G-Sync module induces flickering or reduces chroma when overclocked, I'll be buying the Alienware for sure on Black Friday.
 
If the panel isn't being overclocked and the G-Sync module has no issues at 120Hz (assumption), then why doesn't LG just overclock the monitor for us and remove the option to overclock entirely?

How is it even an overclock if the G-Sync module can run at 120Hz with no chroma compression? At that point it's just a setting that you choose. Any idea why the G-Sync module is being 'overclocked'?

I'm so confused, and I'm having doubts that a 120Hz overclock will be guaranteed. If it were, LG wouldn't give us the option to 'overclock' and would just set it at 120Hz natively. Something's up with the G-Sync module that is preventing this from being 120Hz native.

https://forums.overclockers.co.uk/posts/32082338/

As for "Anti-Glare 3H", that tells you the hardness of the surface, its scratch resistance. And that it has matte anti-glare qualities. Nothing else. It will be the same as the screen surfaces used on comparable UltraWides, so light matte anti-glare with a smooth (non-grainy) finish.
 
@PCM2 do you think you'll have the chance to review it before November?

Unlikely, I'm afraid. I'll be on vacation for most of October and I'm currently finishing off some other reviews before then. If LG can provide a sample in the next week or so, then I might be able to slot it in. I wouldn't hold your breath for that, though. :(
 
If the monitor can't handle 120Hz stable and, despite the 144Hz panel, the G-Sync module induces flickering or reduces chroma when overclocked, I'll be buying the Alienware for sure on Black Friday.
Agreed. My main concern with the Dell is the complaints of the screen turning yellower in overclocked mode, and the high gamma out of the box. I've seen reports ranging from 2.4 to 2.68, nothing lower.
 
@Daniel - LG is probably never coming back to this thread LOL

In all seriousness though: if LG can ensure these things, this monitor will be an unmistakable buy over any other ultrawide:

1. Stable 120Hz OC - no flickering, chroma subsampling, w/e else
2. Panel uniformity - it seems to be a big issue with the Dell and X34P using the UW4 panels, especially in OC ranges.
3. 900+:1 minimum contrast ratio. The 750-850:1 that I'm seeing with the Dell is really unacceptable.

I'd pay 200-300 USD more if those were guaranteed. Easily.
 