LG 34GK950G, 3440x1440, G-Sync, 120Hz

So we don't know what downsides the overclock will have, besides that it will have fewer downsides than the Acer or Alienware? Would be good to know!

What else I would like to know is:

- Will the G model work in 16:9 (black bars left and right) at a resolution of 2560x1440 out of the box (in-game options menu), or do we need to fiddle with configs for this to work?

- If 16:9 is possible, will we get more than 100/120 Hz at 2560x1440 (maybe 144 Hz)?
 
Note that no G-SYNC model supports scaling (interpolation) via DP. So you will need to rely on GPU scaling instead if using DisplayPort (and why wouldn't you be as a PC user - no G-SYNC otherwise). You can run the monitor in 16:9 with black borders by using GPU scaling, certainly. That also means you're limited to 120Hz regardless of resolution, because the monitor will always be running at 3440 x 1440.
 
So if I enable GPU scaling in the Nvidia Control Panel, I can choose 2560x1440 after setting a 16:9 aspect ratio in the game options.
This means the monitor still runs at 3440x1440 but with black bars (left and right), and I only get 120Hz max. But I will get improved FPS because the GPU only renders the lower resolution?

OK, so with HDMI I don't know if I get this right. HDMI doesn't allow G-Sync, but without G-Sync, using HDMI, I could use scaling and get a higher refresh rate (maybe not with HDMI 1.2, but with 2.0?)?
 
The monitor is restricted to 120Hz... no matter what you do, you can't get higher.

If you can set 2560x1440 and 16:9 in game, then you don't have to change anything else... it will automatically work with black bars at the side (at least, this is how Overwatch works on the Alienware). You will probably get higher FPS because fewer pixels are being pushed.

I don't know why you would want to use HDMI or scaling in this way... unless you can push a constant >120 FPS, you are better off using G-Sync, which should smooth out variable FPS.
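To put rough numbers on the FPS point above: rendering a 16:9 image on this panel means pushing roughly a quarter fewer pixels per frame. A minimal sketch of the arithmetic, using only the two resolutions discussed in this thread (actual FPS gains depend on the game and where it is bottlenecked):

```python
# Rough illustration of the pixel counts discussed above (not a benchmark,
# just arithmetic on the two resolutions mentioned in the thread).
native = 3440 * 1440   # full panel resolution driven over DisplayPort
scaled = 2560 * 1440   # 16:9 image rendered by the game; GPU adds black bars

print(f"native pixels: {native:,}")                 # 4,953,600
print(f"16:9 pixels:   {scaled:,}")                 # 3,686,400
print(f"reduction:     {1 - scaled / native:.1%}")  # ~25.6% fewer pixels to render
```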
 
Going to ask what is probably a stupid question. The tech description says Windows 10 - Yes. Does that mean if you have Win7, the monitor won't work? Probably not the case, but thought I'd ask in case it actually is. :o Also, some tech descriptions just say Windows, not Windows 10, hence being curious about it.
 
Going to ask what is probably a stupid question. The tech description says Windows 10 - Yes. Does that mean if you have Win7, the monitor won't work? Probably not the case, but thought I'd ask in case it actually is. :o Also, some tech descriptions just say Windows, not Windows 10, hence being curious about it.

No questions are stupid. And yes, you can use Win7!
 
So we don't know what downsides the overclock will have, besides that it will have fewer downsides than the Acer or Alienware? Would be good to know!

What else I would like to know is:

- Will the G model work in 16:9 (black bars left and right) at a resolution of 2560x1440 out of the box (in-game options menu), or do we need to fiddle with configs for this to work?

You always have to use the GPU config, because on-display scaling is often terrible and stretches everything regardless of what aspect ratio you choose in the OSD. This is a basic setup you have to do no matter what display you use, 16:9 or 21:9. You go to the Nvidia Control Panel and tell the GPU how to scale. You can stretch, maintain aspect ratio, or use 1:1 pixel mapping, so for example a 1920x1080 fullscreen app will use only 1920 by 1080 pixels: you will have a small rectangle in the middle of the screen with black bars all around, but it will be as sharp as native 1080p, without any of the upscaling blur. You can also tell the GPU to ignore on-application scaling. It takes 10 seconds in total and works flawlessly.
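As a concrete illustration of the 1:1 pixel option described above, here is a minimal sketch (assuming the 3440x1440 panel from this thread and a 1920x1080 fullscreen application) of how large the resulting black borders would be:

```python
# Sketch of what the 1:1 pixel option described above implies on this panel:
# a 1920x1080 image centred on the 3440x1440 grid, unscaled, with black borders.
panel_w, panel_h = 3440, 1440
app_w, app_h = 1920, 1080

side_bar = (panel_w - app_w) // 2   # 760 px of black on the left and right
top_bar = (panel_h - app_h) // 2    # 180 px of black on the top and bottom

print(f"side borders: {side_bar} px each")
print(f"top/bottom borders: {top_bar} px each")
```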
 
I still can't find a single reason why people are so hyped about this monitor; it brings nothing new to the table that we haven't seen already. And on a side note, the X34P and AW3418DW do the same for less.
 
I still can't find a single reason why people are so hyped about this monitor; it brings nothing new to the table that we haven't seen already. And on a side note, the X34P and AW3418DW do the same for less.
I'm more hyped for the FreeSync version of this monitor, for the sole reason that it's 144Hz native. No overclocking BS required, meaning it will avoid the problematic overclocking issues months down the line that I've seen crop up in X34P/AW3418 complaints. I'm sorry, but dropping a grand on a panel for it to only partially work is unacceptable to me, and not worth the risk.
 
I still can't find a single reason why people are so hyped about this monitor; it brings nothing new to the table that we haven't seen already. And on a side note, the X34P and AW3418DW do the same for less.

One reason is that it's a newer panel (UW5) that we haven't seen on any other product - and I think people are hopeful of it bringing actual improvements in QC, lessening IPS glow and increasing panel uniformity; the better colour gamut offered by Nano IPS is a bonus too. People who don't yet have an ultrawide and are thinking of getting one are probably reluctant to pay premium prices for what is now an 'old' panel (UW4) when they know a new one is about to be released. And if you're talking about the AU Optronics offerings, they have much worse QC than LG panels, and I don't think many people care to go through the headache of replacing 5 panels to find one that is even remotely 'acceptable'. This LG monitor also doesn't look like a 5-year-old designed it, with quite an understated design for a monitor targeted at the gaming market - this is a big positive for a lot of people.

For the most part, I really think a lot of it is down to the general gloomy atmosphere surrounding monitors lately... across the board, not just ultrawides, although the problem is probably exacerbated in ultrawide panels. So the prospect of a new panel makes people hopeful that it won't be as inconsistent in quality as what is currently available on the market. Nobody wants to spend £800+ on anything, let alone a monitor (that they have to stare into for hours on end) that they are not happy with.
 
I'm more hyped for the FreeSync version of this monitor, for the sole reason that it's 144Hz native. No overclocking BS required, meaning it will avoid the problematic overclocking issues months down the line that I've seen crop up in X34P/AW3418 complaints. I'm sorry, but dropping a grand on a panel for it to only partially work is unacceptable to me, and not worth the risk.

There's no reason to believe that the "overclock" of the G-Sync version of the monitor will cause any problems. It's completely different in that the panel itself doesn't have to be overclocked.
 
There's no reason to believe that the "overclock" of the G-Sync version of the monitor will cause any problems. It's completely different in that the panel itself doesn't have to be overclocked.
There's plenty of evidence pointing specifically to the problematic issue being the G-Sync module itself. As we already know, it's one of the main reasons the 950G is "overclocked" to 120Hz, despite it being a native 144Hz monitor. The module is being pushed near/past its specs at 1440p ultrawide / 120Hz. This is causing degradation issues with the display, which can easily be found by searching online. A newer panel won't fix this issue, because it's now pretty much confirmed to be related solely to the outdated G-Sync module.
 
There's plenty of evidence pointing specifically to the problematic issue being the G-Sync module itself. As we already know, it's one of the main reasons the 950G is "overclocked" to 120Hz, despite it being a native 144Hz monitor. The module is being pushed near/past its specs at 1440p ultrawide / 120Hz. This is causing degradation issues with the display, which can easily be found by searching online. A newer panel won't fix this issue, because it's now pretty much confirmed to be related solely to the outdated G-Sync module.

Please point to a reliable reference claiming that any issues occurring with specific monitor models are caused directly by an overclock of the G-Sync module and not by an overclocked panel. We know that the G-Sync module being overclocked from 100Hz to 120Hz is the reason the monitor is claimed to be overclocked - but we don't know of any issues with this monitor. What other monitor features the same kind of overclocking that only happens on the G-Sync module but not also on the panel itself? This panel won't be overclocked, only the module.
 
I still can't find a single reason why people are so hyped about this monitor; it brings nothing new to the table that we haven't seen already. And on a side note, the X34P and AW3418DW do the same for less.

You have no real data to claim that.

Also, there is no hype; it's simply the first "buyable" G-Sync ultrawide that won't have a poor panel, poor calibration or an embarrassing gamery design - no major practical compromises that could put you off buying. There is also the new UW5 panel, which should bring at least some improvements. There are enough reasons to test the 950G first, especially since it is just around the corner, while both the X34 and the Alienware are already fully tested and we know they have major issues. If the 950G doesn't give any real picture quality improvement over my LG 34UC98, then I can always return it and try the Alienware, which is cheaper and comes with a longer warranty. If the 950G gets delayed again, then I may just try the Alienware first, we'll see. But the design, the "gamery gamma" issue and the low peak brightness are big issues, especially since I am using a Lightpack, which lets me use much higher brightness levels than in a pitch-black room - I use full 100% brightness on my UC98 for games (and 0% for web browsing), while the Alienware is 50 nits dimmer. That's a lot.
 
There's plenty of evidence pointing specifically to the problematic issue being the G-Sync module itself.

Except there is exactly zero evidence pointing to this. Six months after the release of the 950G we'll know whether the intermittent problems with the X34 and Alienware displays were due to the panel or the module (or both)... but right now it's a guess, since both the panels AND the module are out of spec. The 950G will be the first monitor that takes one of the 'out-of-spec' elements out of the equation...
 
The FreeSync version has it. If you have an Nvidia GPU, ask Nvidia to support FreeSync. They could do that with a driver update in like 5 minutes. It's not difficult for them to do.

No, it does not. DisplayHDR 400 doesn't have anything to do with HDR. It does not have any kind of local dimming and has a typical peak brightness, so it is no different from a typical SDR display. It supports a wide color gamut, which is enough to watch 10-bit SDR content, but it has no spec to provide even a minimal HDR effect. It is simply physically impossible.
 
No, it does not. DisplayHDR 400 doesn't have anything to do with HDR. It does not have any kind of local dimming and has a typical peak brightness, so it is no different from a typical SDR display. It supports a wide color gamut, which is enough to watch 10-bit SDR content, but it has no spec to provide even a minimal HDR effect. It is simply physically impossible.

+1000.
DisplayHDR 400 is pure marketing.
 
No, it does not. DisplayHDR 400 doesn't have anything to do with HDR. It does not have any kind of local dimming and has a typical peak brightness, so it is no different from a typical SDR display. It supports a wide color gamut, which is enough to watch 10-bit SDR content, but it has no spec to provide even a minimal HDR effect. It is simply physically impossible.

Normal monitors are at 300 nits. The FreeSync version is 400 nits typical and 550 peak. You don't need local dimming to have a nice HDR experience; personally I prefer not to have it, as local dimming causes a lot of fidelity issues. So its peak is almost double that of a normal ultrawide G-Sync monitor. Of course, what you want is subjective. But just because it misses the DisplayHDR 600 spec by 50 nits doesn't mean it's only at the minimum of the DisplayHDR 400 spec (it's not).

To put things into perspective, the 5K ultrawide is 450 nits typical, yet DisplayHDR 600, because the peak is over 600 nits (750, I believe). No local dimming there either.
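A minimal sketch of the brightness comparison above, using only the figures quoted in this thread (300 nits for a typical monitor, 550 nits peak for the FreeSync version, 750 nits peak for the 5K ultrawide) and treating the DisplayHDR tier number as a minimum peak-luminance floor, as the posters here do; the actual certification also covers things beyond peak brightness, so this shows only that part of the picture:

```python
# Peak-brightness figures quoted in this thread (nits); the DisplayHDR tier
# number is treated here as the minimum peak luminance, as in the posts above.
displays = {
    "typical SDR monitor": 300,
    "FreeSync version (peak)": 550,
    "5K ultrawide (peak)": 750,
}
tiers = [400, 600, 1000]  # DisplayHDR 400 / 600 / 1000 peak-luminance floors

for name, peak in displays.items():
    met = [t for t in tiers if peak >= t]
    best = f"DisplayHDR {max(met)}" if met else "below DisplayHDR 400"
    print(f"{name}: {peak} nits -> highest tier by peak alone: {best}")
```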
 