LG 34GK950G, 3440x1440, G-Sync, 120Hz

I have asked a shop where I got my 2080 Ti Sea Hawk very early if they have any information about the availability and price of 950G, but they don't know anything yet.

Everything is set and waiting, and waiting, and waiting...

I can't even say "if it doesn't come out in early November then" because there is no then :D
 
They've got until the end of the month, then I need to buy a monitor. Either this or the AW3418DW, depending on reviews and street price (hopefully tftcentral can confirm pricing and availability info).

My new desktop build will be ready as soon as I get my 9900K around 10/25 or so, and I don't want to have to use it on a 13 year old 1920x1200 TN panel any longer than I have to. I just got a new calibrator today, and the post-calibration contrast ratio was like 350:1 with a black point of 0.337 cd/m2.
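Contrast ratio is just white luminance divided by black luminance, so you can work backwards from the two numbers above. A quick sketch (the white level is my own back-calculation, not a figure from the post):

```python
# Contrast ratio = white luminance / black luminance.
# Working backwards from the measurements quoted above:

black = 0.337              # cd/m2, measured black point
contrast = 350             # post-calibration contrast ratio

white = black * contrast   # implied white luminance
print(f"implied white luminance: {white:.0f} cd/m2")  # ~118 cd/m2
```

Around 118 cd/m2 is a sensible calibration target for white, which is why a 0.337 cd/m2 black point caps the contrast so low on that old TN.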
 
They've got until the end of the month, then I need to buy a monitor. Either this or the AW3418DW, depending on reviews and street price (hopefully tftcentral can confirm pricing and availability info).

My new desktop build will be ready as soon as I get my 9900K around 10/25 or so, and I don't want to have to use it on a 13 year old 1920x1200 TN panel any longer than I have to. I just got a new calibrator today, and the post-calibration contrast ratio was like 350:1 with a black point of 0.337 cd/m2.

I doubt that they can confirm anything on pricing except "suggested MSRP", which is not really that useful. The ASUS PG27UQ is already $200 below MSRP, so we won't actually know the real price until it hits the stores. I would love to see it for $1200 at retailers.

I know your pain, lol.. I'm on an 11 year old Planar panel waiting for this monitor to hit.
 
in case there was any doubt about this, the 34GK950F can support full native 3440 x 1440 resolution at 144Hz refresh rate, with 10-bit colour depth - without needing any chroma sub-sampling as well.

I think it's very likely that i will finish testing the F version and then publish the review of that model, then move on to the G model. then within the second review i will be able to do direct comparisons as well. as opposed to waiting for everything to be completed before i publish anything. F review coming along nicely :)
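The 144Hz / 10-bit / no-subsampling claim can be sanity-checked against DisplayPort 1.4 bandwidth. A rough estimate (the blanking percentages are my own ballpark guess, not measured timings for this panel):

```python
# Rough check: does 3440x1440 @ 144 Hz with 10-bit RGB (no chroma
# subsampling) fit into DisplayPort 1.4 HBR3 bandwidth?

H_ACTIVE, V_ACTIVE, REFRESH = 3440, 1440, 144
BITS_PER_PIXEL = 3 * 10                      # RGB, 10 bits per channel

# Assume roughly 5% horizontal / 3% vertical blanking overhead
# (a reduced-blanking guess, not this panel's actual timings).
pixel_clock = H_ACTIVE * 1.05 * V_ACTIVE * 1.03 * REFRESH  # pixels/s
required_gbps = pixel_clock * BITS_PER_PIXEL / 1e9

# DP 1.4 HBR3: 4 lanes x 8.1 Gbit/s, minus 8b/10b coding overhead.
hbr3_effective_gbps = 4 * 8.1 * (8 / 10)     # = 25.92 Gbit/s

print(f"required ~{required_gbps:.1f} Gbit/s, "
      f"available ~{hbr3_effective_gbps:.2f} Gbit/s")
```

Roughly 23 Gbit/s required against ~25.9 Gbit/s available, so the claim is plausible without any subsampling, with little headroom to spare.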
 
Badass, any way you could “tease” how bright the F model gets in strobe mode? ;)
since you asked very nicely Vega, i can tell you that the 1ms MBR strobing mode works at 120Hz and 144Hz refresh rates (nothing lower) but can still reach around 230 cd/m2 at maximum brightness setting (at 144Hz)
 
in case there was any doubt about this, the 34GK950F can support full native 3440 x 1440 resolution at 144Hz refresh rate, with 10-bit colour depth - without needing any chroma sub-sampling as well.

I think it's very likely that i will finish testing the F version and then publish the review of that model, then move on to the G model. then within the second review i will be able to do direct comparisons as well. as opposed to waiting for everything to be completed before i publish anything. F review coming along nicely :)

Awesome! So, is it then just one week for the F review?? ;P
 
in case there was any doubt about this, the 34GK950F can support full native 3440 x 1440 resolution at 144Hz refresh rate, with 10-bit colour depth - without needing any chroma sub-sampling as well.

I think it's very likely that i will finish testing the F version and then publish the review of that model, then move on to the G model. then within the second review i will be able to do direct comparisons as well. as opposed to waiting for everything to be completed before i publish anything. F review coming along nicely :)

Thank you! This is good news!
BTW, I have a 1080Ti and I'm really torn. The older G-Sync module on the G version makes me angry after reading this about the F version.
 
Thank you! This is good news!
BTW, I have a 1080Ti and I'm really torn. The older G-Sync module on the G version makes me angry after reading this about the F version.

I'd say G-Sync version would still be preferable for gamers with Nvidia cards. There is undoubtedly going to be tearing/stutter encountered at some point on the 'F' version otherwise, which would be eliminated on the 'G'. Question is whether the other benefits of the 'F' mitigate that, but I don't see that they would. Tearing/stutter is just horrible, and certainly not something I'd want to be experiencing on a circa £1K monitor!
 
I'd say G-Sync version would still be preferable for gamers with Nvidia cards. There is undoubtedly going to be tearing/stutter encountered at some point on the 'F' version otherwise, which would be eliminated on the 'G'. Question is whether the other benefits of the 'F' mitigate that, but I don't see that they would. Tearing/stutter is just horrible, and certainly not something I'd want to be experiencing on a circa £1K monitor!

Yea, that's true. Hope @Baddass can help us with his reviews.
 
@Baddass Sounds like you're liking it so far then... how's the Nano IPS? Noticeable improvement over the IPS ultrawides without it?
The main and most obvious difference is down to the colour gamut as this is the first wide gamut IPS offering in this space. Really that’s the main driver behind the marketing for it being “nano” IPS.
 
Thank you! This is good news!
BTW, I have a 1080Ti and I'm really torn. The older G-Sync module on the G version makes me angry after reading this about the F version.
I have a 1080ti as well. 950G is the easy choice here. You aren't gonna run most modern games at over 120fps and tearing is absolutely disgusting.
 
The main and most obvious difference is down to the colour gamut as this is the first wide gamut IPS offering in this space. Really that’s the main driver behind the marketing for it being “nano” IPS.
How does this affect calibration? I use DisplayCAL, so would I still select the normal settings to calibrate it to sRGB, or would I need to calibrate it to the DCI-P3 standard? How would that affect games that mainly use sRGB?
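This matters because the panel's primaries sit outside sRGB, so unmanaged sRGB content gets stretched to more saturated colours. A small demonstration of why, using the commonly published D65 conversion matrices (I'm using Display P3 here as a stand-in for the panel's near-DCI-P3 gamut, which is an assumption on my part):

```python
# Why unmanaged sRGB content looks oversaturated on a wide-gamut panel:
# the panel's "full red" lies outside sRGB entirely, so sRGB (255, 0, 0)
# drawn without colour management lands on a much more saturated red.

P3_TO_XYZ = [            # linear Display P3 RGB -> CIE XYZ (D65)
    [0.48657, 0.26567, 0.19822],
    [0.22897, 0.69174, 0.07929],
    [0.00000, 0.04511, 1.04394],
]
XYZ_TO_SRGB = [          # CIE XYZ (D65) -> linear sRGB
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

def matvec(m, v):
    # plain 3x3 matrix times 3-vector
    return [sum(mi * vi for mi, vi in zip(row, v)) for row in m]

# The panel's pure red primary, expressed in linear P3 RGB...
xyz = matvec(P3_TO_XYZ, [1.0, 0.0, 0.0])
# ...converted into linear sRGB coordinates:
srgb_red = matvec(XYZ_TO_SRGB, xyz)
print(srgb_red)   # R > 1, G and B < 0: outside the sRGB cube
```

Because the P3 red falls outside the sRGB cube, showing sRGB content accurately requires either an sRGB emulation mode on the monitor or colour-managed software; a calibration profile alone doesn't clamp the gamut for non-colour-managed games.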
 
I'd say G-Sync version would still be preferable for gamers with Nvidia cards. There is undoubtedly going to be tearing/stutter encountered at some point on the 'F' version otherwise, which would be eliminated on the 'G'. Question is whether the other benefits of the 'F' mitigate that, but I don't see that they would. Tearing/stutter is just horrible, and certainly not something I'd want to be experiencing on a circa £1K monitor!

I'm thinking of just lowering the Hz of the 'F' to the fps my PC can constantly hold and then playing with V-sync. V-sync adds a little more input lag, but considering I'm still playing fine with 60Hz V-sync, I think I would be fine. I could also switch to 16:9 to get more fps if needed.

Still hoping that Nvidia will someday support FreeSync, or that AMD actually delivers a card that gets close to a Ti.

@Baddass
How are the Hz steps on the 'F' version?
Also can you test if you can just switch to 16:9 (2560*1440) ingame and if it looks fine then?
 
I'm thinking of just lowering the Hz of the 'F' to the fps my PC can constantly hold and then playing with V-sync. V-sync adds a little more input lag, but considering I'm still playing fine with 60Hz V-sync, I think I would be fine. I could also switch to 16:9 to get more fps if needed.

Still hoping that Nvidia will someday support FreeSync, or that AMD actually delivers a card that gets close to a Ti.

@Baddass
How are the Hz steps on the 'F' version?
Also can you test if you can just switch to 16:9 (2560*1440) ingame and if it looks fine then?

I don't quite understand the logic though, other than the Freesync version saves you a bit of money. Still, G-Sync is inherently better. Unless you're dead set on getting an AMD card in the future, you're deliberately setting yourself up for an inferior gaming experience. I guess if you're coming from a non-G-Sync monitor anyway though you won't know what you're missing.
 
I'm thinking of just lowering the Hz of the 'F' to the fps my PC can constantly hold and then playing with V-sync. V-sync adds a little more input lag, but considering I'm still playing fine with 60Hz V-sync, I think I would be fine. I could also switch to 16:9 to get more fps if needed.

Still hoping that Nvidia will someday support FreeSync, or that AMD actually delivers a card that gets close to a Ti.

@Baddass
How are the Hz steps on the 'F' version?
Also can you test if you can just switch to 16:9 (2560*1440) ingame and if it looks fine then?

Instead of Vsync you could use RTSS. It has under 10ms of input lag and noticeably improved frame pacing (from what I have seen)
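The reason a limiter paces better than V-sync is that it schedules frames against a fixed timeline instead of waiting on the display. A toy sketch of the idea (my own simplification of what a limiter like RTSS does, not its actual implementation):

```python
import time

# Toy frame limiter: schedule each frame against a fixed timeline.
# Carrying the deadline forward (instead of sleeping a fixed amount
# after each frame) is what keeps frame pacing even.

def run_limited(frames, target_fps, render=lambda: None):
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter()
    timestamps = []
    for _ in range(frames):
        render()                               # "game" work goes here
        next_deadline += frame_time            # absolute, drift-free deadline
        delay = next_deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)                  # wait out the remainder
        timestamps.append(time.perf_counter())
    return timestamps

ts = run_limited(frames=10, target_fps=100)    # cap at 10 ms per frame
print(f"span of 10 frames: {ts[-1] - ts[0]:.3f}s")
```

Real limiters busy-wait the last fraction of a millisecond for tighter timing than `sleep` allows, but the scheduling idea is the same.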
 
I don't quite understand the logic though, other than the Freesync version saves you a bit of money. Still, G-Sync is inherently better. Unless you're dead set on getting an AMD card in the future, you're deliberately setting yourself up for an inferior gaming experience. I guess if you're coming from a non-G-Sync monitor anyway though you won't know what you're missing.
Exactly my thinking
 
since you asked very nicely Vega, i can tell you that the 1ms MBR strobing mode works at 120Hz and 144Hz refresh rates (nothing lower) but can still reach around 230 cd/m2 at maximum brightness setting (at 144Hz)
Is this only applicable to the F version? Does the G version have this at 120hz?
 