LG 34GK950G, 3440x1440, G-Sync, 120Hz

And a first preview:
https://www.monitornerds.com/lg-34gk950g-review/

> "the LG 34GK950G might as well have ULMB"

Monitornerds is a BS website. Basically a couple of know-nothings who use the spec sheet as the source for their flowery, entirely non-technical blog posts.

Besides fumbling absolutely everything related to refresh rates, they also claimed the monitor is likely to be HDR400 compatible. First, it's called DisplayHDR 400, not HDR400. Second, everyone with a half decent understanding of monitors knows a DisplayHDR 400 rating is off the table, because the DP1.2 G-sync module doesn't support HDR10. :-/

This is one of those incompetent websites that give the tech media a bad name.
 
If you turn G-SYNC off, would you be able to overclock it to 144 Hz? Or is it still processed by the module and therefore limited?
 

Yes, it will still be limited. The G-SYNC module replaces what is normally a 'simple' controller in a non-G-SYNC monitor, so the module itself is the limiting factor.
 
> If you turn G-SYNC off, would you be able to overclock it to 144 Hz? Or is it still processed by the module and therefore limited?
Yes. Still limited. Because this G-SYNC module is a DP1.2 controller, it tells the GPU to send data at the HBR2 data rate. No matter how much of an OC you manage for the controller, the GPU will never switch to HBR3, so you're stuck at 120 Hz.

As pedromaltez said, for >120 Hz, the entire controller is the issue, not just the little part of it that implements G-SYNC.

All you could do is swap out the entire G-SYNC module and replace it with a better DP1.4 controller, thereby achieving 144 Hz, without any overclock whatsoever. LG has already done this for us with the F version.

If you could get your hands on a DP1.4 G-SYNC HDR module it would be an interesting soldering project.
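The HBR2/HBR3 argument above can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below is illustrative only: it assumes 8 bpc RGB (24 bits per pixel), rough CVT-RB-style blanking figures, and the 8b/10b encoding overhead that HBR2/HBR3 links carry (80% efficiency); real monitor timings will differ somewhat.

```python
# Rough link-bandwidth check for 3440x1440 over a 4-lane DisplayPort link.
# Assumed numbers: 24 bits/pixel, ~160 extra horizontal pixels and ~90 extra
# lines of blanking (CVT-RB ballpark), 8b/10b encoding (80% efficiency).

def required_gbps(h_active, v_active, hz, h_blank=160, v_blank=90, bpp=24):
    """Uncompressed video data rate in Gbit/s, including blanking."""
    pixel_clock = (h_active + h_blank) * (v_active + v_blank) * hz
    return pixel_clock * bpp / 1e9

# Effective (post-8b/10b) capacity of a 4-lane DP link.
HBR2 = 4 * 5.4 * 0.8   # 17.28 Gbit/s (DP1.2)
HBR3 = 4 * 8.1 * 0.8   # 25.92 Gbit/s (DP1.3/1.4)

for hz in (120, 144):
    need = required_gbps(3440, 1440, hz)
    print(f"{hz} Hz needs ~{need:.1f} Gbit/s -> "
          f"HBR2 {'OK' if need <= HBR2 else 'exceeded'}, "
          f"HBR3 {'OK' if need <= HBR3 else 'exceeded'}")
```

With these assumptions, 120 Hz needs roughly 15.9 Gbit/s (fits in HBR2's 17.28 Gbit/s), while 144 Hz needs roughly 19 Gbit/s, which exceeds HBR2 but fits comfortably in HBR3 — consistent with the G version topping out at 120 Hz and the F version reaching 144 Hz.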
 

Interesting. Thank you for the information.
Would this convince you to buy the F-version or would you buy the G-version?
 
Personally, the G version. I play a lot of single-player games, so I don't feel the need to turn down any settings to get over 120 fps. If I were purely into multiplayer gaming, I would buy one of the 1080p ultrawides that overclock to 165 Hz.
 

Yeah, thanks for your input. I too play a lot of single-player games, so I think the G version is for me as well.

I'm coming from the LG 34UC79G, so 1080p ultrawide, 144 Hz, FreeSync. I only had tearing issues in Skyrim, since it locks the FPS at 60 (no issues in Fallout 4, same engine).

But I have never really experienced G-SYNC on an everyday basis, so maybe I just don't know what I'm missing. Also, holding 144 Hz at 1440p ultrawide seems pretty hard even with a 2080 Ti, especially in, say, two years.
 
Also, kosatec.de is listing the 950G at €1250 after tax. But they are a B2B company, so I'm not sure what that means for us consumers.

Price examples from Kosatec, with amazon.de prices in parentheses:
Samsung 860 Evo = €80 (€80)
Asus Turbo 1080 Ti = €690 (€770)
8700K = €330 (€399)

I could post a screengrab if someone wants confirmation.
 
> Interesting. Thank you for the information.
> Would this convince you to buy the F-version or would you buy the G-version?

If I needed something by November, and assuming the overclocked G-SYNC module has ZERO negative side effects (the jury is still out on that), I'd get the G version.

For many of the titles I play, FPS oscillates quite a bit depending on scene complexity. In my view, this is where adaptive sync shines, as it makes the resulting animation much more fluid. Solving the tearing problem (as V-SYNC does), which is the primary issue most people think adaptive sync addresses, I consider merely an added bonus. I'm unwilling to give up my nVidia 1080, and I'm unwilling to give up adaptive sync, so I'm reluctantly tied to G-SYNC. Compared to that, the additional 24 Hz of refresh rate and the (faux-HDR) DisplayHDR 400 rating the F version brings to the table seem minor to me.

If AMD's GPUs were at least somewhat performance and price competitive at the high end, OR I didn't purchase high end GPUs, I'd go for the F version and an AMD graphics card. Unfortunately, AMD recovering in the high end GPU space seems unlikely to happen anytime soon. :-(

That being said, the idealist in me just can't get past the idea of spending >$1000, in Q4 2018, on an overclocked DP1.2 controller. DP1.2 is outdated technology from 2010, an eternity ago for computer hardware. At the same time, I'm baffled by the abysmal state of nVidia's DP1.4 G-SYNC HDR module. IMHO, if nVidia can't offer a better DP1.4 G-SYNC implementation, then G-SYNC is dead: they must either release a better DP1.4 controller SOON (lower cost, passively cooled, etc.), or enable VESA Adaptive-Sync (a.k.a. FreeSync) on their cards. Neither of those two scenarios motivates me to spend money on current G-SYNC technology.

I'd like to spend money, but nVidia's crappy work in this area is preventing me from doing so. I'm currently twiddling my thumbs and waiting for nVidia to put all their "cards" on the table. :)

Both F and G are good monitors though, and I learned a lot researching them and reading this thread, which is why I'm still here ;-)
 
> Interesting. Thank you for the information.
> Would this convince you to buy the F-version or would you buy the G-version?

For me, the $64,000 question is: are there reported issues with the overclocked G-SYNC controller? Because it looks like there definitely are with current overclocked G-SYNC monitors, and no one knows whether those issues are panel related or controller related.

This sucks, because it means one should technically wait three or four months before deciding to purchase. If the overclocked controller is deemed solid, then I will definitely go G-SYNC.
 
> If I needed something by November, and assuming the overclocked G-SYNC module has ZERO negative side effects (the jury is still out on that), I'd get the G version. […]

I greatly appreciate this answer. Basically goes through the thoughts I have about this.
 