LG 34GK950G, 3440x1440, G-Sync, 120Hz

Roughly 949.99, I think.

Well, that's certainly a reasonable price for a new ultrawide that sets a new record spec-wise.

Looks like I'll be buying the FreeSync version to pair with my 2080 then, as there's no way I'm paying a £200 G-SYNC tax for 24 Hz less and no HDR. I'm also of the opinion that G-SYNC is overrated. It's become more of a superstition these days that you *must* have it. But I think the new RTX cards will provide high enough FPS that any tearing will be mostly unnoticeable.
 
If the G-SYNC module has to be overclocked for 120 Hz operation at this resolution, then it may behave like any other overclock: slight degradation happens over time, and what was stable a few months ago may no longer be stable. This is true not only for overclocking but for normal operation too. It's one of the reasons why parts like CPUs and GPUs ship with very conservative stock clocks and use much higher voltages than the majority of them need: this accounts not only for unit-to-unit differences in quality and capability that come from production imperfections, but also for degradation and wear over time. So there is very little chance that something pushed to its limits will stay stable there through prolonged intensive use. It will give up relatively quickly, and you will have to either increase the voltage (which may not help) or reduce the clocks, or in this case the refresh rate.

A few questions I have. Is the G-SYNC module itself actually being overclocked, or is the module working within its limits (it can do 120 Hz) and simply allowing the panel to be overclocked?

I was reading a bit more and found this tftcentral article: http://www.tftcentral.co.uk/articles/overclocked_refresh.htm#considerations

This is what stood out to me:
The ability to overclock is commonly linked to the controller board used in the monitor and some are capable of supporting an overclock while others simply won't allow anything.

Is the controller board the same thing as the G-SYNC module, or are they two separate components?
 
Is the controller board the same thing as the G-SYNC module, or are they two separate components?

Yes. They are the same thing. More strictly:

  • Controller (generic name)
  • Scaler (typically the name of the chip used in FreeSync monitors)
Both of the above refer to the monitor's main logic circuit. This controller receives the video signal from the GPU (according to a standard such as HDMI or DisplayPort) and "translates" it into correctly timed voltage pulses to switch the individual pixels on the panel as optimally as possible. Some circuits include additional features like (but not limited to) FreeSync, G-SYNC, and/or the ability to scale the received video signal (map it to a smaller or larger resolution).
  • Controller Board
  • G-SYNC module
Both of the above refer to the entire board on which the controller is mounted, including all supporting electronics (the controller itself, RAM used as video buffers, voltage stabilizers, etc.)

Here is a picture of the entire G-SYNC module, a.k.a. controller board:

https://goo.gl/images/H3REEX

The black square with "Altera" written on it is the actual controller (implemented using an FPGA). The entire assembly is the controller board.

As this particular controller board is built by nVidia with the goal of enabling G-SYNC, it's often called the "G-SYNC module".
 
Is the G-SYNC module itself actually being overclocked, or is the module working within its limits (it can do 120 Hz) and simply allowing the panel to be overclocked?
This has already been mentioned at least half a dozen times in this thread.

It's the G-SYNC module that IS overclocked. At 3440x1440, the DP1.2 G-SYNC module isn't designed to deliver a refresh rate above 100 Hz. Only by overclocking the G-SYNC module can this monitor achieve a refresh rate of 120 Hz. The panel is NOT overclocked. The panel supports a native refresh rate of 144 Hz, so for the panel to achieve a refresh rate of 120 Hz requires nothing special (certainly not an overclock).
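To put rough numbers on that, here's a quick sketch of the pixel throughput the module has to process at each refresh rate (active pixels only; real link timings add blanking intervals, so these are illustrative figures, not exact pixel clocks):

```python
# Rough sketch of the pixel throughput the G-SYNC module must handle.
# Counts active pixels only; real link timings add blanking intervals,
# so actual pixel clocks are somewhat higher. Illustrative numbers only.

def pixel_rate(width, height, hz):
    """Active pixels per second for a given display mode."""
    return width * height * hz

at_100 = pixel_rate(3440, 1440, 100)  # the module's designed operating point
at_120 = pixel_rate(3440, 1440, 120)  # the overclocked operating point

print(f"100 Hz: {at_100 / 1e6:.0f} Mpx/s")  # ~495 Mpx/s
print(f"120 Hz: {at_120 / 1e6:.0f} Mpx/s")  # ~594 Mpx/s
print(f"overclock demands ~{(at_120 / at_100 - 1) * 100:.0f}% more throughput")
```

In other words, running at 120 Hz asks the module to sustain roughly 20% more throughput than its designed 100 Hz operating point, which is exactly what "overclocking the module" means here.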
 
We all know that the UW5 panel is 144 Hz native and brings 100 cd/m² more brightness than the UW4, but if the G-SYNC module will make it act like a UW4 in terms of Hz, the 1399€ price is unjustifiable.
 
We all know that the UW5 panel is 144 Hz native and brings 100 cd/m² more brightness than the UW4, but if the G-SYNC module will make it act like a UW4 in terms of Hz, the 1399€ price is unjustifiable.

Yep, and it makes the FreeSync version a much more tempting buy for people getting the new RTX cards, as the glorious 144 Hz will put them to the test.
 
Yep, and it makes the FreeSync version a much more tempting buy for people getting the new RTX cards, as the glorious 144 Hz will put them to the test.

We have to wait for the RTX reviews, but I think they will not be as powerful compared to a 1080 Ti as people think. I mean, I think a lot of games won't reach 144 fps at 3440x1440 ultra.

 
We all know that the UW5 panel is 144 Hz native and brings 100 cd/m² more brightness than the UW4, but if the G-SYNC module will make it act like a UW4 in terms of Hz, the 1399€ price is unjustifiable.

You and ChrisPyzut are asking if the overclocked G-SYNC module will make the 34GK950G act/look like the older UW4 based monitors. We already know the answer. It's "No". It won't look the same.

Just the fact that LG spent time improving the black levels of their IPS panels and added a Nano IPS backlight means they will not look the same. Even setting that aside, if an overclocked G-SYNC module does cause issues, it's unlikely those issues will manifest in exactly the same way when hooked up to two different panels. The obvious answer to your question simply isn't useful: despite potentially behaving nothing like UW4-based monitors, overclocking the G-SYNC module may still cause issues.

Forget the UW4 panel.

What you should be asking is this:

What are the negative side effects of overclocking the G-SYNC module on the 34GK950G?

If there are no negative side effects, then the question becomes:

If there are no negative side effects, then what is the point of providing the ability to disable G-SYNC module overclocking in the OSD menu?

I've asked the above a few times already, but @Daniel - LG either doesn't know or can't provide an answer, or at least hasn't so far.
 
We have to wait for the RTX reviews, but I think they will not be as powerful compared to a 1080 Ti as people think. I mean, I think a lot of games won't reach 144 fps at 3440x1440 ultra.


The 2080, perhaps not. I predict a 120 FPS average for the 2080 @ 3440x1440, but I think the 2080 Ti has a strong chance of averaging 144 FPS at that resolution.

I currently own a FreeSync ultrawide myself and preordered the 2080, and I'm confident I'll be able to max out my monitor's 100 Hz refresh rate in all games. If that turns out to be the case, I'll sell it and buy one of these new LG ones.
 
This has already been mentioned at least half a dozen times in this thread.

It's the G-SYNC module that IS overclocked. At 3440x1440, the DP1.2 G-SYNC module isn't designed to deliver a refresh rate above 100 Hz. Only by overclocking the G-SYNC module can this monitor achieve a refresh rate of 120 Hz. The panel is NOT overclocked. The panel supports a native refresh rate of 144 Hz, so for the panel to achieve a refresh rate of 120 Hz requires nothing special (certainly not an overclock).

I know this panel isn't overclocked. What I meant to ask was: in the past, with the 100 Hz native panels, was the G-SYNC module actually being overclocked (as in pushed beyond its own limits), or was the G-SYNC module working within its limits but overclocking the panel itself?

That's what I meant to ask. From your post it seems like in the past the panel and the G-SYNC module were both working above their limits in OC mode, and now only the G-SYNC module will be. Which means there could very well be issues with not hitting 120 Hz (given the lack of certainty that comes with overclocking).
 
I know this panel isn't overclocked. What I meant to ask was: in the past, with the 100 Hz native panels, was the G-SYNC module actually being overclocked (as in pushed beyond its own limits), or was the G-SYNC module working within its limits but overclocking the panel itself?

That's what I meant to ask. From your post it seems like in the past the panel and the G-SYNC module were both working above their limits in OC mode, and now only the G-SYNC module will be. Which means there could very well be issues with not hitting 120 Hz (given the lack of certainty that comes with overclocking).

This is unknowable until the monitor is out there. This is the first 144 Hz panel that's been used with this module, limited to 120 Hz; I don't think that's ever been done before. It SHOULD be fine... there's no issue with the panel, but there does seem to be a question mark over the module itself, and whether it was potentially the reason for the flickering on the 100 Hz panels it was pushing to 120 Hz. It's always been assumed those issues stemmed from the panel being pushed beyond its native refresh rate (and that makes sense), but perhaps the G-SYNC module was the issue? Until this monitor has been out in the wild for a while, we can't say with any certainty that it will be 100% stable at 120 Hz.
 
Yes. They are the same thing. More strictly:

  • Controller (generic name)
  • Scaler (typically the name of the chip used in FreeSync monitors)
Both of the above refer to the monitor's main logic circuit. This controller receives the video signal from the GPU (according to a standard such as HDMI or DisplayPort) and "translates" it into correctly timed voltage pulses to switch the individual pixels on the panel as optimally as possible. Some circuits include additional features like (but not limited to) FreeSync, G-SYNC, and/or the ability to scale the received video signal (map it to a smaller or larger resolution).
  • Controller Board
  • G-SYNC module
Both of the above refer to the entire board on which the controller is mounted, including all supporting electronics (the controller itself, RAM used as video buffers, voltage stabilizers, etc.)

Here is a picture of the entire G-SYNC module, a.k.a. controller board:

https://goo.gl/images/H3REEX

The black square with "Altera" written on it is the actual controller (implemented using an FPGA). The entire assembly is the controller board.

As this particular controller board is built by nVidia with the goal of enabling G-SYNC, it's often called the "G-SYNC module".

Thank you for the explanation. So there is definitely a chance this monitor might not hit 120 Hz reliably. However, there shouldn't be flicker, panel-uniformity problems, or the other issues that come from overclocking the panel.

It seems it will just be a question of whether the board can hit 120 Hz or not. That's what I got from all this.
 
The 2080, perhaps not. I predict a 120 FPS average for the 2080 @ 3440x1440, but I think the 2080 Ti has a strong chance of averaging 144 FPS at that resolution.

Yep, and it makes the FreeSync version a much more tempting buy for people getting the new RTX cards, as the glorious 144 Hz will put them to the test.

I wouldn't really expect RTX cards to get you anywhere close to 144 Hz at 3440x1440 in demanding games. Despite some benchmarks made on compromised settings and then lying that it was maxed out (because you "don't need" that, it makes "no difference", "human eyes can only see" and other nonsense), the 1080 Ti is sometimes not even enough for 60 FPS at 3440x1440, while a 40% performance gain for Turing over Pascal is a rather optimistic estimate. The 2080 is probably going to be less than 10% faster than the 1080 Ti, so it won't get you too far. The 2080 Ti is the only Turing card with a chance of being over 40% faster than the previous generation, because the spec gap between the 2080 and the 2080 Ti is really huge this time, even bigger than in previous generations; if everything scales properly, it should be about 40% faster than a 2080 that should at least match the 1080 Ti. Still, the difference between the 2080 Ti and the 1080 Ti is way smaller than between the 1080 Ti and the 980 Ti, so definitely don't expect similar up-to-90% gains. OC vs OC, if Turing has a bit more headroom than Pascal, 50% could be possible at the high end, though I'm afraid it will only be around 35%, which would mean that we still won't have a good enough single GPU available on the market.

Average performance doesn't matter. It is very easy to throw in some high numbers from undemanding games and make the average across all games tested exceed 100 FPS, while in reality you will have a problem with hitting 60 in 1/3 of them. The fact that one game runs at 200 FPS won't make another game that runs at 50 FPS run any faster. That's the difference between real performance and the artificial numbers you are quoting.
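To illustrate the point about averages with some invented numbers (purely hypothetical FPS results, not real benchmarks):

```python
# Invented per-game FPS numbers (purely hypothetical, not real benchmarks)
# showing how a healthy-looking average can hide games that miss 60 FPS.
fps = [210, 160, 130, 120, 55, 50]

average = sum(fps) / len(fps)
below_60 = [f for f in fps if f < 60]

print(f"average: {average:.0f} FPS")                      # ~121 FPS, looks great
print(f"games under 60 FPS: {len(below_60)}/{len(fps)}")  # yet 1/3 miss 60
```

The average clears 120 FPS, yet a third of the (made-up) titles still can't hold 60.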

And what is probably most important: everything I have said so far applies only to GPU-bound scenarios, where your only limitation is GPU power. There are also many games where you can have all the power money can buy and still struggle with performance. That is not uncommon, provided you don't limit yourself strictly to the mainstream of the newest games with the newest engines, great multithreading support, and so on. Joker's chart that you posted is built from exactly such games; it's an extremely controlled and limited environment to benchmark, with almost no diversity. He might as well have benchmarked the same game ten times and it wouldn't change the usefulness of the chart much. Artificial numbers from a controlled, undiversified environment are nowhere near representative of real-world performance.

Trading G-SYNC for an additional 24 Hz that you are never going to hit in any kind of demanding game (especially on a non-Ti 2080) is not the best deal; it isn't much more than actively looking for a justification to pay less.
 
I wouldn't really expect RTX cards to get you anywhere close to 144 Hz at 3440x1440 in demanding games.

That's because you are clueless. The 1080 Ti averages 120 FPS @ 2560x1440 at ultra settings, so it's fair to assume the 2080 Ti, which is 30-40% more powerful, will be able to average 144 FPS or close to it @ 3440x1440. The 2080 Ti is already said to average 100 fps @ 4K, and that's without DLSS enabled.


Despite some benchmarks made on compromised settings and then lying

Because all the tech review websites are lying and you're the one with the "truth", am I right? Lmfao, get the **** outta here dude.

1080 Ti is sometimes not even enough for 60 FPS at 3440x1440

Now I'm afraid that's the biggest pile of ******** I've ever heard. The 1080 Ti is a 100+ fps card @ 3440x1440. Hell, even my Vega 64 averaged 80 fps @ 3440x1440. There's no way a 1080 Ti fails to get 60 fps unless the game is still in alpha/beta.

2080 is probably going to be less than 10% faster than 1080 Ti, so it won't get you too far.

It will still be far enough to max my 100hz Ultrawide in all games. :)

which would mean that we still won't have a good enough single GPU available on the market.

Um yes we do... It's called the 2080Ti for 4k and the 2080 for 1440p.

Average performance doesn't matter

Average performance benchmarks do matter, as they represent real-world performance across a broad range of games tested at the highest settings. Only a liar or an uninformed idiot would say they don't matter.

while in reality you will have a problem with hitting 60 in 1/3 of them.

What a load of utter nonsense.. I'm convinced you are clueless about how benchmarks are tested.
 
I know this panel isn't overclocked. What I meant to ask was: in the past, with the 100 Hz native panels, was the G-SYNC module actually being overclocked (as in pushed beyond its own limits), or was the G-SYNC module working within its limits but overclocking the panel itself?
  • For monitors using the UW4 panel at 3440x1440@100Hz and using the DP1.2 G-SYNC module, neither the G-SYNC module nor the panel was overclocked.
  • If the panel is overclocked, then it must be the controller that does the panel overclocking (it's the only component in the monitor that drives the panel, so there is no other way to overclock a panel). However, that doesn't necessarily mean the controller itself is overclocked, e.g. any FreeSync monitor with a DP1.3 controller could have easily reached a 120 Hz refresh rate without overclocking the controller.
  • For monitors using the UW4 panel at 3440x1440@120Hz and using the DP1.2 G-SYNC module, both the G-SYNC module and the panel were overclocked. If an overclocked G-SYNC module is necessary to reach 120 Hz for the UW5, then it must also have been overclocked to reach 120 Hz on all UW4 models.
HOWEVER! The G-SYNC module being overclocked doesn't mean it's being pushed beyond its own limits! Being pushed beyond its limits at 3440x1440@120Hz would imply it doesn't work as well as it does at 100 Hz. So far we don't know whether that is the case.
 
My current monitor is a G-SYNC 2560x1440 @ 165 Hz (OC) model. Isn't that more of a burden on the G-SYNC module than 3440x1440 @ 120 Hz? Wouldn't that mean the previous issues were most likely with the panel and not the module, and therefore this new panel should overclock to 120 Hz with no problem?
 
My current monitor is a G-SYNC 2560x1440 @ 165 Hz (OC) model. Isn't that more of a burden on the G-SYNC module than 3440x1440 @ 120 Hz? Wouldn't that mean the previous issues were most likely with the panel and not the module, and therefore this new panel should overclock to 120 Hz with no problem?
Based on this calculator, 3440x1440 @ 120 Hz is less demanding than 2560x1440 @ 165 Hz.
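For the curious, the raw pixel-rate math behind that comparison (active pixels only; blanking intervals are ignored, so treat these as ballpark figures):

```python
# Compare the raw pixel throughput of the two modes (active pixels only;
# blanking intervals are ignored, so these are ballpark figures).

def pixel_rate(width, height, hz):
    """Active pixels per second for a given display mode."""
    return width * height * hz

uw_120 = pixel_rate(3440, 1440, 120)
qhd_165 = pixel_rate(2560, 1440, 165)

print(f"3440x1440 @ 120 Hz: {uw_120 / 1e6:.0f} Mpx/s")   # ~594 Mpx/s
print(f"2560x1440 @ 165 Hz: {qhd_165 / 1e6:.0f} Mpx/s")  # ~608 Mpx/s
assert uw_120 < qhd_165  # the 120 Hz ultrawide mode is the lighter load
```

So in terms of pixels pushed per second, 2560x1440 @ 165 Hz really is the slightly heavier mode, which supports the comparison above.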
 
Hello,

Can anyone help? My monitor has broken, and I need a replacement!
I had a 1080p @ 60 Hz display, running an AMD R9-series card.

But I would love a 1440p @ 144 Hz monitor for longevity, and I am about to upgrade to a 1070 Ti or 1080.

What is the best bang for buck?
 