LG 34GK950G, 3440x1440, G-Sync, 120Hz

But for example, we have the same idea: overall, a cheaper DP 1.4 G-Sync module. I cannot see this extreme $500 module being a good solution, though, because what percentage of monitor users would actually buy a $2000+ monitor? Probably less than 0.1%.
 
HDR will be enticing once Microsoft properly supports it on Windows 10. As it is, it's half-baked at best and usually buggy.
 
An observation after reading the last page of posts: there seems to be an assumption that DisplayPort version and the G-Sync module determine which HDR standard a monitor can offer, but I believe the DP connection and the module either support HDR or they don't. Offering HDR400, HDR600 or HDR1000 depends on the capability of the panel.
 
It was an example, because Nvidia apparently requires HDR1000 just to use the new module. Currently they need either to reduce the cost of the new module or to offer two DP 1.4 modules.
 
But for example, we have the same idea: overall, a cheaper DP 1.4 G-Sync module. I cannot see this extreme $500 module being a good solution, though, because what percentage of monitor users would actually buy a $2000+ monitor? Probably less than 0.1%.

The module's COST price is ~$800, not $500.
$500 is the estimated price of the FPGA alone, assuming Nvidia gets a big discount off its street price of $2,600.
The rest of the parts on the board cost another ~$300.
 

Yes, again, as I said, mine was an example/estimate, but you get my general point here. Currently you have either a module that is worse than DP 1.4 FreeSync or a module that is extreme, HDR1000-only. With the various panels coming out in the next 12 months, either G-Sync will be worse or Nvidia will need to do something about it.
 

True, I completely agree with you, mate. You can compare the 850G and the 850F right now.
The FreeSync monitor is better, even when it comes to colour gamut etc.
And 1000 nits on a PC monitor is too much.
On a 55" TV, yes, it's perfect, because you are a few meters away, but at 0.5 m away you will go blind.
 
On 55" TV yes is perfect, because you are few meters away, but at 0.5m away you will become blind.

No idea what you're talking about; I just sat down to play some PUBG on my 1000-nit monitor...

1000nits.gif
 
And 1000 nits on a PC monitor is too much.
On a 55" TV, yes, it's perfect, because you are a few meters away, but at 0.5 m away you will go blind.

That's not really how it works. Most of the time HDR is actually dimmer than SDR, and there are many complaints in the TV community that HDR is too dim for bright-room viewing. Of all the HDR movies I have watched so far, average scene luminance was always below 200 nits, with some movies going as low as 120-130 nits. High brightness like the mentioned 1000 nits is used only on details; you are not going to get your eyes burned, because not only is HDR dimmer overall, but the current tone-mapping trend for most manufacturers is to significantly lower the APL (average picture level, i.e. average brightness) to preserve highlight detail, so the average brightness is getting even lower than it already is by default (especially for games, which are often mastered to 10,000 nits for whatever reason in the world).

The overbright picture is typical for screens with a very poor HDR specification and poor dimming algorithms, for example 6 or 8 edge-lit zones with 600 nits of peak brightness like some Samsung HDR displays that are already available. That kind of display is not capable of any HDR at all, because its dimming zones are the size of a truck wheel; backlight precision is extremely poor, so if it needs to light up any detail to high luminance, it blows out half of the picture in the process.

Real HDR does not look like that and is never overblown or eye-hurting. Like I said, it is actually dimmer than SDR, and it takes time to get used to the dimmer picture and appreciate the precision and detail of HDR after so many years of watching content with brightness averaged across the entire picture, especially if you used high brightness levels for SDR.
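To make the tone-mapping point concrete, here is a minimal toy sketch (my own illustrative curve, not any manufacturer's actual algorithm; the 100-nit knee and 600-nit peak are assumed values): mid-tones pass through untouched while highlights are compressed toward the display's peak, which is why APL stays low while highlight detail survives.

```python
# Toy highlight roll-off tone map (illustrative only, not a real manufacturer
# algorithm). Mid-tones below the knee pass through unchanged; everything above
# is compressed smoothly toward the display's peak, so the average picture
# level stays low while highlight detail survives instead of clipping.

def tonemap(nits: float, knee: float = 100.0, peak: float = 600.0) -> float:
    """Map mastered scene luminance (nits) onto a display with `peak` nits."""
    if nits <= knee:
        return nits  # shadows and mid-tones untouched -> dim-looking average
    excess = nits - knee
    headroom = peak - knee
    # Rational roll-off: approaches `peak` asymptotically, never clips hard.
    return knee + headroom * excess / (excess + headroom)

if __name__ == "__main__":
    for scene in (50, 130, 200, 1000, 10_000):  # up to 10,000-nit game masters
        print(f"{scene:>6} nits mastered -> {tonemap(scene):6.1f} nits shown")
```

With these assumed values, a 10,000-nit mastered highlight ends up around 576 nits on screen while a 130-nit mid-tone shows at roughly 119, so the picture as a whole stays dim even though highlights keep their gradation.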
 
I was going to say that with HDR10's static metadata you need an HDR1000 monitor just to display it properly. TVs with lower peak nits have a lot of advanced tone mapping going on, which means they still look good, or they support HDR10+ or Dolby Vision, which use dynamic metadata.

I would say HDR400 is not really HDR at all, HDR600 would be passable, and HDR1000 would be actually good HDR. I'm not sure whether these monitors can do any kind of processing, for example an HDR400 monitor displaying HDR10 content; probably not, I would guess, as they are not TVs, they are monitors without all of the processing you get in TVs.
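To illustrate the processing difference being speculated about here, a toy comparison (pure illustration with assumed values, not how any particular monitor behaves) of hard clipping at a 400-nit peak versus the smooth roll-off sketched earlier:

```python
# Toy comparison (assumed values): hard clipping at a 400-nit peak vs. a
# smooth roll-off. A display that simply clips turns everything above its
# peak into a flat white blob; one that tone-maps keeps highlight
# gradations, at the cost of overall brightness.

def clip(nits: float, peak: float = 400.0) -> float:
    return min(nits, peak)  # no processing: detail above `peak` is lost

def rolloff(nits: float, knee: float = 100.0, peak: float = 400.0) -> float:
    if nits <= knee:
        return nits
    excess, headroom = nits - knee, peak - knee
    return knee + headroom * excess / (excess + headroom)

for scene in (300, 500, 1000, 4000):
    print(f"{scene:>5} nits: clip -> {clip(scene):5.1f}, "
          f"roll-off -> {rolloff(scene):5.1f}")
```

Note how 500, 1000 and 4000 nits all clip to the same flat 400, while the roll-off still distinguishes them.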
 
I have a question: which panel do you think is better?
  • LG 34GK950F
  • LG 34WK95C
I can't decide.
I will use it for: work (development) 70%, watching Netflix (HDR) 20% and gaming 10%.
I know that the focus of the 34GK950F is gaming, but in principle there is no reason not to use it as a work monitor.
 

If the 34GK950 is factory calibrated and has the same level of accuracy as previous LG ultrawides like the 34UC98 or 38UC99, then there shouldn't be any reason why it wouldn't be good for work. However, the LG 34WK95 is probably noticeably better, with its 5160x2160 resolution, HDR600 and certainly a new panel, while we still don't know what improvements the UW5 panel will bring to the 34GK950, if any at all. For your use case the 34WK95 is the more suitable choice, as it is made exactly for work and multimedia, but there are two major issues.

The first and biggest one is that the 5160x2160 resolution is extremely demanding: GPU requirements are roughly twice as high as at 3440x1440, which I have tested multiple times. So for that 10% gaming you would have to buy the fastest GPU available and would still struggle at medium settings, which is far from optimal.

The other issue is that this display is flat, and you cannot sit close to a screen this wide if it is flat, because it is going to be uncomfortable and eye-hurting; this is the very reason for a curve. Without the curve, the edges of the screen are too far away compared to the center. But it all depends on your viewing distance: if you like to sit close to the display you most definitely need an aggressive curve, but if you sit further away, as people normally do when working, you shouldn't have much of an issue.
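For reference, the raw pixel arithmetic behind that GPU-load estimate (a back-of-the-envelope check, not a benchmark; real frame rates also depend on the game and settings):

```python
# Back-of-the-envelope pixel-count comparison between the two resolutions.
px_5k2k = 5160 * 2160    # 34WK95-class "5K2K" resolution
px_uwqhd = 3440 * 1440   # 34GK950-class UWQHD resolution

print(f"5160x2160: {px_5k2k:,} px")             # 11,145,600
print(f"3440x1440: {px_uwqhd:,} px")            # 4,953,600
print(f"ratio:     {px_5k2k / px_uwqhd:.2f}x")  # 2.25x the pixels to render
```

That is 2.25x the pixels, roughly in line with the doubled GPU requirement described above.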
 

Thanks for the fast answer, but you described the LG 34WK95U, not the LG 34WK95C :)
The C is also curved, also 34 inches, and will also come to market this or next month at about the same price (±50€).
Because of those facts, the decision is so hard.

Still interesting, though, because the LG 34WK95U is also on my watchlist. In general, I sit around 80 cm away from my monitor (eyes to screen).
 

Oh, I forgot that one is even around. By the way, what great naming: only one letter at the end of the name changes while the displays are drastically different. The 34WK95C is the successor to the LG 34UC98, the difference being Nano IPS and HDR400 (or rather fake HDR) support. If LG is going with the same scheme as in the last generation, the 34GK950 is going to be the high-refresh-rate version of the 34WK95C panel, similarly to the UW3 vs UW4 panels, only now LG is releasing both at the same time. They should be identical in terms of picture quality, as long as there aren't any non-standard issues on the 34GK950 like the "gamery gamma" issue known from the Alienware AW3418DW. But if everything goes well, the differences are going to be in looks, refresh rate, quality of the adaptive sync implementation (adequate to a gaming-oriented screen rather than slapped onto a non-gaming screen like with the LG 34UC88/98 or Samsung 34CF791, and very likely also the 34WK95C) and, of course, the price.
 
Do I understand correctly that we know it is the new UW5 panel, since the 950G is listed as Nano IPS and the UW4 is not a Nano IPS panel?
 

The Nano IPS layer can be applied to the UW4 panel (which was LG's original plan), so that alone doesn't prove anything. In reality, the only source saying that the new UW5 panel will be used in the 950G is @Daniel - LG here on this forum... all other information from LG that I can find on the web indicates that the UW4 panel will be used, including the product page on the LG Hong Kong site, which clearly states "120Hz (Overclock)" under the key features.
 

Thank you for the information, Stu. Much appreciated.
 
At this point there is so much confusion around these monitors that it is best to just wait for official specs and availability; the displays should be in stock by the end of the month. The new GPUs are also supposedly going to be unveiled on the 20th, so we should have full clarification on both the displays and the new GPUs by the end of the month.
 