LG 34GK950G, 3440x1440, G-Sync, 120Hz

At this point there is so much confusion around these monitors that it is best to just wait for official specs and availability; the displays should be in stock by the end of a month. The new GPUs are also supposedly going to be unveiled on the 20th, so we should have full clarification on both the displays and the new GPUs by the end of the month.

Your typo has proven true thus far... "the displays should be in stock by the end of a month"... the question is which month?! :p
 
German shops that had these displays listed a few days ago claimed a 3-week shipping time, so I assume that by now some shops should already have them in stock. Unfortunately those offers are no longer available, so I don't know how reliable they were, but on the other hand, why would they list them like that if the display wasn't about to release anytime soon?

I don't know; the thing about displays is that they don't have release dates or launch events. At some point they just start to appear in shops out of nowhere, sometimes even a few weeks before any review samples are sent to reviewers. But you don't really need to wait for reviews, because one good thing about the massive stagnation and predictability of the desktop display market is that you can just order a newly available display, compare it with your current one (if it is not too old), and if it does well you can keep it, confident that there won't be anything meaningfully better released for 2-3 years (in a similar price category). Especially since the LG-made IPS is the only 'almost quality' option on the mainstream market, and Samsung SVA panels and AUO's AHVA and AMVA are consistently very poor for desktop use, the chances of something much better suddenly coming out or being announced are extremely low.
 
Daniel has reported that the 34GK950G (and the 34GK950F) will use the newer LM340UW5 panel (the Alienware AW3418DW and Acer X34P use the LM340UW4 panel), which is natively 144Hz, but it will be limited to 120Hz due to the DP1.2 bandwidth limit.
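To see why DP1.2 is the limiting factor, here is a rough back-of-envelope sketch (assumptions: 8 bits per colour channel, 4 DP lanes, CVT-R2-style reduced blanking; the exact timings LG ships may differ):

```python
# Rough bandwidth check: 3440x1440 over DP1.2 (HBR2) vs DP1.4 (HBR3).
# Assumes 24 bits per pixel (8 bpc) and CVT-R2-style reduced blanking
# (80-pixel horizontal blank, ~460 us vertical blank); real timings may differ.

HBR2_PAYLOAD_GBPS = 4 * 5.4 * 8 / 10   # 4 lanes x 5.4 Gbps, 8b/10b coding -> 17.28
HBR3_PAYLOAD_GBPS = 4 * 8.1 * 8 / 10   # 4 lanes x 8.1 Gbps, 8b/10b coding -> 25.92

def required_gbps(h_active=3440, v_active=1440, refresh_hz=120, bits_per_pixel=24):
    h_total = h_active + 80                         # CVT-R2 fixed horizontal blanking
    v_total = v_active / (1 - 460e-6 * refresh_hz)  # ~460 us vertical blanking interval
    pixel_clock_hz = h_total * v_total * refresh_hz
    return pixel_clock_hz * bits_per_pixel / 1e9

for hz in (100, 120, 144):
    need = required_gbps(refresh_hz=hz)
    print(f"{hz:3d} Hz: ~{need:5.2f} Gbps  "
          f"(fits HBR2: {need < HBR2_PAYLOAD_GBPS}, fits HBR3: {need < HBR3_PAYLOAD_GBPS})")
```

On these rough numbers, 144Hz needs around 18-19 Gbps, just over HBR2's ~17.28 Gbps payload, while 120Hz stays comfortably under it; over HBR3 (DP1.4) even 144Hz fits with room to spare.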

Then why not use DP 1.4 instead? Graphics cards strong enough for this monitor already have DP 1.4, and others can use it in 1.2 mode. LG already did the SAME trick with the 34UM95 and DP 1.1/1.2.
The reason they are using 1.2 is to release another revision of this monitor (an "upgraded" "pro version" with 1.4) half a year later for more $$.

Don't buy it, this monitor is already old tech.
 

Because there is no DP1.4 Gsync module that supports SDR. If you want to have Gsync and DP1.4, you need to meet the HDR10 certification.
 
That's not exactly true. The DP 1.4 module does support SDR. There are simply two modules: the older ~$200 one with DP 1.2 and no HDR, and the new "$500" one with HDR10, which is going to be licensed only to real HDR displays, not fake ones with HDR600 and 6 dimming zones, let alone HDR400 ones with no zones and therefore no HDR at all. This is just NVIDIA's way of doing things: on one hand you get serious, professional products, the opposite of what you get from the lackluster competition, but this "all or nothing" approach to standards makes things expensive and isn't exactly realistic on the mainstream market, where big compromises are simply required to make the price attractive. That's still better than being fooled by fake standards, but it takes time and money.
 
In some respects, though, there is perhaps a plus side, depending on how you look at this. People have (rightly) been complaining for a long time about the HDR spec being muddied, with TVs emblazoning their marketing with that claim when the panels aren't anywhere near the 1000 nits required to display HDR properly. So it might be a step in the right direction in some ways to have a G-Sync module that can't be exploited in the same way on monitors that aren't actually proper, true HDR. Yes, they will be expensive, but at least it will be the full experience, and consumers won't be misled into buying something they think they're getting when they're not.
 
I also take it as a good thing; this is why I mentioned fake HDR and being fooled by fake standards. However, this $500 module is not very reasonable, not only because of the price but also because of availability. The chances of getting it inside some serious display (by which I mean not some overpriced, underdeveloped, embarrassing gamery garbage from poor manufacturers like Asus or Acer) are very low. This is why I am almost certain I am going to buy the 950G, because another display with factory calibration, quantum dot, G-Sync and a non-embarrassing design is not going to happen for years.

Screw all of these fake HDRs and whatnot; I will think about HDR once there are desktop displays with actual HDR support available on the market. HDR movies on my OLED TV are enough for now, and games are a completely different world. Considering how merciless the desktop environment and viewing distance are at exposing every single flaw of a display, there won't be any proper HDR desktop displays until some self-emissive technology enters the market. I don't believe any FALD can cut it in such an environment, especially since 90% of the issues come from viewing angles, and FALD does nothing about those.
 
I'm sure we'll find out more when it becomes widely available, but LG will probably price themselves out of the UW Gsync market anyhow.

[attached image: uwgs-price.jpg]
 
but I am not sure either of us have enough knowledge to say exactly what the Gsync module requires.

Yes, you can't know how much I (a random person on the internet) understand about these issues, and you should be skeptical. That's always a good policy.

Note however that I'm not the one playing armchair engineer (although I actually am one). I've not claimed to know how nVidia's DP1.4 G-SYNC module can be made any cheaper. You have, by postulating that nVidia should make it cheaper by making an alternate "version" which omits support for HDR10.

I don't know how to fix nVidia's cost problem, because I don't know why nVidia made the design decisions they did.

What I DO KNOW is what HDR10 is (a protocol, defined by a set of standards, collectively also known as ITU-R BT.2100), and how irrelevant that protocol is in terms of contributing to the overall cost of the G-SYNC module. If you have any formal background in computing at all, you'll be able to read up on that and the DP1.4 standard and conclude that this isn't what contributes to the cost. I might be incorrectly assuming that you have some background in computing (you do seem to know a thing or two, though), so I was hoping that's what you'd do.

As far as the DP1.4 G-SYNC module is concerned, all HDR10 requires (in comparison to DP1.4 without HDR) is the following:
  • adds ~25% to the DP bandwidth requirements (an extra two bits per colour channel, i.e. 30 bits per pixel instead of 24). In terms of cost this is irrelevant, because DP1.4 always transfers data from the GPU to the monitor at the maximum link rate, with or without HDR (for HBR3 data is always transferred at 25.92 Gbps). Bandwidth that isn't used is simply padded with filler until the next frame is ready to be sent to the monitor.
  • adds <0.1% for HDR-related metadata (information describing what is being transferred)
That's it. All that is solved in software/firmware. In terms of cost, it just doesn't matter what is sent across the connection. Compared to DP1.2, all the extra cost is incurred by having to support higher bandwidths, but since transfer rates are fixed, that cost is incurred with or without HDR support.
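As a quick sanity check on those percentages (a sketch only, assuming 4:4:4 RGB and a roughly 782 MHz pixel clock for 3440x1440 at 144Hz with reduced blanking, which is my estimate rather than a published figure):

```python
# HDR10's bandwidth overhead is just the 8-bit -> 10-bit per-channel step;
# the HBR3 link itself always runs at the same fixed payload rate.

sdr_bpp = 3 * 8    # 24 bits per pixel (8 bits per colour channel)
hdr_bpp = 3 * 10   # 30 bits per pixel (10 bits per colour channel)
print(f"extra bandwidth for HDR10: {(hdr_bpp / sdr_bpp - 1) * 100:.0f}%")  # -> 25%

hbr3_payload_gbps = 4 * 8.1 * 8 / 10   # 4 lanes x 8.1 Gbps, 8b/10b -> 25.92 Gbps
pixel_clock_hz = 782e6                 # assumed: ~3440x1440@144 with reduced blanking
for bpp in (sdr_bpp, hdr_bpp):
    need = pixel_clock_hz * bpp / 1e9
    print(f"{bpp} bpp at 144 Hz: ~{need:.1f} Gbps of the fixed {hbr3_payload_gbps:.2f} Gbps payload")
```

Either way the link runs at the same 25.92 Gbps and the unused capacity is just padding, which is the point above: the 10-bit signalling itself adds essentially nothing to the module's cost.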

HDR support impacts the cost of the monitor primarily by requiring a much more capable backlight assembly (particularly if that means FALD) and things like quantum dot (QLED) or Nano IPS layers. Those things are simply independent of the DP1.4 G-SYNC module.

I'm sorry to harp on about this. I've just noticed a lot of confusion here about what HDR is and how it works, which is further evidenced by statements like:

"the DP1.4 G-SYNC module doesn't support SDR"
Nonsensical, because SDR is a subset of HDR. You can't have HDR without also having SDR supported implicitly.

And 1000 nits on PC monitor is too much.
Nonsensical, because, as others explained, peak brightness is reserved only for very small highlights. For some reason it's a very widespread misconception that an HDR monitor would bombard you with 1000 nits at all times, when not even televisions do that.

etc.
etc.
etc.

I was hoping to help improve that, and thought your point was as good a place to start as any. I'll leave it at that.

Otherwise, we fully agree that the current situation is unsustainable. If nVidia can't make their G-SYNC module less expensive, then they might as well eliminate G-SYNC now. If $500-$800 is a correct estimate, then it's simply priced way out of the market and isn't competitive at all.
 
Then why not use DP 1.4 instead? Graphics cards strong enough for this monitor already have DP 1.4, and others can use it in 1.2 mode. LG already did the SAME trick with the 34UM95 and DP 1.1/1.2.
The reason they are using 1.2 is to release another revision of this monitor (an "upgraded" "pro version" with 1.4) half a year later for more $$.

Don't buy it, this monitor is already old tech.

It's not that simple/conspiratorial. Even if LG wanted to use DP1.4 and G-SYNC, they can't do so without making the monitor much more expensive, at least $500, likely more, because that is apparently how insanely expensive nVidia's DP1.4 G-SYNC module is. That's simply not a realistic solution. LG is forced to choose between:
  • release the monitor using DP1.4 G-SYNC, at a price nobody will pay, or
  • use DP1.2 G-SYNC, and lose part of the enthusiast community (as they aren't willing to accept the lower refresh rate that DP1.2 entails).
Neither choice is great. This problem is specific to nVidia and G-SYNC. AMD doesn't have this problem, because their implementation solves most of the adaptive sync issues within the GPU, rather than adding a separate computational module to the monitor. Up until now this wasn't as much of a problem for nVidia, as their DP1.2 module "only" added an extra $200. DP1.4's higher bandwidth requirements meant nVidia needed a far more capable FPGA, and that was apparently only available at a much higher cost.
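For a sense of the jump that implies for the module's FPGA, here is the simple arithmetic from the published DP lane rates (nothing G-SYNC-specific, just the link budget the module has to terminate):

```python
# DisplayPort main-link payload rates the G-SYNC module has to process in real time.
# 4 lanes, 8b/10b line coding in both generations.
LANES = 4

def payload_gbps(lane_rate_gbps):
    return LANES * lane_rate_gbps * 8 / 10  # strip the 8b/10b coding overhead

hbr2 = payload_gbps(5.4)   # DP1.2 -> 17.28 Gbps
hbr3 = payload_gbps(8.1)   # DP1.4 -> 25.92 Gbps
print(f"DP1.2 (HBR2): {hbr2:.2f} Gbps payload")
print(f"DP1.4 (HBR3): {hbr3:.2f} Gbps payload ({hbr3 / hbr2 - 1:+.0%} over HBR2)")
```

Processing 50% more data in real time pushes the design into a much faster (and pricier) FPGA class, which would line up with the reported jump from a ~$200 module to a ~$500 one.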
 
Basically, Nvidia are controlling which monitors can claim both HDR and G-Sync, and they do this by controlling who gets the new G-Sync module. They only want G-Sync associated with HDR1000, and will only allow the module to be used in monitors that can achieve that HDR standard, to avoid poor HDR performance being associated with Nvidia.

The new G-Sync module will of course work with SDR, HDR400 and HDR600 panels; but Nvidia is not allowing it by controlling access to the module.
 