LG 34GK950G, 3440x1440, G-Sync, 120Hz

Normal monitors are at 300 nits. The FreeSync is 400 nits typical and 550 peak. You don't need local dimming to have a nice HDR experience; personally I prefer not to have it, as local dimming causes a lot of fidelity issues. So it's almost double the peak of a normal ultrawide G-Sync monitor. Of course, what you want is subjective. But just because it misses the DisplayHDR 600 spec by 50 nits doesn't mean it's only the minimum DisplayHDR 400 spec (it's not).

To put things into perspective, the 5K ultrawide is 450 nits typical, yet DisplayHDR600, because the peak is over 600 (750 I believe). No local dimming there either.

Correct me if I am wrong, but the G-Sync version already has 400 nits due to the UW5 panel.
 

Yes, but it can't do HDR because of the older G-Sync module, which only supports DisplayPort 1.2. The FreeSync version doesn't have this problem. It's an odd situation with these two monitors... you're paying more for less with the G-Sync model.
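To put rough numbers on why DisplayPort 1.2 is the bottleneck for HDR at this resolution, here is a back-of-the-envelope sketch. The blanking figures are assumed, approximate reduced-blanking values, not taken from the monitor's actual EDID:

```python
# Rough sketch: does 3440x1440 @ 120 Hz fit into DisplayPort 1.2?
# DP 1.2 HBR2: 4 lanes x 5.4 Gbit/s, minus 8b/10b encoding overhead.
DP12_EFFECTIVE_GBPS = 17.28

def required_gbps(h_active, v_active, refresh_hz, bits_per_pixel,
                  h_blank=160, v_blank=41):
    """Approximate uncompressed video data rate in Gbit/s.

    Blanking intervals are assumed reduced-blanking estimates."""
    pixel_clock = (h_active + h_blank) * (v_active + v_blank) * refresh_hz
    return pixel_clock * bits_per_pixel / 1e9

# 8 bpc RGB (24 bit/px) vs 10 bpc RGB (30 bit/px, what HDR10 wants)
sdr = required_gbps(3440, 1440, 120, 24)
hdr = required_gbps(3440, 1440, 120, 30)
print(f"8-bit:  {sdr:.1f} Gbit/s, fits: {sdr < DP12_EFFECTIVE_GBPS}")
print(f"10-bit: {hdr:.1f} Gbit/s, fits: {hdr < DP12_EFFECTIVE_GBPS}")
```

Under these assumptions, 8-bit at 120 Hz squeezes under the limit while 10-bit does not, which is consistent with the G-Sync model being stuck without HDR.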
 
Man this thread was a wild ride

What do you guys think? Should I wait 2 months for this (hoping it won't get delayed) or just buy the AW34 now? I'm only interested in gaming and nothing else. Do I need DCI-P3 and Nano IPS if I'm only using this monitor for games?
 
LG.Display IPS LM375QW2 37" curved 2300R 144Hz HDR1000 3840x1600 Q1 2019
 
tbh I would wait the 2 months for Black Friday in this regard.
Unfortunately in Ireland that doesn't apply, and discounts for electronics are ******* abysmal. Regardless of whether it's the UK or Ireland, they give discounts on selected items that no one buys.


But my question still stands: would the 950G make any difference to someone who owns an AW34 and is only using it for games?
 
LG.Display IPS LM375QW2 37" curved 2300R 144Hz HDR1000 3840x1600 Q1 2019
I am just pulling this out of my ass, but I very much doubt this will be released in 2019, and it will most definitely cost you an arm and a leg.

And a fan cooling the G-Sync module... So dumb.

It does sound like the dream panel for a monitor, though.
 
Unfortunately in Ireland that doesn't apply, and discounts for electronics are ******* abysmal. Regardless of whether it's the UK or Ireland, they give discounts on selected items that no one buys.


But my question still stands: would the 950G make any difference to someone who owns an AW34 and is only using it for games?
Personally I do not see the point in upgrading from an AW to a 950G. It is an upgrade for sure, but not by a huge amount. Seems like a waste of money to me.

But if you are deciding between the two and have the extra cash to spend, the 950G seems to be the better choice.
 
But my question still stands: would the 950G make any difference to someone who owns an AW34 and is only using it for games?

Will likely be slightly better color/picture quality, but the main attraction (imo) is to see whether the flicker issues sporadically reported by X34P and AW34 owners are eliminated entirely. In other words, whether it will be rock solid at the 120Hz overclock all the time (and months/years down the road)...
 
I am just pulling this out of my ass, but I very much doubt this will be released in 2019, and it will most definitely cost you an arm and a leg.

And a fan cooling the G-Sync module... So dumb.

It does sound like the dream panel for a monitor, though.

Why is it dumb? It generates a lot of heat. Is having a fan on your GPU or CPU dumb?
 
Man this thread was a wild ride

What do you guys think? Should I wait 2 months for this (hoping it won't get delayed) or just buy the AW34 now? I'm only interested in gaming and nothing else. Do I need DCI-P3 and Nano IPS if I'm only using this monitor for games?

Don’t think you really NEED anything as such. It’s up to you. If you want something now, then the AW34 is the obvious choice. The panel in the 950G should be technically better than the AW34's, but the main reason to wait is that we know there are reports of flickering etc. with the AW34, and we don't know what the 950G will bring yet. Of course, there are also price differences to take into consideration, and design, which is down to personal preference.
 
Why is it dumb? It generates a lot of heat. Is having a fan on your GPU or CPU dumb?

I am not questioning the function of the fan.
A moving part like that is bound to break or have issues; not in all monitors, of course, but issues with this will appear, I am certain.

In a computer I can just change it. In a monitor it's going to be a bigger issue. Hopefully they provide an easy way to access it for us, but probably not. And if the warranty has run out (LG is usually two years, if I am not mistaken), that is going to come out of my wallet or my time.

I would rather see them choose a less hot FPGA or solve it some way with passive cooling.
 
Why is it dumb? It generates a lot of heat. Is having a fan on your GPU or CPU dumb?

It's dumb that the new module needs a fan to begin with... Nvidia should develop actual ASICs instead of using expensive FPGAs. An ASIC could be passively cooled.

It would kill two birds with one stone: lower the cost of G-Sync and remove the ridiculous need to put fans in monitors that otherwise wouldn't need them.
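As a rough illustration of why a hot FPGA ends up with a fan while a lower-power ASIC could go passive, here is a steady-state thermal sketch. Every number in it is an assumed, illustrative figure, not a measurement of the actual module:

```python
# Back-of-the-envelope: can a chip be passively cooled inside a monitor?
# All numbers below are illustrative assumptions, not measured values.

AMBIENT_C = 40.0        # assumed air temperature inside the enclosure
MAX_JUNCTION_C = 100.0  # assumed junction temperature limit

def junction_temp(power_w, theta_ja_c_per_w, ambient_c=AMBIENT_C):
    """Steady-state junction temperature: T_j = T_a + P * theta_JA."""
    return ambient_c + power_w * theta_ja_c_per_w

# Assumed: an FPGA module dissipating ~15 W with a modest passive heatsink
# (theta_JA ~ 5 C/W), vs. a ~3 W ASIC with the same heatsink.
fpga_tj = junction_temp(15.0, 5.0)  # 40 + 15*5 = 115 C -> needs a fan
asic_tj = junction_temp(3.0, 5.0)   # 40 + 3*5  = 55 C  -> fine passively
print(fpga_tj > MAX_JUNCTION_C, asic_tj < MAX_JUNCTION_C)
```

With these made-up but plausible figures, the 15 W part overshoots its limit passively while the 3 W part has plenty of margin, which is the whole argument for an ASIC here.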
 

Personally I do not see the point in upgrading from an AW to a 950G. It is an upgrade for sure, but not by a huge amount. Seems like a waste of money to me.

But if you are deciding between the two and have the extra cash to spend, the 950G seems to be the better choice.
So I made a mistake in ordering an AW34 then... hmm, I can always refund it.
 

I am just a dude with opinions. I am not necessarily right. :)

I can in a lot of ways see that the DW is a much better purchase, especially if you get it for a good price.

The DW is better "bang for the buck", but the 950G will be a slightly better monitor, by the looks of it.
 
There are answers to some questions on LG US 950G product page, including this:

[screenshot of a Q&A answer from the LG US 950G product page]


Hard to say how reliable this info is, but if it is true, then any spec on these LG product pages may be false or not up to date.

Man, this release is a mess. Not a bigger mess than Alienware's terrible out-of-the-box picture setup with no OSD controls provided to fix it (hence why I am still waiting), but really, LG, wtf. It is looking just like some CPU or GPU release: hype, leaks, estimated/rumored release dates, conflicting information from different sources... It is really getting tiring.

Normal monitors are at 300 nits. The FreeSync is 400 nits typical and 550 peak. You don't need local dimming to have a nice HDR experience; personally I prefer not to have it, as local dimming causes a lot of fidelity issues. So it's almost double the peak of a normal ultrawide G-Sync monitor. Of course, what you want is subjective. But just because it misses the DisplayHDR 600 spec by 50 nits doesn't mean it's only the minimum DisplayHDR 400 spec (it's not).

To put things into perspective, the 5K ultrawide is 450 nits typical, yet DisplayHDR600, because the peak is over 600 (750 I believe). No local dimming there either.

Why do you keep lying that the FreeSync version has a 550-nit peak? There is zero data to confirm that.

Also, you clearly don't understand what HDR is. The entire point of HDR is the ability to control the brightness of different parts of the screen independently, ideally with pixel-level accuracy. For an LED LCD screen, if you don't have local dimming, then you have only global control over brightness, so it is physically impossible for such a display to do real HDR.

Also, the 5K ultrawide has to have local dimming. Read the VESA requirements here:

https://displayhdr.org/performance-criteria/

It clearly states that local dimming has to be implemented for HDR600 and HDR1000 for LCD LED displays to be able to meet the contrast requirements.
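The tier logic being argued about can be sketched roughly like this. This is a deliberately simplified sketch of the VESA DisplayHDR 1.0 tiers: it checks only peak luminance and the local dimming requirement, while the real spec also covers sustained luminance, black level, color gamut and bit depth:

```python
# Simplified sketch of VESA DisplayHDR 1.0 tier qualification.
# Only peak luminance and local dimming are modeled here (assumption);
# the actual spec has several more criteria per tier.

TIERS = [  # (name, minimum peak nits, local dimming required)
    ("DisplayHDR 1000", 1000, True),
    ("DisplayHDR 600",  600,  True),
    ("DisplayHDR 400",  400,  False),
]

def best_tier(peak_nits, has_local_dimming):
    """Return the highest tier a panel could qualify for, or None."""
    for name, min_peak, needs_dimming in TIERS:
        if peak_nits >= min_peak and (has_local_dimming or not needs_dimming):
            return name
    return None

# A 550-nit edge-lit panel without local dimming tops out at the 400 tier,
# no matter how close its peak gets to 600.
print(best_tier(550, False))  # DisplayHDR 400
print(best_tier(750, True))   # DisplayHDR 600
```

Under this reading of the criteria, a panel without local dimming cannot reach the 600 tier regardless of peak brightness, which is the point of contention above.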
 