LG 34GK950G, 3440x1440, G-Sync, 120Hz

I haven't really followed any monitor launches this closely before. Is it usual for them to be this... hmm... disorganised? Sporadic releases in a few countries in Europe: Poland, Latvia and Italy. And then the F-version released before the G-version in the US, even though we expected to wait for the F-version, at least in the UK.

At least it seems the release in Europe is close. So that is pretty great.
 
It's true that the colours of a wide gamut backlight will look more vivid and saturated, but remember that a lot of more casual users will actually prefer this, especially for general gaming and multimedia.

Not having a reliable sRGB emulation mode is a problem if you need to do any colour-critical work, photo editing and that kind of thing, with standard sRGB content. Otherwise it's not really an issue for general users, and you might find it's actually preferable.
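If you want to see why that happens in numbers, here's a rough sketch (my own illustration, nothing official, and it assumes a Display P3-like native gamut purely as a stand-in for a wide gamut backlight) of where a fully saturated sRGB red ends up when the same signal drives wider primaries with no sRGB clamp:

```python
# Rough illustration only: why sRGB content looks oversaturated on a wide gamut
# panel with no sRGB emulation/clamp. Assumes a Display P3-like native gamut as
# a stand-in for the monitor's backlight (an assumption, not a measured gamut).
import numpy as np

# Linear RGB -> CIE XYZ matrices (D65 white point)
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])

P3_TO_XYZ = np.array([[0.4866, 0.2657, 0.1982],
                      [0.2290, 0.6917, 0.0793],
                      [0.0000, 0.0451, 1.0439]])

def xy_chromaticity(xyz):
    """CIE 1931 xy chromaticity coordinates from an XYZ triplet."""
    return xyz[:2] / xyz.sum()

linear_red = np.array([1.0, 0.0, 0.0])  # fully saturated sRGB red, already linearised

intended = xy_chromaticity(SRGB_TO_XYZ @ linear_red)  # how the content was mastered
shown = xy_chromaticity(P3_TO_XYZ @ linear_red)       # same signal on wider primaries

print("intended sRGB red xy:", intended.round(3))  # ~(0.640, 0.330)
print("unclamped wide gamut:", shown.round(3))     # ~(0.680, 0.320), visibly more saturated
```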

Consumer unawareness shouldn't be the baseline. It'd also go a long way if monitor reviewers looked at the PQ output beyond what the results tables show.

Having good numbers is one thing, but real content might show things like:
posterization
banding
black level uniformity
white level uniformity
picture depth (a combination of color performance against black levels)
HDR limitations

Personally, knowing that a large and growing number of gaming studios now use calibrated OLED panels as the reference for their color grading and HDR, you're in the same position as with TVs: the panel should be measured against reference monitors, using a baseline of content for evaluation beyond just the measurements.

If you're in the UK, it'd be worth attending next year's HDTVTest shootout. You'll get to meet a lot of industry insiders. I do think there is a fair amount of room for improvement in monitor reviews compared to what we get with TVs. Considering high-end monitors cost as much as these TVs, or damn near, we should expect the same level of depth from the reviews.
 
Awesome! You guys let me know how it is. My order is in at BH Photo, so hopefully they will get their stock in the next day or two!
First new monitor in the last 10 years!!
 
Consumer unawareness shouldn't be the baseline. It'd also go a long way if monitor reviewers looked at the PQ output beyond what the results tables show.

Having good numbers is one thing, but real content might show things like:
posterization
banding
black level uniformity
white level uniformity
picture depth (a combination of color performance against black levels)
HDR limitations

Personally, knowing that a large and growing number of gaming studios now use calibrated OLED panels as the reference for their color grading and HDR, you're in the same position as with TVs: the panel should be measured against reference monitors, using a baseline of content for evaluation beyond just the measurements.

If you're in the UK, it'd be worth attending next year's HDTVTest shootout. You'll get to meet a lot of industry insiders. I do think there is a fair amount of room for improvement in monitor reviews compared to what we get with TVs. Considering high-end monitors cost as much as these TVs, or damn near, we should expect the same level of depth from the reviews.

I'm not really sure what you meant about consumer unawareness; that doesn't make much sense to me. All I was doing was pointing out that, to most consumers, the fact the screen has wide gamut support is probably preferable.

Always happy to hear feedback and suggestions on what to include in reviews, but there's a fairly significant difference between reviewing TVs and reviewing monitors. A massive amount of time and effort goes into the current reviews I do, and I'm not sure there would be any way I could cram more into them without something else having to give. Some of what you've mentioned is already covered, like testing for banding/colour gradients, white level uniformity and HDR limitations. There's only so much time in the day, and only so much content/detail the majority of people really want to read anyway. Measurements provide a reliable means to quantify performance and make comparisons between different displays over time. They are always paired with analysis and commentary though.

Can you point me towards some of the good, really in-depth TV reviews you are referring to, please?
 
I do think there is a fair amount of room for improvement in monitor reviews compared to what we get with TVs. Considering high-end monitors cost as much as these TVs, or damn near, we should expect the same level of depth from the reviews.

I have no idea what TV reviews you're reading, and it seems you haven't read a single one of Baddass' reviews (or PCM's either), as they are both far more in-depth and detailed than I've seen any TV review get. Which stands to reason, given how much more there is to cover with a monitor with respect to its various functions and possible use cases.
 
I'm not really sure what you meant about consumer unawareness; that doesn't make much sense to me. All I was doing was pointing out that, to most consumers, the fact the screen has wide gamut support is probably preferable.

Always happy to hear feedback and suggestions on what to include in reviews, but there's a fairly significant difference between reviewing TVs and reviewing monitors. A massive amount of time and effort goes into the current reviews I do, and I'm not sure there would be any way I could cram more into them without something else having to give. Some of what you've mentioned is already covered, like testing for banding/colour gradients, white level uniformity and HDR limitations. There's only so much time in the day, and only so much content/detail the majority of people really want to read anyway. Measurements provide a reliable means to quantify performance and make comparisons between different displays over time. They are always paired with analysis and commentary though.

Can you point me towards some of the good, really in-depth TV reviews you are referring to, please?

John does a good job in his write-ups, as he describes the viewing experience in relation to the numbers and compares it to other sets where it makes sense: https://www.forbes.com/sites/johnar...cloudy-with-a-chance-of-awesome/#4d5325cbb346

Vincent goes through a fair amount of detail in his video reviews, covering key areas. He's a stickler for accuracy and always looking to match a reference panel, so his viewpoints are about how closely consumer sets match those panels: https://www.youtube.com/watch?v=7TY9GaggRqg

One thing that really helps both is having a set of reference content that's repeated, so the reader knows what the numbers mean when going from display to display. Example: if you always use a certain section of The Witcher 3 to look at shadow detail, then you can let the reader know that while the numbers look good, display B does a better/worse job of showing the details on the leaves in the moonlight.
 
I have no idea what TV reviews you're reading, and it seems you haven't read a single one of Baddass' reviews (or PCM's either), as they are both far more in-depth and detailed than I've seen any TV review get. Which stands to reason, given how much more there is to cover with a monitor with respect to its various functions and possible use cases.

I've been following TFT Central for a long time and I've had calibrated TVs since around 2008. I enjoy my A/V and greatly respect those in the industry who go out of their way to review these sets (TVs and monitors) in great detail so we can argue about it on the internet.

With that said, there is a balance between what you measure and what your eyes see. I'll give you a recent example without going into too much detail. A certain manufacturer loves to talk up their brightness and their ability to lower their blacks. On measurements following established standards, they're absolutely right. No dispute whatsoever.

Then something happens when you put content on it and set it next to reference monitors and other competing panels. Those blacks that measure well are actually crushing detail, and the nits you measured were tuned to detect when a test pattern is being shown in order to maximise the output. Then in any low-APL scenes, the brightness is turned way down to keep the FALD algorithm working, which means the picture loses all its punch. In this instance, what reference point should we use: our eyes or the straight numbers?

One of the issues I have with RTINGS reviews is their "numbers only" approach. I can't recall RTINGS surfacing any issue they discovered themselves from viewing content, because as long as the numbers give an output, they use it to render a rating. Actual content should always be used to validate the numbers.
 
I've been following TFT Central for a long time and I've had calibrated TVs since around 2008. I enjoy my A/V and greatly respect those in the industry who go out of their way to review these sets (TVs and monitors) in great detail so we can argue about it on the internet.

With that said, there is a balance between what you measure and what your eyes see. I'll give you a recent example without going into too much detail. A certain manufacturer loves to talk up their brightness and their ability to lower their blacks. On measurements following established standards, they're absolutely right. No dispute whatsoever.

Then something happens when you put content on it and set it next to reference monitors and other competing panels. Those blacks that measure well are actually crushing detail, and the nits you measured were tuned to detect when a test pattern is being shown in order to maximise the output. Then in any low-APL scenes, the brightness is turned way down to keep the FALD algorithm working, which means the picture loses all its punch. In this instance, what reference point should we use: our eyes or the straight numbers?

One of the issues I have with RTINGS reviews is their "numbers only" approach. I can't recall RTINGS surfacing any issue they discovered themselves from viewing content, because as long as the numbers give an output, they use it to render a rating. Actual content should always be used to validate the numbers.

This is what I was talking about when I was, to some extent, questioning the usefulness of display reviews to the end user: they are all just numbers taken with devices, and they feel like the reviewer never actually used the screen in practice, which in the end results in the omission of some essential characteristics of the display for actual practical usage. That's the opposite of amateur reviews, which are much more focused on practical usage but lack the numbers almost entirely, which isn't good either, because the numbers are also very important. It is extremely hard to find a review that is extremely detailed with numbers, like TFT Central reviews for example, but at the same time contains a similarly detailed description of practical usage, ideally also based on at least three retail units with varying manufacturing dates.
 
TFT Central helps as a point of reference for the numbers. Decisions to buy would either be made on recommendation or, if lucky enough, by first seeing the product on display.
As everyone knows, all monitors are as different as humans, none the same, so figures do play a part in any purchase; thus, IMO, TFT Central serves as a good place of reference.
I have 9 monitors in my office, all the same make, and none are the same, and I never expected them to be.
Anyway, I for one saw this at a show in South Korea and gamed on it for 20 mins, so I will be purchasing one when it is available from a good, reliable outlet.

Excuse my English, Google Translate is slow :)
 
TFT Central helps as a point of reference for the numbers. Decisions to buy would either be made on recommendation or, if lucky enough, by first seeing the product on display.
As everyone knows, all monitors are as different as humans, none the same, so figures do play a part in any purchase; thus, IMO, TFT Central serves as a good place of reference.
I have 9 monitors in my office, all the same make, and none are the same, and I never expected them to be.
Anyway, I for one saw this at a show in South Korea and gamed on it for 20 mins, so I will be purchasing one when it is available from a good, reliable outlet.

Excuse my English, Google Translate is slow :)

If they are high-end, they should be very close to each other post-calibration. Part of what you're paying a high-end premium for is reducing the range of variance. If not, there need to be tighter tolerances from the manufacturer during production.

I think it just comes down to PC users needing to have better expectations of manufacturers, holding them to a higher standard and not just settling because "option B is just as bad!"

I'd rather pay a premium for better QC and IQ than have that time and budget spent on an RGB-lit RoG circle on the back of a monitor.
 
One of the issues I have with RTINGS reviews is their "numbers only" approach. I can't recall RTINGS surfacing any issue they discovered themselves from viewing content, because as long as the numbers give an output, they use it to render a rating. Actual content should always be used to validate the numbers.

It's always a bit of a balancing act between providing quantitative or subjective content for reviews. Personally, the foundation of my reviews at TFTCentral is largely quantitative measurements: things I can test and compare easily, measurements I can take with equipment and software that the average user does not have access to. There are always plenty of first-hand experiences from users on forums you can read, and more subjective analysis from other review sites and particularly on YouTube. I think there are plenty of those kinds of reviews out there and plenty of people who will share their own experience, so my focus has always been on providing the other stuff that isn't readily available: actual measurements for gamma, white point, contrast, luminance etc. using spectrophotometers/colorimeters, precise measurements of PWM/backlight regulation, precise measurements of response times, blur reduction strobing, input lag etc. I am against providing a review with just numbers, as in the RTINGS example you mention, even though that would make my life a lot easier! :) I do try to provide some subjective observations for all the measurements and some commentary to accompany and explain what the measurements mean, and what they will mean to most users.

It's tricky to get more subjective, as a lot of the time that is going to vary depending on the user: their expectations, previous screen experience, susceptibility to different things, eyesight, lighting/room conditions etc. I also only have a certain amount of time to write these reviews, so there's not always time to go more in-depth, sadly. I'd probably get much less new content written if I were to do that. At the end of the day, some readers might value the additional content, some might not need it. I like to think of it as a resource for accurate quantitative measurements and analysis, to supplement what you might find online from subjective reviews, user feedback and your own personal testing and experience if you can see a screen in person.
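For anyone wondering what those figures actually boil down to, here's a rough sketch of the arithmetic behind two of them (contrast ratio and gamma) using made-up luminance readings; this isn't my actual test process or software, just an illustration of what the numbers mean:

```python
# Illustrative arithmetic only, with made-up colorimeter readings; this is not
# TFTCentral's actual test process or software.
import math

# Hypothetical readings: grey-scale signal level (0-1) -> luminance in cd/m2
readings = {0.0: 0.12, 0.25: 12.5, 0.50: 56.0, 0.75: 140.0, 1.0: 260.0}

black = readings[0.0]
white = readings[1.0]

# Static contrast ratio: full white luminance over full black luminance
print(f"contrast ratio ~ {white / black:.0f}:1")

# Gamma estimate per grey step: remove the black floor, normalise to white,
# then gamma = log(relative luminance) / log(signal level)
for level in (0.25, 0.50, 0.75):
    rel = (readings[level] - black) / (white - black)
    gamma = math.log(rel) / math.log(level)
    print(f"signal {level:.2f}: gamma ~ {gamma:.2f}")  # ~2.2 with these numbers
```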

also based on at least three retail units with varying manufacturing dates.

That's just not realistic to expect, I'm afraid, and if you think about it, or were in the position of someone trying to review these things, you'd see why. Firstly, you have no idea how much space in my office is used up most of the time with displays, boxes, cabling etc. as it is! :) I'm sure PCM probably has the same challenge. I've got enough monitor stuff lying around as it is, thanks!

Secondly, trying to purchase, examine and use a screen for 2 - 3 weeks at a time, typically, and then return multiple screens at my own expense and hassle is not viable. It would cost me around £4000 out of my own pocket to buy 3x LG 34GK950Gs, for instance. That would also likely mean returns to multiple different stores if you wanted an even split of samples from different stock/manufacturing lines. There are then almost certainly challenges around them accepting returns in this way, if not the first time then certainly by the second or third time you do it to an online shop and they cotton on to what you're doing. They're not there to loan you a screen for a few weeks, then take it back and give you a full refund when it's in a state not quite "as new", given it's been unboxed and used (and often dismantled).

It would also add significantly to the testing and review time, as I just don't have the time or inclination to test the same screen over and over again with different samples to see if there's any variation in the numbers. Those tests all take time to do, record and then write up for the review, so multiplying that by 3, even if it's only for certain parts of a review, is just too much. Finally, if that were the normal process, it would mean I could only review a screen once it reached the distribution channel and was available to buy, and that means I would no longer be able to give keen readers a first look at a new or forthcoming screen, regularly before it's even available to buy anywhere - as with the 34GK950 models here.
 
I'm sure PCM probably has the same challenge. I've got enough monitor stuff lying around as it is, thanks!

Absolutely. I've got a guest bedroom pretty much dedicated to monitor boxes at the moment. Just lucky I don't have any guests right now! ;)
 
Just received the 38UC99. It is even slightly better than the UC98, as it has virtually no bleed, while viewing angles and IPS glow are very similar. Even when I test some really dark scenes, the glow is never distracting, because it is just a very small amount of dark greyish-silverish tint a little bit in the corner, instead of the massive yellow glow on the 950G and the Alienware that was literally covering the picture so you could not see what was beneath it. For a screen this huge, this is extremely impressive bleed/glow performance, the best I have seen. And it is bigger than the UC98 while having less curve, so viewing angles to the corners are worse; on a 34" 1900R panel it would be even better. As for issues, I can see some small DSE, and the curve is gentler than on my UC98, which combined with the bigger size makes a difference and will take some time to get used to. The size too; the difference vs 34" is bigger than I expected.

So I am officially out of this monitor lottery nonsense
 