Does anyone know if the 950F is going to be HDR 400 or 600?
HDR 400.
DisplayHDR 400 only. The panel maxes out at 550 nits, while 600 nits is required for DisplayHDR 600. Really dumb. It's an 8-bit + 2-bit FRC panel, which is enough for HDR 600. I just don't understand why they didn't add 50 more nits, since the 5K ultrawide has that.
I don't think it's 8-bit + FRC.
Given the 98% DCI-P3 coverage, do you believe it won't be 8-bit + FRC?
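For anyone unsure what 8-bit + 2-bit FRC actually means: the panel only has 256 native levels per channel and alternates between the two nearest levels across frames so the time-average lands on the in-between 10-bit step. A minimal illustrative sketch (my own toy example, not LG's actual dithering algorithm):

```python
# Toy illustration of 2-bit FRC (temporal dithering): an 8-bit panel
# approximates a 10-bit level by cycling between adjacent 8-bit levels
# over a 4-frame sequence. Not LG's actual algorithm.

def frc_frames(level_10bit):
    """Return a 4-frame sequence of 8-bit values approximating a 10-bit level."""
    base, frac = divmod(level_10bit, 4)  # 10-bit = 8-bit base + 2-bit fraction
    return [min(base + (1 if i < frac else 0), 255) for i in range(4)]

frames = frc_frames(513)                    # 10-bit level 513 = 128.25 in 8-bit terms
print(frames, "average:", sum(frames) / 4)  # [129, 128, 128, 128] average: 128.25
```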
https://gzhls.at/blob/ldb/1/0/9/9/24e296efdcac616e642efb237023c93a1723.pdf
The above PDF is the spec sheet for the monitor (34GK950F). I know it's in German, but it seems to be the only official spec list we have.
Of course, the spec sheet states DP1.2. I assume that's wrong for 144Hz, and Daniel said DP1.4, yes?
PS: The G version should be DisplayHDR 400 as well, because it has the same specifications required for HDR 400 as the F version.
*shakes head* No!
For HDR, a monitor must be able to process the HDR10 protocol. The DP1.2 G-SYNC module, as used by the G version, doesn't understand the HDR10 protocol, so it will NOT do HDR of any kind.
HDR10 is defined as part of the DP1.4 spec, so the monitor must use nVidia's newer DP1.4 G-SYNC HDR module if it wants to support HDR.
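To make the "understand the HDR10 protocol" point concrete: HDR10 means PQ-encoded video plus static metadata (mastering-display values, MaxCLL, MaxFALL) that the sink has to accept and parse. Here's a rough sketch of the fields involved; the structure and field names are illustrative, not a real driver API or the G-SYNC module's internals:

```python
# Illustrative container for HDR10 static metadata: mastering-display values
# plus MaxCLL/MaxFALL. A scaler or G-SYNC module that "does HDR" has to accept
# and act on this kind of metadata; the older DP1.2 module has no path for it.
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    red_primary: tuple          # (x, y) chromaticity of the mastering display
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    max_mastering_nits: float   # mastering display peak luminance
    min_mastering_nits: float   # mastering display black level
    max_cll: float              # Maximum Content Light Level
    max_fall: float             # Maximum Frame-Average Light Level

# Example values roughly in the shape of a DCI-P3-mastered HDR10 stream
meta = HDR10StaticMetadata(
    red_primary=(0.680, 0.320), green_primary=(0.265, 0.690),
    blue_primary=(0.150, 0.060), white_point=(0.3127, 0.3290),
    max_mastering_nits=1000.0, min_mastering_nits=0.005,
    max_cll=1000.0, max_fall=400.0,
)
print(meta.max_cll, meta.max_fall)
```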
An IPS panel always leaks light, which is often called IPS glow. That an IPS panel can get to 550 nits without insane IPS glow and/or BLB is already really good (assuming contrast is still above 1000:1).
Maybe IPS panels will improve contrast in the future, but for now anything much above 550 nits will require a VA panel (or a willingness to live with really crappy contrast).
In a nutshell, it's a tradeoff between max. brightness and contrast, and LG had to balance the two without resorting to FALD, which gives you the best of both but at a much higher cost.
Now you understand
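To put numbers on that contrast caveat: at a given peak white, the native contrast ratio directly sets the black level (and visible glow) you get. A quick back-of-the-envelope calc, assuming a roughly 1000:1 IPS panel:

```python
# Black level implied by peak brightness and native contrast ratio.
# Pushing peak white higher on the same ~1000:1 panel pushes the
# black level up with it.
def black_level(peak_nits, contrast_ratio):
    return peak_nits / contrast_ratio

for peak in (400, 550, 750):
    print(f"{peak} nits at 1000:1 -> black level ~{black_level(peak, 1000):.2f} nits")
```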
The LG 34WK95U is DisplayHDR 600 with an IPS panel (LM340RW1) and a peak brightness of 750 cd/m2. We are even talking same-gen panels here. So there's no excuse.
I just added some more info to my last post before seeing yours. See above. There is an excuse, unfortunately. Better panel technology may give us better options in the future, but the sad fact is that TFT tech is rather crappy overall.
DisplayHDR 400 specs... Significant step up from SDR baseline:
- True 8-bit image quality – on par with top 15% of PC displays today
- Global dimming – improves dynamic contrast ratio
- Peak luminance of 400 cd/m2 – up to 50% higher than typical SDR
- Minimum requirements for color gamut and contrast exceed SDR
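Applied to the monitors being discussed, those bullet points boil down to a simple threshold check. A toy sketch using only the criteria listed above; the real VESA certification has more tests (black level, gamut coverage, rise time, etc.) than this:

```python
# Toy check against the DisplayHDR 400 bullet points listed above only:
# peak luminance >= 400 cd/m2, true 8-bit panel, global dimming.
def meets_displayhdr_400(peak_nits, true_8bit, global_dimming):
    return peak_nits >= 400 and true_8bit and global_dimming

# 34GK950F as discussed in this thread: ~550 nits peak, 8-bit + FRC panel
print(meets_displayhdr_400(peak_nits=550, true_8bit=True, global_dimming=True))  # True
```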
Did he [Daniel - LG] say that [the 34GK950F will reach 144Hz]?
Daniel-LG did say that. It also states 144Hz on the spec sheet that @Panos linked to.
Of course, you're right that the 34GK950F can't provide 3440x1440@144Hz using DP1.2 (see post #163 for the math), so something on the spec sheet is obviously wrong. However, it's clear that the 34GK950F will use DP1.4. This is why:
- Daniel-LG previously mentioned the 34GK950F will use DP1.4
- The spec sheet also mentions DisplayHDR 400 and FreeSync 2, both of which are based on DP1.4. You can't have either using DP1.3, so the monitor must support DP1.4
- Every scaler on the market that supports FreeSync 2 is based on DP1.4. A monitor OEM literally can't build a monitor that supports FreeSync 2 without supporting DP1.4
This is not controversial. It's only the 34GK950G that is confusing, but at least these two bits of information are set in stone, where the second must follow from the first:
- The 34GK950G will use nVidia's DP1.2 G-SYNC module
- Due to the lack of bandwidth of DP1.2, the 34GK950G will not go beyond 120Hz at 3440x1440 (see the rough math sketched below)
The only open question is why, for the 34GK950G, LG states: "100 Hz native overclocked to 120Hz". That is the only thing that is contradictory. Unfortunately, for this statement, we lack enough certain information to infer what is actually true.
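In case it helps, here is a rough back-of-the-envelope version of the bandwidth math referenced above (post #163). The blanking values and 8b/10b efficiency are assumptions of this sketch, not LG's exact timings:

```python
# Rough DisplayPort bandwidth check for 3440x1440 at 120Hz vs 144Hz.
# Assumes reduced-blanking-style timings (~80 px horizontal, ~60 lines
# vertical blanking) and 8b/10b link encoding (80% efficiency).

def required_gbps(h, v, hz, bpp=24, h_blank=80, v_blank=60):
    """Approximate video data rate in Gbit/s for a given mode."""
    pixel_clock = (h + h_blank) * (v + v_blank) * hz  # pixels per second
    return pixel_clock * bpp / 1e9

def dp_payload_gbps(lane_rate_gbps, lanes=4):
    """Usable payload after 8b/10b encoding."""
    return lane_rate_gbps * lanes * 0.8

dp12 = dp_payload_gbps(5.4)  # HBR2: 4 x 5.4 Gbit/s -> ~17.28 Gbit/s usable
dp14 = dp_payload_gbps(8.1)  # HBR3: 4 x 8.1 Gbit/s -> ~25.92 Gbit/s usable

for hz in (120, 144):
    need = required_gbps(3440, 1440, hz)
    print(f"3440x1440@{hz}Hz needs ~{need:.1f} Gbit/s "
          f"(DP1.2 payload: {dp12:.2f}, DP1.4 payload: {dp14:.2f})")
```

With these assumptions, 120Hz at 8-bit fits inside DP1.2's payload, while 144Hz does not, which lines up with the two points above.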
Sorry for the out-of-context question, but I really need an answer.
https://picload.org/view/dlccwdpw/c8e6d3b8-0cd5-4bfa-a4f5-19ce02.jpg.html
LG will be at IFA Berlin in 4 days.