DisplayHDR 400 only. The panel maxes out at 550 nits, where 600 is needed. Really dumb. It's 8+2 bit FRC, which is enough for HDR600. I just don't understand why they didn't add 50 more nits since the 5K ultrawide has that.
Because HDR600 requires at least some kind of local dimming, while HDR400 does not. They would have to implement different backlighting.
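The "8+2 bit FRC" mentioned above is temporal dithering: the panel only has 8-bit drivers, so it alternates between the two nearest 8-bit codes fast enough that the eye averages them into an in-between level, approximating 10-bit output. A minimal Python sketch of the averaging idea follows; the frc_dither helper and the fixed 4-frame pattern are illustrative assumptions, not how any actual panel controller sequences its dithering.

```python
import numpy as np

def frc_dither(levels_10bit: np.ndarray, n_frames: int = 4) -> np.ndarray:
    """Approximate 10-bit target levels on an 8-bit panel by temporal dithering.

    Each 10-bit code (0..1023) maps to a fractional 8-bit value; the panel
    alternates between the two nearest 8-bit codes across `n_frames` so that
    the time-averaged output approximates the 10-bit target.
    """
    target = levels_10bit / 4.0              # ideal (fractional) 8-bit value
    base = np.floor(target).astype(np.uint16)
    frac = target - base                     # how often the higher code is needed
    frames = []
    for k in range(n_frames):
        # show base+1 in round(frac * n_frames) of the frames, base in the rest
        bump = (k < np.round(frac * n_frames)).astype(np.uint16)
        frames.append(np.clip(base + bump, 0, 255).astype(np.uint8))
    return np.stack(frames)                  # shape: (n_frames, *levels_10bit.shape)

# Example: 10-bit code 514 has no exact 8-bit equivalent (514 / 4 = 128.5),
# so the panel shows 128 and 129 on alternating frames; the average is 128.5.
frames = frc_dither(np.array([514]), n_frames=4)
print(frames.ravel(), frames.mean())         # -> [129 129 128 128] 128.5
```

Running this for 10-bit code 514 (which has no exact 8-bit equivalent) shows the panel cycling through 129, 129, 128, 128, averaging out to the intended 128.5.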
There is no step up; these are baseline specs for any display worth considering, especially at the prices the HDR-certified displays coming out are asking. True 8-bit is already the minimum and only budget displays lack it (and HDR-certified displays won't reach that segment for many years). Global dimming is nonsense and actually degrades the picture. 400 nits of peak brightness is not really useful over today's typical 300-350 nits unless you have pixel-level control or very precise local dimming, which you don't for any VESA certification; even HDR1000 has been awarded to displays with something like 32 edge-lit zones, which calls the credibility of all these certifications into serious question. The contrast requirement is below 1000:1 (955:1, from what I remember), so that is a very basic requirement too, and still a poor contrast.

HDR400 means nothing. It only means your display can accept an HDR signal and map it back down to SDR, because that is effectively what happens when HDR content is shown on a display with no real HDR capability, like an entry-level TV. Compare that with the marketing blurb:

Significant step up from SDR baseline:
- True 8-bit image quality – on par with top 15% of PC displays today
- Global dimming – improves dynamic contrast ratio
- Peak luminance of 400 cd/m² – up to 50% higher than typical SDR
- Minimum requirements for color gamut and contrast exceed SDR
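To put a number on the "scale it back to SDR" point above, the sketch below decodes a PQ-encoded HDR signal into absolute luminance and then rolls everything off to a 400 cd/m² panel peak. The sample code values and the rolloff_to_peak curve are made-up assumptions for illustration, not any particular monitor's actual tone mapping; the PQ constants are the standard SMPTE ST 2084 ones.

```python
import numpy as np

# SMPTE ST 2084 (PQ) EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: np.ndarray) -> np.ndarray:
    """Decode a normalized PQ signal (0..1) to absolute luminance in cd/m²."""
    p = np.power(code, 1 / M2)
    return 10000 * np.power(np.maximum(p - C1, 0) / (C2 - C3 * p), 1 / M1)

def rolloff_to_peak(nits: np.ndarray, peak: float) -> np.ndarray:
    """Crude highlight roll-off: linear below half the panel peak, then a
    smooth compression of everything above it into the remaining headroom."""
    knee = peak / 2
    over = np.maximum(nits - knee, 0)
    return np.minimum(nits, knee) + (peak - knee) * (over / (over + knee))

codes = np.array([0.50, 0.58, 0.68, 0.75, 0.90])   # arbitrary PQ sample values
source = pq_to_nits(codes)                          # mastered luminance
on_400 = rolloff_to_peak(source, peak=400.0)        # what a ~400-nit panel can show
for s, d in zip(source, on_400):
    print(f"mastered {s:7.0f} nits -> displayed {d:5.0f} nits")
```

With these inputs, a highlight mastered around 4,000 nits comes out near 390 nits on the 400-nit panel, barely above what a bright SDR monitor already delivers, which is the sense in which an HDR400 display mostly compresses the signal back into SDR-like territory.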