HDR is underrated

HDR is a mixed bag for me; when done well it's a nice addition, but sadly many games have an awful or broken implementation. The industry could do with a standardised approach to HDR.
 
Stalker 2 isn't massively better lol. It's a different look that you either like or don't, given there is no raised black floor in the SDR presentation, and it's reflective of true Stalker from the OG games (with a corrected black floor).

It took a long time faffing with the settings, but done correctly Stalker 2 is massively better in HDR

Current-gen OLED monitors simply don't get bright enough to provide enough luminance difference in higher-APL scenes, hence only being certified for True Black 400 (TB400), which is effectively SDR+.

You realize the current-gen OLEDs are HDR1000, right?
They have TB400 certification because some people prefer it (not me) and argue that it's more accurate, but all the latest QD-OLED monitors from the last few years do 1000 nits perfectly fine.
 
You realize the current-gen OLEDs are HDR1000, right?
They are not HDR1000 certified. Peak 1000 is just a mode name, not anything defined by the HDR certification bodies! On Gen 1 QD-OLEDs, Peak 1000 was fine because the APL balancing was done well; it regressed slightly on Gen 3, and which mode you use comes down strictly to the type of content you consume and therefore prefer. The gist, as noted in the OLED community comments on Reddit:

If you play dark games with a low APL, HDR1000 mode will look and measure better.

In high-APL content, HDR400 will come out ahead due to more favourable ABL behaviour; this is just data. Swap if you care enough.

Average Picture Level (APL) is basically the percentage of the screen that is bright: 0% APL being full black and 100% APL being full white.
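Taking that definition literally, APL is easy to sketch in a few lines of Python. This is just an illustration of the idea, not any formal standard; the function name and the assumption of linear luminance values in [0, 1] are mine:

```python
def average_picture_level(pixels):
    """Average Picture Level as a percentage of full white.

    pixels: iterable of linear luminance values in [0.0, 1.0],
    one per screen pixel. 0% APL = full black, 100% = full white.
    """
    pixels = list(pixels)
    return 100.0 * sum(pixels) / len(pixels)

# The two extremes from the definition above:
print(average_picture_level([0.0] * 100))  # 0.0  (full black)
print(average_picture_level([1.0] * 100))  # 100.0 (full white)

# A dark scene with a small specular highlight has a low APL --
# exactly the kind of frame where Peak 1000 mode can actually
# reach its peak brightness before ABL intervenes.
print(average_picture_level([1.0] * 2 + [0.05] * 98))  # ~6.9
```

This also shows why the two modes diverge: a mostly dark frame sits at single-digit APL, while a bright daytime scene pushes APL high enough that ABL dominates.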

The new panels coming out from later this year onwards should be certified for 1000 nits+, though.
 
They are not HDR1000 certified. Peak 1000 is just a mode name, not anything defined by the HDR certification bodies! On Gen 1 QD-OLEDs, Peak 1000 was fine because the APL balancing was done well; it regressed slightly on Gen 3, and which mode you use comes down strictly to the type of content you consume and therefore prefer. The gist, as noted in the OLED community comments on Reddit:

The new panels coming out from later this year onwards should be certified for 1000 nits+, though.

Certification or not, they're not limited to 400 nits as Amatsubu was insinuating, and they do achieve 1000 nits.
 
Yeah, but with a compromise as per the above. The APL vs ABL differential can often be too jarring to make Peak 1000 mode acceptable. Again, this all comes down to dynamic range accuracy, and personally I find TB400 the most accurate: even in 1000 mode you're never going to get 1000 nits on anything larger than a few per cent of the total screen size, so you're actually better off leveraging the more consistent ABL behaviour across APLs that TB400 gives you.
 
Yeah, but with a compromise as per the above. The APL vs ABL differential can often be too jarring to make Peak 1000 mode acceptable. Again, this all comes down to dynamic range accuracy, and personally I find TB400 the most accurate: even in 1000 mode you're never going to get 1000 nits on anything larger than a few per cent of the total screen size, so you're actually better off leveraging the more consistent ABL behaviour across APLs that TB400 gives you.

I find specular highlights more important than APL.
APL can be adjusted easily enough when fine-tuning HDR settings. It won't be 100% accurate because eyesight is not a calibration machine, but I can get the overall picture to how I like it and how I feel it should be.

I get that everybody is different and has different preferences; it would be boring if we were all the same.
When I tried TB400, it only felt slightly better than SDR and didn't wow me at all.
 
It took a long time faffing with the settings, but done correctly Stalker 2 is massively better in HDR



You realize the current-gen OLEDs are HDR1000, right?
They have TB400 certification because some people prefer it (not me) and argue that it's more accurate, but all the latest QD-OLED monitors from the last few years do 1000 nits perfectly fine.

They don't do 1000 nits "perfectly fine", because their EOTF curves are borked and in most cases you're getting much dimmer, less accurate scenes than you should: ABL kicks in before you ever land in the ideal situation, and the screen maybe reaches 1000 nits for a second on one specular highlight in a 2% window, if even. You barely get to see 1000 nits on these.

If it was so perfect, it wouldn't have been brought to attention by so many people and YT channels.

And yes, I do realise, because I wrote myself that I sometimes use Peak 1000 :P

The thing that you don't realise, however, is that Peak 1000 is NOT HDR1000. It's a gimmick mode and a compromise you're willing to make depending on content. Mrk is absolutely correct about this.

No OLED monitor on the market is HDR1000 certified, because they can't do it: IIRC it requires 1000 nits in a peak 10% window and 600 nits sustained full-screen. Yep, that's HDR1000, and that's how far Peak 1000 is from it.
It's like comparing TB400 with HDR400.
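To make the gap concrete, here's a rough sketch of that certification logic in Python. The nit thresholds are the from-memory figures quoted above (IIRC-level accuracy, not an official spec), and the function name and example numbers are mine:

```python
def meets_hdr1000_style_spec(peak_10pct_window_nits, sustained_full_screen_nits):
    """Rough check against the HDR1000-style criteria described above:
    ~1000 nits in a 10% window AND ~600 nits sustained full-screen.
    Thresholds are as quoted from memory in the post, so illustrative only.
    """
    return peak_10pct_window_nits >= 1000 and sustained_full_screen_nits >= 600

# A hypothetical QD-OLED in Peak 1000 mode: brief 1000-nit specular
# highlights, but ABL pulls sustained full-screen brightness way down.
print(meets_hdr1000_style_spec(1000, 250))  # False

# What the certification would actually demand:
print(meets_hdr1000_style_spec(1000, 600))  # True
```

The point being that hitting 1000 nits in a tiny window satisfies only half the requirement; the sustained full-screen figure is where current OLED monitors fall short.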

That's why they don't have the certification, not because of "preference". I never said they're limited to 400 nits; I said they're certified for TB400 because that is the only thing they can RELIABLY and accurately do.

And I also mentioned TB400 is basically SDR+, so we agree, but unfortunately Peak 1000 is far from perfect itself and in many cases worse.
You just get brighter specular highlights when certain conditions are met, at the cost of overall brightness and accuracy. I don't believe I've said anything false.
 