HDR is underrated

I came to this conclusion after playing STALKER 2, which makes particularly good use of HDR (albeit exaggerated somewhat by its auto exposure). Beyond the purely aesthetic effect of making things brighter and more colourful, it can actually have an effect on gameplay.
  1. Assuming a bright enough monitor, it makes lighting part of tactical decisions: for instance, it becomes a disadvantage to attack someone while facing into the sun, or to look at a bright light while creeping through the dark.
  2. Increased bit depth makes it easier to see in the dark by adding more shades of grey instead of everything dark becoming black (see the rough sketch below).
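
To put rough numbers on point 2, a minimal sketch, assuming naive linear quantisation (real pipelines use gamma or PQ curves, which change the exact counts but not the conclusion):

```python
# Rough sketch: how many distinct code values land in the darkest 5% of the
# signal range at 8-bit vs 10-bit. Linear steps assumed for simplicity.
def shadow_levels(bits: int, shadow_fraction: float = 0.05) -> int:
    levels = 2 ** bits                      # total code values available
    return int(levels * shadow_fraction)    # values covering the shadows

for bits in (8, 10):
    print(f"{bits}-bit: {shadow_levels(bits)} distinct shadow levels")
# 8-bit: 12 distinct shadow levels
# 10-bit: 51 distinct shadow levels
```

With only a dozen or so levels to work with, neighbouring dark shades collapse into the same value, which is exactly the "everything dark becomes black" effect.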
Are there any other games where turning on HDR affects gameplay?
 
Almost every game and TV show / movie is enhanced by HDR. Its effect is a lot more pronounced on non-OLED monitors that struggle with contrast. On OLED, the difference can be marginal, until you see something with staggering contrast between dark and light (Ori, for example).
 
Huge HDR aficionado here, and sorry to **** on your chips, but Stalker 2 has a very shoddy HDR implementation, like many games. If you use the default HDR in-game settings it completely crushes blacks and destroys all details in shadows, especially indoors.

With a few tweaks you can get it reasonable, but even then the in-game HDR settings are 'wrong' - for example, peak HDR brightness is actually controlled by the SDR brightness setting, not the setting it should be. I've tested it all with ReShade and Lilium's HDR shaders to monitor the output. I'm a bit of an HDR geek.

I consider a proper HDR setup more important than ray tracing, especially on OLED, but devs are just so lazy about it these days it's mind blowing.
 
I honestly think it's more a case of very few people having the equipment to really appreciate it.

The vast majority don't own an OLED, or even an LED set with FALD and decent peak brightness; most are on cheaper TVs and monitors that at best claim to support HDR but in reality don't.

Makes me think of "HD-Ready" TVs back in the day that would only do 720p rather than 1080p.
 
HD-Ready meant 720p. 1080p sets were marketed as ‘Full HD’.

Part of the issue with HDR is that it's not as simple as pixel count. There are so many different standards and implementations of varying quality that it's actually not trivial to get it working well. Even when you do, switching between different content often means you'll be adjusting settings constantly.
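
As one illustration of how those standards differ under the hood, here's a minimal sketch of the PQ (SMPTE ST 2084) transfer function used by HDR10 and Dolby Vision, as opposed to the relative-brightness approach of HLG. The constants are the published ST 2084 values; the printed figures are approximate:

```python
# Minimal sketch of the PQ (SMPTE ST 2084) EOTF: maps a non-linear code
# value in [0, 1] to absolute luminance in nits. Constants from ST 2084.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

# The curve is nothing like linear: half the code range is ~92 nits,
# not 5,000, because PQ packs most of its precision into the dark end.
print(round(pq_to_nits(0.5)))   # ~92
print(round(pq_to_nits(0.75)))  # ~983
```

A display that tops out at, say, 600 nits has to tone-map everything the curve puts above that, and every vendor does it slightly differently, which is part of why the same content can vary so much from one set to the next.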

I have an OLED TV and it can look stunning sometimes on HLG, other times pants. Most Dolby Vision content is forgettable and occasionally blinding in a dark room. The PlayStation calibration and settings never seem ‘right’ either.

With respect to the forum, I haven’t actually tried HDR for PC games as my monitors don’t support it and since I’m using Linux now I wonder if it would be easy to get working anyway.

In short, I could probably live without it.
 
HD-Ready meant 720p. 1080p sets were marketed as ‘Full HD’.

Aye, I'm aware; now that I think about it, I believe HD-Ready sets also accepted 1080i.

The difference between HD-Ready and Full-HD did confuse a lot of people back in the day, however. I've seen similar with people buying HDR400 monitors only to get no appreciable benefit: technically they're receiving HDR, but they might as well not be in most cases.
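
To put a rough figure on why HDR400 is so underwhelming, a quick sketch, assuming a typical ~300-nit SDR monitor as the baseline (that figure is an assumption, not a measurement):

```python
import math

# Rough sketch: highlight headroom over a typical SDR monitor, measured in
# stops (doublings of luminance). The 300-nit SDR baseline is an assumption.
SDR_NITS = 300

for label, peak_nits in [("DisplayHDR 400", 400), ("DisplayHDR 1000", 1000)]:
    stops = math.log2(peak_nits / SDR_NITS)
    print(f"{label}: {stops:.1f} stops of highlight headroom")
# DisplayHDR 400: 0.4 stops of highlight headroom
# DisplayHDR 1000: 1.7 stops of highlight headroom
```

Under those assumptions an HDR400 panel barely brightens highlights at all, and the DisplayHDR 400 tier doesn't require local dimming to improve blacks either.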
 
Can’t disagree!

Maybe @wunkley can tell us what to look out for in a PC monitor for HDR usage.

Monitors are a bit of a minefield in general; it's much easier to research TVs imo.

That said, pretty much any of the OLEDs should deliver good HDR; it's the LED FALD displays where things become murky, as they seem to be very hit and miss. I'm on a Philips Evnia 34" Ultrawide OLED at the moment and it's spot on tbh. It only cost me a little over £500, which I feel is very reasonable for a good monitor.

Sites like RTings definitely help.
 
Almost every game and tv show / movie is enhanced by HDR.
Indeed, and considering it also has essentially zero performance impact, it's a shame that more games don't implement it.

Huge HDR aficionado here, and sorry to **** on your chips, but Stalker 2 has a very shoddy HDR implementation
That implies that other games are better. Can you name some, and describe how their HDR implementation changes the way you play the game?

it completely crushes blacks and destroys all details in shadows, especially indoors.
That's definitely not what it looked like to me on my monitor. I never had any trouble picking out details in shadows, and sometimes found myself not using a light in the dark, for the same reason you can sometimes see better without one IRL.

I honestly think it's more a case of very few people having the equipment to really appreciate it.
It costs quite a bit to get a decent HDR monitor, so that's definitely a problem. However, I reckon it has more impact per £ than ray tracing, and that gets far more attention.
 
HDR is one of those things that some people get really weird about and tell you why what you have looks terrible or doesn't 'represent the intent of the designers'.
Just stick with what you think looks good and let the 'purists' have a meltdown.
It can get as insufferable as the 'audiophile' lot telling you why what you have actually isn't good and yada yada, obnoxious lot.
 
HDR is one of those things that some people get really weird about and tell you why what you have looks terrible or doesn't 'represent the intent of the designers'.
Just stick with what you think looks good and let the 'purists' have a meltdown.

It's the same with audio tbh.

You get a lot of people claiming X or Y is the best, or that they have magical hearing at age 40-50+ despite the obvious decline due to age; that also tends to be the group of people most inclined, and most able, to spend money on those things.

In any instance where sensory data needs to be appreciated, people tend to discount the fact that we all view the world differently. I'm not saying that as some philosophical thing either, it's simply a fact that some people have better senses than others and that said senses decline with age. Some of us can even see more colours than others, and that tends to be far more common in women than men funnily enough given it's men that tend to be tech obsessed.


But even ignoring that, the way we view things is often very, very different from person to person. I love HDR when done well, but (correct me if I'm wrong) it's more about the vividness and contrast of scenes, which should be noticeable to pretty much anyone.

So I'd argue it's usually more a case of a lack of product knowledge, mixed with dodgy marketing, mixed with limited budgets from people that simply can't afford, or aren't willing, to invest for the sake of a higher quality image. The average UK person that watches the likes of EastEnders and reality TV isn't going to care, and it's not like those shows support the formats anyway. When it comes to gaming that becomes an even bigger issue, as stock settings, if they exist at all, are very often poor.
 
The only reason I upgraded to Windows 11 was for auto-HDR. I have an IPS ultrawide and its peak brightness can really singe my eyes (in a good way).

As far as affecting gameplay, I guess it causes me to stop and take in the view more :cry:

Stalker 2 has its moments when taking in a well-lit vista. Once night falls, I run into IPS-specific / limited-zone backlight issues. Strangely, I never notice these in Dishonored 2, an older game that really benefits from auto-HDR, but I get the feeling games in the good old days were generally brighter (like TV series / movies used to be, while many of the newest are now ridiculously dark for some reason).
 
Huge HDR aficionado here, and sorry to **** on your chips, but Stalker 2 has a very shoddy HDR implementation, like many games. If you use the default HDR in-game settings it completely crushes blacks and destroys all details in shadows, especially indoors.

With a few tweaks you can get it reasonable, but even then the in-game HDR settings are 'wrong' - for example, peak HDR brightness is actually controlled by the SDR brightness setting, not the setting it should be. I've tested it all with ReShade and Lilium's HDR shaders to monitor the output. I'm a bit of an HDR geek.

I consider a proper HDR setup more important than ray tracing, especially on OLED, but devs are just so lazy about it these days it's mind blowing.

Are there any good sites out there that help with the specifics of running HDR? I've found it mostly underwhelming in most games I've used it on.
 
I think gamingtech on YouTube is quite overrated and can be hit and miss. From personal experience, he's made some glaring errors in the past on stuff.
There is one guy I really like on YouTube - PlasmaTVforGaming. His enthusiasm and deep dives into stuff are quite infectious. Some of his older videos go deep into alternatives and fixes for improving HDR in busted HDR games, but if you're not into general modding, ICC colour profiles, using ReShade, SpecialK etc. then it won't be for you.

My very simple advice to start would be:

1. Get an OLED TV/monitor. Newer tech is here or coming, but don't bother with anything less.
2. If a game's built-in HDR doesn't look up to par to your eyes, try using RTX HDR instead if you're on Nvidia! It really is very good now. But even that isn't as easy as it should be (e.g. you must disable Auto HDR, either globally or for the game's profile, in Windows graphics settings), and HDR needs to be set off in the game itself. It also doesn't work with all games (e.g. Vulkan API games for a start, or at least it didn't a while ago; it may do now).

HDR on PC is needlessly complicated unfortunately and still hasn't reached a proper industry standard after all these years. If you're not prepared to put the effort in and learn a bit, HDR is a rabbit hole. If you can be bothered, and I can, I think it's superb.
And people who moan about audiophiles and 'HDRphiles' if you like - they're just moribund. Same sort that moan about good food in good restaurants. Mediocrity is not something to celebrate.
 
HDR on PC is needlessly complicated unfortunately and still hasn't reached a proper industry standard after all these years. If you're not prepared to put the effort in and learn a bit, HDR is a rabbit hole. If you can be bothered, and I can, I think it's superb.
And people who moan about audiophiles and 'HDRphiles' if you like - they're just moribund. Same sort that moan about good food in good restaurants. Mediocrity is not something to celebrate.

I'm still on Windows 10 and it's nightmarish with default settings, but I also use my PC as a media centre. MPC with the right settings is great for video when sending data to my TV, but gaming is so hit and miss it's unreal.

I would very much appreciate your advice on how to buy a good monitor for HDR, as per @alex24's request.

What should people look out and aim for, or is it just a case of going OLED?
 
1. Get an OLED TV/monitor. Newer tech is here or coming, but don't bother with anything less.

I have a 4K VA TV that always shows a noticeable improvement with HDR on. For my IPS ultrawide, HDR is somehow absolutely revolutionary in games.

I recall someone making a point regarding OLED: its infinite contrast ratio becomes redundant when the lowest-brightness pixel on the screen is 1 nit rather than absolute pitch black. Between 1 nit and maximum brightness, OLED's contrast is suddenly average at best.
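
Putting illustrative numbers on that argument (a hypothetical sketch; the peak and floor figures below are assumptions, not measurements of any particular panel):

```python
# Illustrative only: in-scene contrast if the darkest visible pixel sits at
# 1 nit rather than true black. All figures are assumed, not measured.
def contrast_ratio(peak_nits: float, floor_nits: float) -> float:
    return peak_nits / floor_nits

print(f"OLED, 800-nit peak, 1-nit floor:   {contrast_ratio(800, 1):.0f}:1")
print(f"IPS,  600-nit peak, 0.5-nit black: {contrast_ratio(600, 0.5):.0f}:1")
# OLED, 800-nit peak, 1-nit floor:   800:1
# IPS,  600-nit peak, 0.5-nit black: 1200:1
```

The obvious counter is that an OLED pixel can switch off entirely, so the 1-nit floor only applies to scenes that never actually reach black.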

Sounds like OLED can't yet replicate the spectacular, almost wince-inducing mega-brightness HDR that IPS can. My VA TV also can't quite do mega-brightness, but on the other hand its blacks are basically OLED-like to me, and it's still spectacularly colourful, like my IPS with HDR on.

Maybe I'm just lucky, having a top-drawer VA TV and IPS monitor respectively. Either way, I'd say having an OLED is definitely not a prerequisite for game-changing HDR.
 