HDR is underrated

I have a 4K VA TV that always shows a noticeable improvement with HDR on. On my IPS ultrawide, HDR is somehow absolutely revolutionary in games.

I recall someone making a point about OLED: its infinite contrast ratio is redundant when the dimmest pixel on screen is 1 nit rather than absolute pitch black. Between 1 nit and maximum brightness, OLED's contrast is suddenly average at best.

Sounds like OLED can't yet replicate the spectacular, almost wince-inducing mega-brightness HDR that IPS can. My VA TV also can't quite do mega-brightness, but on the other hand its blacks are basically OLED-like to me, and it's still spectacularly colourful, like my IPS with HDR on.

Maybe I'm just lucky to have a top-drawer VA TV and IPS monitor respectively. Either way, I'd say an OLED is definitely not a prerequisite for game-changing HDR.

I have two very expensive monitors sat next to each other on my desk. One is OLED and the other is IPS, and the difference is night and day. HDR is stunning on the OLED and, by comparison, dismal on the IPS. The dynamic range of the OLED is so much higher.
I can believe that VA is very good, though.
 
Last edited:
RTX HDR is just another killer feature on NVIDIA for games which don't support it :cool:

Well done comrade, +1 NVIDIA token has been added to your account.
I'm sure some melt would argue it's not "proper HDR", which has always struck me as funny considering there isn't actually a standard for it.

BTW, I use it for everything as I CBA to toggle the filter game by game :D Life's too short to be manually tweaking values for nugatory gains. I think I read there's an FPS hit, but that hasn't been a problem in any game I've played.
 
RTX HDR is just another killer feature on nvidia for games which don't support it :cool:
I tried AutoHDR on Win11, but the NVIDIA RTX HDR filter has more adjustability so tends to work better.

My first experience of HDR that actually worked was on a Dell 32" VA display with the fake 400-nit mode enabled. I only managed to get it working well in Horizon Zero Dawn, and in doing so messed up the desktop and other games, so it was very game-specific. I took note though, and from then on was looking for a decent setup.

When the Alienware 34" OLED monitor came out with the True Black 400 mode (it also has a less accurate 1000 mode), that was the answer for me, and now I look for HDR games, with ray tracing a close second.

Currently playing Enshrouded, which is helped by RTX HDR as there are no brightness or contrast adjustments in-game. I have the nights set to pitch black so point light sources really pop. That, with incredibly rich colours across the environment, makes it very painful to play on a non-OLED/HDR setup.
 
I've never figured out how to get HDR looking right on my PC games. It always ends up looking grey, the opposite of dynamic.
Same here: 9 times out of 10, HDR makes the game look washed out and worse. Don't get me wrong, when HDR works correctly it's amazing, but getting it to work is the problem, with the opposite of dynamic often being the case. Often it's not a problem with the screens; it's a problem with how Windows implements HDR and fails to correctly switch over to the HDR profile for games, yet it will work for YouTube videos.
 
HDR is underrated because, unlike in the world of AV, trying to use it on PC for gaming is (or at least was, last time I mustered up the energy) a complete pain in the arse to make work correctly, and half the time it seems to completely break stuff, making things look bizarrely washed out or completely blown out etc.

It would have a far bigger and more positive profile if it simply worked and gave people consistently better results, with no faff and no messing around.
 
RTINGS have a page on HDR gaming monitors, and their tests are usually very in-depth. They try to keep it up to date and complete, but with the number of panels out there and how quickly new ones arrive, there are sometimes gaps. It should give you a good idea of which technologies and brands to look for. OLED looks like a good bet of course, and they have a 4K ASUS and a 1440p MSI pick. Probably between those, depending on what your GPU can drive (I'm still at 1080p!).
 
Same here: 9 times out of 10, HDR makes the game look washed out and worse. Don't get me wrong, when HDR works correctly it's amazing, but getting it to work is the problem, with the opposite of dynamic often being the case. Often it's not a problem with the screens; it's a problem with how Windows implements HDR and fails to correctly switch over to the HDR profile for games, yet it will work for YouTube videos.
Yes, it's annoying how inconsistently it can be implemented.

I've had to use a combination of shaders and filters such as RTX HDR to fix the game defaults. The worst games are those with minimal brightness/contrast/gamma adjustments.

Cyberpunk was another game where RT worked well, but enabling HDR always lifted the blacks for me.

My current Alienware 32" 4K OLED monitor has Dolby Vision, but Dell had to add a toggle menu option via a firmware patch because Windows doesn't work well with it.
 
I use it in games that natively support it, toggling with the keyboard shortcut before launching a game.
 
I think there's also a misunderstanding of what it's supposed to do. The number of times I've seen people comment that "the colours really pop" when you can see they have a combination of settings that has grossly saturated the image and crushed the blacks etc. But the point is, some people think that looks good, and that's fine. I also read on just about every game that "HDR is broken in this game", and it's always met with scepticism as it depends on settings, hardware etc. There are just as many comments saying HDR made their game look 'washed out'.

That said, I think there are obvious cases (can't recall one right now) where HDR has made games look worse for whatever reason; I think AC Valhalla may have been one such case. When I had a PS5, I went as far as watching a video by Vincent (HDTVTEST) on what settings I should be using (HGiG etc., calibrating in the PS5 menu), and I think that was a happy medium, as I wasn't prepared to put in the time and effort to learn how to calibrate the TV itself, nor would I pay someone to do it.

I doubt we will ever get a standard as such (outside of minimum luminance), which is a shame, as it can absolutely change the atmosphere of a game.
 
Is it underrated? Personally I think HDR is superb. If it is underrated by PC users, I suspect it may be because for the longest time it was broken on PC; my PS4 Pro blew it away in that regard, and it made non-HDR content look washed out and rubbish, and many wouldn't be bothered to configure it individually every time. However, it does seem much better these days.

I always turn it on if it's an option, and I also have my PC set to auto-enhance games where possible.

I always thought Destiny 2 made good use of HDR.
 
I honestly think it's more a case of very few people having the equipment to really appreciate it.

The vast majority don't own an OLED, or even an LED-backlit display with FALD and decent peak brightness; most are on cheaper TVs and monitors that at best claim to support HDR but in reality don't.

Makes me think of "HD Ready" TVs back in the day that would only do 720p rather than 1080p.
You may be onto something there. I've owned HDR-capable monitors for several years and always thought nothing of HDR when I turned it on: very underwhelming, couldn't see what the fuss was about. Recently I bought a really good OLED monitor, and HDR since then has been amazing in the games I've tried it on. Having said that, just the move to OLED made a big difference to the look of my games (with or without HDR).
 
I agree it's a bit of a mixed bag even with OLED. But for games that support it well, such as God of War, it's an absolute must for me. Plenty don't support HDR and look phenomenal in OLED SDR anyway, especially pixel-art games like Animal Well.
 