A Reddit user has worked out how the RTX HDR settings actually work, so if you want an accurate image you need to change the default settings:
* When you enable RTX HDR, the saturation slider is set to 0, which adds extra color saturation. If you want a color-correct image, set the slider to -50; at -50 the colors are accurate.
* The contrast slider defaults to 0, which translates to a gamma of 2.0. Most people use 2.2 gamma, so change the slider to +25. This gamma-curve option is the real reason RTX HDR looks better than Windows AutoHDR: AutoHDR doesn't apply a gamma curve, which is why it raises the blacks.
* The Mid Grey slider controls the exposure of the image, similar to the HDR paper-white setting some games have. You can calculate the correct value with the formula targetPaperWhiteNits * (0.5 ^ targetGamma). So if you usually game with a paper white of 200 nits and a 2.2 gamma, the Mid Grey slider should be 200 * (0.5 ^ 2.2) ≈ 44; if you wanted 200 nits paper white and a 2.0 gamma, the formula gives 200 * (0.5 ^ 2.0) = 50, and so on.
For most people, the roughly correct values to use would be: Mid Grey = 44, Contrast = +25 and Saturation = -50.
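The mid-grey formula above is easy to sanity-check yourself. A minimal sketch in Python (the function name and the example paper-white/gamma values are just illustrative, not anything from the NVIDIA app itself):

```python
def mid_grey_slider(paper_white_nits: float, gamma: float) -> int:
    """RTX HDR Mid Grey slider value for a target paper white and gamma,
    using the formula from the Reddit write-up:
        targetPaperWhiteNits * (0.5 ^ targetGamma)
    """
    return round(paper_white_nits * (0.5 ** gamma))

# 200 nits paper white at 2.2 gamma -> the recommended +44
print(mid_grey_slider(200, 2.2))  # -> 44

# 200 nits paper white at 2.0 gamma
print(mid_grey_slider(200, 2.0))  # -> 50
```

Plug in your own preferred paper white and gamma to get the slider value for your setup.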
I see there is now also an app that adds a gamma curve to AutoHDR, which fixes the raised blacks but can still result in crushed blacks in certain games.
The whole HDR scene really needs proper standardisation though; it's been a mess for years and it really isn't that difficult.
With RTX HDR, "it just works".
Yes and no - the official one "just does not work" for me, as I have a multi-monitor setup.
The RTX HDR mod however does work for me, but not in everything, and the TrueHDRTweaks files are also hit and miss depending on the game. So it's still all a mess really!
You know what I mean
Out of all the options we have, RTX HDR is by far the easiest, least-faff method there is now if you want even a remotely good HDR experience. (Windows AutoHDR is easier to apply, but as we all know it's not worthwhile with the raised blacks etc.) Of course, if you want absolutely perfect output and to get things exactly how they should be, Special K with its filters etc. is there, but as said, it's a mess that, in my experience, causes issues more often than it's worth the faff.
Obviously Nvidia have work to do to make it even better, but for a first iteration it's very good. They have confirmed the multi-monitor issue with non-HDR monitors will be sorted. The only things I want to see are the ability to use RTX HDR with DLDSR and less of a performance hit, although the hit seems very game-dependent: Dying Light 2 takes the biggest one, whereas Avatar has hardly any.
It's such a good game
Weirdly, I have just played Arkham Asylum and Arkham City to completion with the HDR fix and upgraded textures and new launcher from Nexus mods - using completely new save games again to start from scratch.
Going to have to do a replay of the Batman Arkham series from the start, I think!
A big tick on my next upgrade.
I’ve got a calibrated LG C9. I’m wondering, is there any point in using the new RTX HDR over HGIG for games with native HDR support? I have a Windows profile for HDR so I don’t use AutoHDR. I’ve set paper white to 200 and peak brightness to 800 in my Windows HDR profile.
Anyone able to explain to a non-HDR user how I enable RTX HDR and configure it for Helldivers 2? Would like to give it a try.
(Yes, I do have an HDR-enabled monitor, I just never faff about with these things normally!)