RTX HDR for games!

Hmm, still gonna play in SDR :p

Watching the DF video (in HDR), the whites are way too overexposed. Whilst it's a cool feature, it's clearly not final, which is why it's locked away in the driver. It's not something I'd be using because the "HDR" looks too unnatural and too overexposed, which is exactly what Windows AutoHDR was doing too, but to a greater extent.

I want accurate colours and luminance, not overblown or overly vibrant. Natural/Accuracy is how I like my OLEDs.
 
You'll never use HDR anyway, screenshot snob :p

YouTube videos never translate/show HDR well. Also, I'm not sure how Alex captured this, as to my knowledge there isn't any way to properly capture this RTX HDR.

I haven't encountered any issues with highlights/whites other than slower exposure changes. It's not CP 2077 HDR level of quality, but it's so close, and far superior to plain SDR.
 
Hmm, still gonna play in SDR :p

Watching the DF video (in HDR), the whites are way too overexposed. Whilst it's a cool feature, it's clearly not final, which is why it's locked away in the driver. It's not something I'd be using because the "HDR" looks too unnatural and too overexposed, which is exactly what Windows AutoHDR was doing too, but to a greater extent.

I want accurate colours and luminance, not overblown or overly vibrant. Natural/Accuracy is how I like my OLEDs.

Lol. Dude. It's games, not photography.

Plus you even said you liked the extra punchy colours in the Cyberpunk mod or whatever it was recently no?
 
I use the Nova LUT mod in Cyberpunk, which balances out the colours rather than amplifying them. The last post about it was that the latest Pure version of the mod fixes the blue and yellow tones that were muted, hence preferring the accurate primary colours in the updated version of the mod now.

I like what I like; I just prefer accurate tonal ranges. Blown-out highlights are not accurate, or pleasing; they are distracting and give the illusion of "wow". At least with native HDR, like in Cyberpunk, everything is balanced and the bright sources of light are convincingly bright where they need to be, with no detail loss on sunlit white surfaces etc.
 
This doesn't work if the game window loses focus so you can easily compare on and off by pressing the Start key.
 
DF/Alex has spoken, nvidia are indeed killing it with #RTXON! :cool:


I hope Nvidia have it as a driver toggle soon, and I hope they can resolve its shortcomings. It seems better than Windows AutoHDR, but I find both Windows AutoHDR and this inferior to the Auto HDR on the PlayStation 5 - hopefully DF does more comparisons. It looks nice in Doom 3, but in Lost Ark it's clearly blowing out the highlights beyond what can reasonably be expected and crushing all the detail. Additionally, it's applying crazy HDR highlights to HUD and UI elements in games, which is a nice way to get burn-in. To resolve these issues they need to work on the scaling, give users a peak brightness setting to modify, and try to do something about the UI.
 
This doesn't work if the game window loses focus so you can easily compare on and off by pressing the Start key.

That doesn't always result in an apples-to-apples comparison. For example, you have to turn Windows HDR on to engage this tool. On my monitor, when I do this, the monitor and Windows switch colour profile, so even if no HDR content is displayed, the colours on screen look different to being back in SDR. So, carrying on, if I do what you suggested, the "HDR off" image after I press Enter is still using an incorrect colour output for SDR content.
 
Getting back into Elite Dangerous, and have been using this to give it a nice wee upgrade. Works well.

Downside is it doesn't seem to work with DSR on, and I'd been using that to deal with Elite's terrible aliasing. Hope both can be made to work together eventually.
 
What do you mean when blowing out highlights?

Sorry, that's a photography term really, but it basically means a lack of detail in the highlights / any bright areas of an image. When talking specifically about HDR, though, this RTX HDR mod must have a locked peak brightness which is above what many OLED screens can currently show. For example, my current main OLED monitor tops out at around 600 nits. If this mod is locked at c.1000 nits, which I'm guessing it is, then it tries to show any highlights in a scene at that brightness, but my screen is then 'missing' 400 nits, and with no tone mapping it will miss any detail that may be in there and just show 'white'.
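To illustrate the clipping (with assumed numbers; none of this is the mod's or Nvidia's actual maths): here's a toy comparison of hard clipping versus a simple highlight roll-off when a 600-nit panel is fed a 1000-nit signal.

```python
# Toy illustration of highlight clipping vs tone mapping (assumed numbers).

def hard_clip(nits, display_peak=600.0):
    """No tone mapping: anything above the panel's peak becomes flat white."""
    return min(nits, display_peak)

def rolloff(nits, display_peak=600.0, signal_peak=1000.0):
    """Extended-Reinhard-style shoulder: compresses 0..signal_peak into
    0..display_peak, keeping gradation in the highlights instead of clipping."""
    x = nits / display_peak               # luminance relative to the panel
    w = signal_peak / display_peak        # brightest value in the signal
    y = x * (1 + x / (w * w)) / (1 + x)   # maps w -> 1.0 smoothly
    return display_peak * y

# With hard clipping, 700/800/900/1000-nit details all collapse to 600 nits;
# with the roll-off they stay as distinct shades at or below 600.
for nits in (700, 800, 900, 1000):
    print(nits, hard_clip(nits), round(rolloff(nits), 1))
```

With hard clipping everything above the panel's peak lands on the same value, which is exactly the "just show 'white'" effect described above.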

This is why, for now, this mod is a no-go really until it gets some peak brightness etc. settings to tweak. SpecialK HDR is vastly superior if you can get it to work with a game, and I actually prefer AutoHDR vs this RTX mod, as AutoHDR matches your Windows 11 HDR calibration profile (if you create one and make sure it's set up right) so you don't get blown-out highlights. You can also easily use Reshade with AutoHDR to tweak the slightly raised black floor / gamma curve that AutoHDR unfortunately often has.

Also, that video Alex at DF put out about this RTX mod is awful. I watched it in HDR and it's clearly overexposed, blowing out highlights all over the place in many of the games. I usually like his work, but he's either had a bad day there or knows very little about HDR, which I find hard to believe.
 
Gaming tech shows the figures of peak brightness and so on here:


I've done the Windows HDR calibration too, and even then I find RTX HDR way better than AutoHDR (in supported games). Perhaps the end results differ depending on the display though. Haven't tried it on my LG E7 yet, only the AW34 QD-OLED.

YouTube is just awful for HDR, both for uploading it and playing it back, so much so that I've given up on recording HDR and watching HDR content on YT.
 

Pretty sure that Gaming Tech vid is only about the official RTX HDR video feature that's in the new drivers, not the mod for games we've been talking about. There's no way the gaming mod is locked at 650 nits (as described in the video description) from my testing, as that's only a little above my monitor's peak HDR brightness, and it wouldn't look so blown out if so. And to be honest, these days I don't always take the words and findings of any YouTuber, including this guy and DF, over my own testing and my own eyes.

Your monitor has a peak of 1000 nits, doesn't it? If so, you won't be seeing the issues that I am with the mod. All it needs is a few tweaks to be able to control the peak brightness, and also the 'saturation' levels, as these are falsely boosted in some games too, and it has a ton of potential....

EDIT - looks like there is a way to control peak brightness and saturation now!...

TrueHDRTWeaks v0.6 "Allows tweaking TrueHDR peak brightness/saturation/contrast/quality..."

See the optional files at:

 

It should be using a similar concept/methods, I would have thought, but there might be a few differences tbf. It probably is down to my monitor being HDR 1000; there is a 400 mode, but it's not the best mode to use for HDR regardless.

Yes, there is that tool. It's by the same guy who made the Control HDR mod, and he's posted his thoughts on the tool a few times, e.g.

Damn, thank you so much for this. I had tried RTX HDR on YouTube videos and wasn't happy with how much it shifts colors and adds saturation. It seems like it does the same in games, but it can also look wonderful :), and it's on the right path to beating MS AutoHDR.
I wrote a comprehensive (but quick) review after trying it for a bit:

  • From a quick test, it looks better than AutoHDR
  • Good gamma (~2.2, maybe a mix between 2.2 and sRGB) (massive advantage over AutoHDR as there's no raised blacks anymore)
  • Shifts colors a bit too much. The brighter the color, the more it's shifted. Hopefully they will add a toggle for that, as I personally don't like it, it often ruins the artistic look
  • It has a paper white of about 250-300 nits based on my perception (that doesn't mean that SDR 1 1 1 becomes 250-300 nits, it's remapped around mid gray or so)
  • The peak seems to be between 750-1000 nits based on my perception. I don't know if it follows the Windows 11 HDR calibration profile
  • The SDR to HDR Windows brightness slider value is ignored, there's no way to change the average picture brightness
  • It's purely a straightforward pixel-by-pixel remapping, with no temporal or positional awareness
  • The UI doesn't seem to be detected by AI or anything like that, so pure white UI will go to 1000 nits
  • It doesn't seem to try to generate missing/clipped detail on highlights (in case you thought AI could do that)
  • Cannot be captured by Game Bar screenshots nor NV screenshots (probably it happens at the end, at driver level)
  • Games need to start in SDR for it to ever engage. Enabling native HDR in the game breaks the native HDR, but you can always toggle back to SDR and RTX HDR starts applying again
  • For it to engage, you need to enable it with emoose tool before starting the game. Having RTX HDR enabled in the NV control panel is not necessary
  • It only works when the game is borderless fullscreen and fullscreen exclusive (at least in DX11 and 12). Any kind of overlay temporarily disables it
  • I could not see any banding on the few games I tried, it's possible it adds debanding filters due to upgrading 8 bit buffers to 10 bit
  • It works with DX9, DX10, DX11 and DX12 games. Vulkan and OpenGL are untested: supposedly they also work if they are presented with DXGI swapchain
If anybody wants to discuss this further, they are welcome to join our HDR focused discord: https://discord.gg/55ySUgZ7YD
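Piecing the bullet points above together, the per-pixel remap might look something like this. This is a speculative sketch built only from the post's estimates (gamma 2.2 decode, ~250-nit paper white, ~1000-nit peak); Nvidia's actual curve isn't public, so treat every number here as an assumption.

```python
# Speculative SDR->HDR remap sketched from the observations above: gamma 2.2
# decode, mid-tones scaled against a ~250-nit paper white, and near-white
# values expanded toward a ~1000-nit peak (matching the report that pure
# white UI hits ~1000 nits). Illustration only - not the driver's maths.

def sdr_to_hdr_nits(code_value, paper_white=250.0, peak=1000.0):
    """Map an 8-bit SDR code value (0-255) to display luminance in nits."""
    linear = (code_value / 255.0) ** 2.2          # gamma 2.2 decode, no sRGB piecewise
    base = linear * paper_white                   # mid-tones track paper white
    boost = (linear ** 4) * (peak - paper_white)  # only near-white gets expanded
    return base + boost

print(round(sdr_to_hdr_nits(255)))   # pure white -> 1000
print(round(sdr_to_hdr_nits(128)))   # mid-grey stays far below paper white
```

The `linear ** 4` highlight boost is an arbitrary stand-in for whatever curve the driver really uses; the point is just that a pixel-by-pixel curve with a mid-grey pivot behaves like the bullets describe.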
 
Just downloaded the RTX TrueHDR mod tweaks files I linked above - in the config file in there is full documentation of what it all does and what some of the settings are.

Peak brightness: "Default: 1000 (TrueHDR), display-dependant in VideoHDR (VideoHDR has max cap of 650 nits, can be overridden here)"

Looks like I was right: for the gaming mod, HDR peak brightness is set to 1000 nits by default (like I guessed using just my God-given eyes), and for video it's 650 nits.

# TrueHDRTweaks by emoose - https://www.nexusmods.com/site/mods/781
# ASI plugin to help modify the default/hardcoded parameters used by TrueHDR

# Many thanks to Pumbo for helping document these parameters & providing suggested defaults!

# To make use of this, extract the ASI/DLL/INI files next to the game EXE, and rename the Ultimate ASI Loader winmm.dll to a DLL filename supported by your game
# A list of possible filenames can be found at https://github.com/ThirteenAG/Ultimate-ASI-Loader/releases/tag/v7.1.0
# If the tweaks ASI has been loaded in correctly a truehdrtweaks.log file will be created on game startup next to the game

# By default TrueHDRTweaks will watch this INI for any modifications while it's running & try applying any changes made
# This could cause issues with some apps that have weird permissions (UWP etc), for those you can set DisableIniMonitoring below
# Note that some settings (eg Quality) will only take effect properly after restarting the app

[Values]
### Values set to -1 in this section will leave the setting unmodified

### Quality: seems to impact TrueHDR's performance cost, with higher levels resulting in greater performance requirements
### Uncertain what effect this may have on image quality
### 0: lowest FPS cost, 1: medium cost, 2: highest cost
### Default: 0/1/2 (TrueHDR - usually sets it to 2), 2 (VideoHDR)
### Note: this will need a game restart to fully apply the setting
Quality = -1

### EnableIndicatorHUD: enables indicator squares on top-left of screen
### If the squares are visible that means HDR is definitely active
### But if they aren't visible, that doesn't always mean that HDR is disabled - some games may fail to draw the squares but still have working TrueHDR
### Default: 0 or 1
EnableIndicatorHUD = -1

### PeakBrightness in nits: suggested range is 750 to 1000 (or the peak brightness of your display anyway).
### Don't set this lower than ~400.
### Default: 1000 (TrueHDR), display-dependant in VideoHDR (VideoHDR has max cap of 650 nits, can be overridden here)
PeakBrightness = -1

### Paper white multiplier, expressed in an unknown format (it can go beyond 100, maybe 50 is neutral and 100 is 2x, 150 is 3x...).
### Supposedly it uses mid gray as scaling pivot. At 50 it's roughly 200 nits (or 203!), matching a Windows SDR->HDR brightness slider value of ~30.
### At 0 it's pitch black. Suggested value is 50, but it's based on preference - if image appears dim, it's worth trying to increase this first
### Default: 50 (TrueHDR), display-dependant in VideoHDR
Paperwhite = -1

### Contrast: higher is more contrast, leave at 1 for original/neutral contrast.
### Default: 0.85 (TrueHDR), display-dependant in VideoHDR
Contrast = 1.0

### Saturation: higher is more saturation, leave at 1 for original/neutral saturation.
### This is not a "smart" gamut expansion; it seems to shift all colors equally.
### Default: 1.1 (TrueHDR), display-dependant in VideoHDR
Saturation = 1.0

### Strength of the HDR highlights curve? Though it also seems to affect the average brightness and possibly shifts colors a bit (at least on negative values).
### Leave at 1 or more.
### Default: 1.3
Strength = -1

### Gamma? Seems like it's not a raw gamma formula but more of an indicator than pure gamma as it doesn't shift colors much, just their luminance.
### There's no way to set it to sRGB. 2.2 is the way to go for nearly all games.
### Default: 2.2
Gamma = -1

### HDRVisualization: draws HDR debugging visualizations
### 0: disabled, 1: HDR overbright pixels?, 2: unknown
### Default: 0 (VideoHDR), 0/1/2 (TrueHDR)
HDRVisualization = -1

### HDRDisplayMode: selects a debug display mode for TrueHDR to use, mainly for comparisons
### 0: SDR (acts similar to Windows "SDR brightness" slider)
### 1: SDR->HDR (default)
### 2: SDR & HDR split-screen
### 3: HDR & SDR split-screen (mirror of the mode above)
### The SDR modes used by this get their brightness set by SDRBrightness below
### Default: 1 (SDR->HDR)
HDRDisplayMode = -1

### SDRBrightness: brightness value used for HDRDisplayMode's SDR modes
### Doesn't seem used for anything else
### Default: 150
SDRBrightness = -1

### AdaptiveBrightness: with this enabled TrueHDR seems to dim the screen when full-frame brightness is past a certain point
### (making bright/full-white screens appear dim on most displays)
### Likely some kind of adaptive brightness / full-frame-brightness limiter, setting to 0 can disable this.
### Default: 1
AdaptiveBrightness = -1

### The following settings have unknown/untested effects
### Possible default values used by VideoHDR/TrueHDR are listed

### Unknown_21 default: 0
Unknown_21 = -1

### Unknown_22 default: 1
### Possibly some kind of "OnlyDrawOnGameWindow" bool, setting to 0 lets TrueHDR draw parts of the effect on top of any windows that are on top?
Unknown_22 = -1

### Unknown_23
### Seems related to bit-depth, either from the display or from game/app
### In VideoHDR this is set to 1 when a certain value is not 8
### In TrueHDR this is set to 1 when a certain value is 24
### Uncertain where the value it's checking originates from
### Default: 0 or 1
Unknown_23 = -1

### Unknown_24
### Seems to use same value as Unknown_23 above, based on bit-depth?
### Default: 0 or 1
Unknown_24 = -1

### Unknown_28 default: 0.90 (VideoHDR), 0 (TrueHDR)
Unknown_28 = -1

### Unknown_34 default: 1 (VideoHDR), 0/1 (TrueHDR)
Unknown_34 = -1

### Unknown_40: Doesn't seem to affect the image, it could be some internal parameter, like how many zones it divides the image in for analysis, or the LUT resolution
### Default: 64
Unknown_40 = -1

### Unknown_44: Possibly something to do with game UI/foreground detection
### When Unknown_22 is set to 0 and TrueHDR is drawing on top of other windows, changing this seems to affect how much gets over-drawn there?
### Only seems to have any effect when Quality is set to 2
### Default: 5
Unknown_44 = -1

### Unknown_49 default: 1 (VideoHDR), 0 or 1 (TrueHDR)
Unknown_49 = -1

### NVProfile flags that can be ORed/added together and set via Nvidia Inspector
### 0x01 = changes EnableIndicatorHUD to 1, otherwise 0
### 0x02 = enables TrueHDR (in older pre-535 drivers, seems to act like flag 0x80 instead)
### 0x04 = changes Quality to 0
### 0x08 = changes Quality to 1 (if neither 0x4 or 0x8 are set, will use quality 2)
### 0x10 = changes Unknown_49 to 1, otherwise 0
### 0x20 = changes HDRVisualization to 1
### 0x40 = changes HDRVisualization to 2
### 0x80 = changes internal driver value from 10 to 24, bit-depth related? (unsure if related to Unknown_23/Unknown_24 above)

[TrueHDRTweaks]
### VerboseLogging: enables extra debug logging into the TrueHDRTweaks.log file
VerboseLogging = false

### DisableIniMonitoring: tweaks will try monitoring the INI file and automatically apply any changes
### If that causes issues it can be disabled here
DisableIniMonitoring = false
 
Just tried the mod tweaks file in Cyberpunk - changing any of the parameters had no effect at all. It should be 'working' though, as it creates a log file like the documentation says it should, and everything in that checks out. No idea now, stumped. Think we just have to wait....
 
Now in the new NV driver CP! Still beta atm though but wo wo we ah very nice! :D

iTpjtHS.png
 

Ha! I was just about to post I've been experimenting with the TrueHDR tweaks files and managed to get it working in some games so I could control peak brightness etc etc. However for some games it just won't work at all.

Which drivers are these then???
 
Now in the new NV driver CP! Still beta atm though but wo wo we ah very nice! :D

iTpjtHS.png

I'm not seeing any RTX HDR options anywhere. If I go to Graphics on the left, then Program Settings, and check various games, the only RTX option I see is "RTX Dynamic Vibrance".

EDIT - think I may have found the issue from a post on Nexus Mods:
"Seems the new app doesn't allow enabling it if multiple monitors are connected neither, while NvTrueHDR allows it (though there can be issues depending on whether they have HDR enabled or not)"

My secondary monitor isn't HDR - so that's probably why I'm not seeing the RTX HDR options anywhere.

And installing this beta app has totally uninstalled GeForce Experience and the Nvidia Shield GameStream stuff that's part of it, so now streaming to Moonlight on the Steam Deck doesn't work! **** that, what a joke. Uninstalling immediately. Thank god I make backups of stuff.
I'm aware Nvidia are going to pull the plug on GameStream soon anyway, and it looks like this new "App" is what's in store then. I'll have to look at 'Sunshine' earlier than I thought for streaming via Moonlight, it seems.....
 
Got this working at last with the Nvidia App after unplugging my other monitor, disabling AutoHDR then doing a reboot.

It looks nice, but it has a massive performance hit for me on my 4090. In Starfield, for example, Afterburner is reporting the usual 100+ FPS, but it feels more like 30 FPS.

Tried a few other games like Singularity and Dark Souls, and it's the same: high reported FPS but some weird lag/jerkiness.

Does anyone else have that?
 