# TrueHDRTweaks by emoose - https://www.nexusmods.com/site/mods/781
# ASI plugin to help modify the default/hardcoded parameters used by TrueHDR
# Many thanks to Pumbo for helping document these parameters & providing suggested defaults!
# To make use of this, extract the ASI/DLL/INI files next to the game EXE, and rename the Ultimate ASI Loader winmm.dll to a DLL filename supported by your game
# A list of possible filenames can be found at https://github.com/ThirteenAG/Ultimate-ASI-Loader/releases/tag/v7.1.0
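# For example, many games will load a renamed dinput8.dll or dsound.dll - renaming the loader's winmm.dll to one of the supported names from the link above should be enough (which names actually work depends on the game)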
# If the tweaks ASI has loaded correctly, a truehdrtweaks.log file will be created next to the game EXE on startup
# By default TrueHDRTweaks will watch this INI for modifications while it's running & try to apply any changes made
# This could cause issues with some apps that have unusual permissions (UWP etc); for those you can set DisableIniMonitoring below
# Note that some settings (e.g. Quality) will only take effect properly after restarting the app
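# For example, with the game running you could edit Paperwhite below and save this file; the new value should then apply while playing (Quality changes still need a restart)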
[Values]
### Values set to -1 in this section will leave the setting unmodified
### Quality: seems to impact TrueHDR's performance cost, with higher levels resulting in greater performance requirements
### Uncertain what effect this may have on image quality
### 0: lowest FPS cost, 1: medium cost, 2: highest cost
### Default: 0/1/2 (TrueHDR, which usually sets it to 2), 2 (VideoHDR)
### Note: this will need a game restart to fully apply the setting
Quality = -1
### EnableIndicatorHUD: enables indicator squares on top-left of screen
### If the squares are visible that means HDR is definitely active
### But if they aren't visible, that doesn't always mean that HDR is disabled - some games may fail to draw the squares but still have working TrueHDR
### Default: 0 or 1
EnableIndicatorHUD = -1
### PeakBrightness in nits: suggested range is 750 to 1000 (or whatever the peak brightness of your display is).
### Don't set this lower than ~400.
### Default: 1000 (TrueHDR), display-dependent in VideoHDR (VideoHDR caps this at 650 nits, which can be overridden here)
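### Example: on a display that peaks at around 800 nits, "PeakBrightness = 800" would be a sensible starting point (illustrative value - match it to your own display)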
PeakBrightness = -1
### Paper white multiplier, expressed in an unknown format (it can go beyond 100; maybe 50 is neutral, 100 is 2x, 150 is 3x...).
### Supposedly it uses mid gray as the scaling pivot. At 50 it's roughly 200 nits (or 203!), matching a Windows SDR->HDR brightness slider value of ~30.
### At 0 it's pitch black. The suggested value is 50, but it comes down to preference - if the image appears dim, it's worth increasing this first
### Default: 50 (TrueHDR), display-dependent in VideoHDR
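### Example: if the scaling above holds, "Paperwhite = 100" would roughly double paper white to ~400 nits - an illustration only, since the exact mapping is unconfirmed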
Paperwhite = -1
### Contrast: higher is more contrast, leave at 1 for original/neutral contrast.
### Default: 0.85 (TrueHDR), display-dependent in VideoHDR
Contrast = 1.0
### Saturation: higher is more saturation, leave at 1 for original/neutral saturation.
### This is not a "smart" gamut expansion; it seems to shift all colors equally.
### Default: 1.1 (TrueHDR), display-dependent in VideoHDR
Saturation = 1.0
### Strength: possibly the strength of the HDR highlights curve, though it also seems to affect the average brightness and may shift colors a bit (at least at negative values).
### Leave at 1 or more.
### Default: 1.3
Strength = -1
### Gamma: doesn't seem to be a raw gamma formula - more of an indicator than pure gamma, as it doesn't shift colors much, just their luminance.
### There's no way to set it to sRGB. 2.2 is the way to go for nearly all games.
### Default: 2.2
Gamma = -1
### HDRVisualization: draws HDR debugging visualizations
### 0: disabled, 1: HDR overbright pixels?, 2: unknown
### Default: 0 (VideoHDR), 0/1/2 (TrueHDR)
HDRVisualization = -1
### HDRDisplayMode: selects a debug display mode for TrueHDR to use, mainly for comparisons
### 0: SDR (acts similar to Windows "SDR brightness" slider)
### 1: SDR->HDR (default)
### 2: SDR & HDR split-screen
### 3: HDR & SDR split-screen (mirror of the mode above)
### The SDR modes used by this get their brightness set by SDRBrightness below
### Default: 1 (SDR->HDR)
HDRDisplayMode = -1
### SDRBrightness: brightness value used for HDRDisplayMode's SDR modes
### Doesn't seem used for anything else
### Default: 150
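### Example: combining "HDRDisplayMode = 2" with "SDRBrightness = 200" should show an SDR/HDR split-screen with the SDR half at brightness 200 (illustrative values only)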
SDRBrightness = -1
### AdaptiveBrightness: with this enabled TrueHDR seems to dim the screen when full-frame brightness is past a certain point
### (making bright/full-white screens appear dim on most displays)
### Likely some kind of adaptive brightness / full-frame-brightness limiter; setting this to 0 can disable it.
### Default: 1
AdaptiveBrightness = -1
### The following settings have unknown/untested effects
### Possible default values used by VideoHDR/TrueHDR are listed
### Unknown_21 default: 0
Unknown_21 = -1
### Unknown_22 default: 1
### Possibly some kind of "OnlyDrawOnGameWindow" bool; setting it to 0 seems to let TrueHDR draw parts of the effect on top of any overlapping windows?
Unknown_22 = -1
### Unknown_23
### Seems related to bit-depth, either from the display or from game/app
### In VideoHDR this is set to 1 when a certain value is not 8
### In TrueHDR this is set to 1 when a certain value is 24
### Uncertain where the value it's checking originates from
### Default: 0 or 1
Unknown_23 = -1
### Unknown_24
### Seems to use same value as Unknown_23 above, based on bit-depth?
### Default: 0 or 1
Unknown_24 = -1
### Unknown_28 default: 0.90 (VideoHDR), 0 (TrueHDR)
Unknown_28 = -1
### Unknown_34 default: 1 (VideoHDR), 0/1 (TrueHDR)
Unknown_34 = -1
### Unknown_40: Doesn't seem to affect the image; it could be some internal parameter, like how many zones the image is divided into for analysis, or the LUT resolution
### Default: 64
Unknown_40 = -1
### Unknown_44: Possibly something to do with game UI/foreground detection
### When Unknown_22 is set to 0 and TrueHDR is drawing on top of other windows, changing this seems to affect how much gets over-drawn there?
### Only seems to have any effect when Quality is set to 2
### Default: 5
Unknown_44 = -1
### Unknown_49 default: 1 (VideoHDR), 0 or 1 (TrueHDR)
Unknown_49 = -1
### NVProfile flags that can be ORed/added together and set via Nvidia Inspector
### 0x01 = changes EnableIndicatorHUD to 1, otherwise 0
### 0x02 = enables TrueHDR (in older pre-535 drivers, seems to act like flag 0x80 instead)
### 0x04 = changes Quality to 0
### 0x08 = changes Quality to 1 (if neither 0x04 nor 0x08 is set, quality 2 is used)
### 0x10 = changes Unknown_49 to 1, otherwise 0
### 0x20 = changes HDRVisualization to 1
### 0x40 = changes HDRVisualization to 2
### 0x80 = changes internal driver value from 10 to 24, bit-depth related? (unsure if related to Unknown_23/Unknown_24 above)
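### Example: ORing 0x01 + 0x02 + 0x08 gives 0x0B, which should enable TrueHDR at quality 1 with the indicator HUD visible (an illustration based on the flag descriptions above)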
[TrueHDRTweaks]
### VerboseLogging: enables extra debug logging into the TrueHDRTweaks.log file
VerboseLogging = false
### DisableIniMonitoring: by default the tweaks will try to monitor this INI file and automatically apply any changes
### If that causes issues it can be disabled here
DisableIniMonitoring = false