Been following a series of videos on "fixing" Unreal Engine's output with AgX, etc., but there seems to be something fundamentally flawed with how Unreal Engine renders HDR and SDR content on full-range PC monitors, beyond just the tonemapper. At a technical level I don't know what's been missed, but the result is that hazy, flat look where white sits at around RGB 247 instead of 255. It's as if Rec.709 is being incorrectly mapped to full-range RGB, except that instead of clipping and blowing out highlights it is reducing contrast.
I think Unreal Engine uses ACES by default (I can't recall off the top of my head), which is what many game engines tend to use out of the box. ACES tends to be preferred for its more cinematic bias, which can produce more pleasing visuals for gaming, whereas AgX is more PBR-neutral and handles a large dynamic range well before things start to clip. AgX also does a much better job of preserving hue, where ACES can shift colour. AgX can produce a flatter-looking image that might appear to have reduced contrast, but it represents a better option for workflows because exposure and contrast adjustments behave more predictably. It doesn't have the cinematic bias that ACES has. I can't really speak from experience with AgX in Unreal Engine, as I haven't tried any custom tone mapping or LUTs.
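As a toy illustration (not the actual ACES or AgX curves, just a simple Reinhard stand-in), here's why a per-channel tone curve can shift hue while a ratio-preserving one can't:

```python
# Toy illustration only: why per-channel tone curves can shift hue,
# while a "maintain the channel ratios" mapping preserves it.
# The curve here is simple Reinhard, NOT ACES or AgX.

def reinhard(x):
    return x / (1.0 + x)

hdr = (4.0, 1.0, 0.25)  # a saturated orange-ish HDR value

# Per-channel: each channel is compressed by a different amount,
# so the R:G:B ratios (and therefore the hue) change.
per_channel = tuple(reinhard(c) for c in hdr)

# Ratio-preserving: compress the max channel, scale the others to match,
# so the R:G:B ratios are untouched.
m = max(hdr)
scale = reinhard(m) / m
ratio_preserving = tuple(c * scale for c in hdr)

print(per_channel)       # (0.8, 0.5, 0.2)  - ratios now 4 : 2.5 : 1
print(ratio_preserving)  # (0.8, 0.2, 0.05) - ratios still 16 : 4 : 1
```

The per-channel result skews towards white/yellow; real tonemappers are more sophisticated than either extreme, but the hue-skew mechanism is the same.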
I am wondering if it might be a case of levels vs transfer function vs tonemapper confusion. This can produce hazy, low-contrast results where the white point doesn't look white.
I'm no expert in the colour-theory domain, but white landing around RGB 247 (not 255) suggests the system is:
- reserving headroom above a nominal white point
- mapping the output range incorrectly
- compressing highlights with the tone curve before display mapping
- treating SDR as video-range instead of full-range
You mention that contrast appears reduced rather than highlights clipping and blowing out - which points to range compression rather than a hard clip.
The two common SDR RGB encodings are full-range (PC RGB), where 0 = black and 255 = white, and limited/video-range (Rec.709), where the nominal range is 16-235 and codes 0-15 and 236-255 are reserved for headroom/overshoot. If video range is assumed but the output is full-range (or vice versa), you can end up with:
- "reduced" whites and "lifted" blacks
- lower contrast
- a washed-out look
White may appear somewhere around 235-247 instead of 255. Also, tone mappers are not required to map scene 1.0 directly to display 1.0 - ACES deliberately leaves headroom.
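A quick sketch of the levels mismatch (a simple linear remap, nothing engine-specific):

```python
# Sketch of a full-range vs limited-range (Rec.709 "video") mismatch.
# Simple linear level remap only - not Unreal's actual pipeline.

def full_to_limited(v):
    """Map a full-range [0, 255] code into the nominal Rec.709 video range [16, 235]."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Inverse mapping: expand [16, 235] back out to [0, 255]."""
    return round((v - 16) * 255 / (235 - 16))

# A full-range signal squeezed into video range, then displayed as-is
# on a full-range PC monitor:
print(full_to_limited(255))  # 235 -> whites look grey
print(full_to_limited(0))    # 16  -> blacks look lifted

# The fix is the matching expansion somewhere before the display:
print(limited_to_full(full_to_limited(255)))  # 255 again
```

Contrast is reduced rather than clipped because every code is scaled inwards; nothing ever exceeds the display's range, so nothing blows out.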
Without knowing more, the output you are seeing could be plausible if not quite "right" looking.
Unreal probably does something ordered along the lines of:
1. Internal scene-linear HDR
2. ACES/AgX/whatever tonemapper
3. Output transform
4. sRGB OETF
5. Backbuffer
Step 2 is the tonemapper that compresses the HDR dynamic range into the display range (still linear at this point!).
Step 3 maps the linear render colour space to the display colour space (e.g. Rec.709).
Step 4 is a fancy term for gamma correction / encoding.
Only step 2 is the tone-mapping bit; the others are display encoding.
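A rough sketch of that chain, using a stand-in Reinhard curve rather than Unreal's actual tonemapper (my assumption of the ordering, per the list above):

```python
# Rough sketch of the display chain: tonemap -> sRGB OETF -> 8-bit backbuffer.
# The tonemapper here is simple Reinhard, a stand-in for ACES/AgX.

def reinhard_tonemap(x):
    """Compress scene-linear HDR into [0, 1). Note it never actually
    reaches 1.0 - one legitimate source of "white that isn't 255"."""
    return x / (1.0 + x)

def srgb_oetf(x):
    """sRGB encoding ("gamma correction") per IEC 61966-2-1."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * (x ** (1.0 / 2.4)) - 0.055

def to_8bit_full_range(x):
    """Quantise an encoded [0, 1] value to a full-range 8-bit code."""
    return round(max(0.0, min(1.0, x)) * 255)

scene_linear = 4.0                                # a bright HDR highlight
display_linear = reinhard_tonemap(scene_linear)   # 0.8, still linear
encoded = srgb_oetf(display_linear)               # ~0.906, display-encoded
print(to_8bit_full_range(encoded))                # 231, not 255
```

Note the bright highlight lands well below 255 with no levels bug at all, purely from the tone curve's headroom - which is why it matters which stage is at fault.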
Sorry - rambling.

Not sure!

There are a few places where things could be a bit off as there can be quite a bit to it under the hood.
(Sorry, not been much help).