What are you coding?

Dude, that room is so tidy! Call yourself a developer? I'm surrounded by empty cans of Coke and Milka choc wrappers. :D

Oh, don't worry! I am true to my developer roots... it's like a bomb has hit it now! :cry: It only looks tidy in the pic because I took it immediately after cleaning up to add the TV and move things around! It didn't stay like that for long. It's now full of Lego, books, post-it notes, empty envelopes, discarded hoodies, and biro pens that dried up years ago.
 
Plenty of time on my hands at the moment, so I've reworked some areas of my real-time path tracer and started working on a few new features too.

I started by adding a couple more tone mappers - these are responsible for taking the rendered HDR image and transforming it into LDR for display on regular display devices (and for output to non-HDR file formats such as regular JPG and PNG). I had a naïve [0,1] clamp and a Reinhard luminance in place, but decided to add something better: ACES and AgX.
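For anyone curious, here's roughly what those curves look like as code. A quick sketch, not my production shader code; the ACES below is Krzysztof Narkowicz's well-known fitted approximation rather than the full RRT+ODT, and AgX is omitted as it needs its 3x3 matrices and fitted sigmoid.

```python
def clamp01(x):
    """Naive clamp: anything above 1.0 just blows out to white."""
    return min(max(x, 0.0), 1.0)

def reinhard(x):
    """Simple Reinhard: compresses [0, inf) into [0, 1)."""
    return x / (1.0 + x)

def aces_narkowicz(x):
    """ACES filmic fit (Narkowicz 2015), applied per channel."""
    return clamp01((x * (2.51 * x + 0.03)) / (x * (2.43 * x + 0.59) + 0.14))

# HDR values above 1.0 survive Reinhard/ACES as graded highlights,
# but the naive clamp throws all that detail away.
samples = [(h, clamp01(h), reinhard(h), aces_narkowicz(h))
           for h in (0.18, 1.0, 4.0)]
```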

Not the best 3D scene to show the differences... but they are noticeable!

55118091921_efb989d947_o.png


I also decided to rework my post-process bloom effect as I wasn't happy with my old version. It's a lot quicker now and produces much better results. It's not physically based (far too expensive for a real-time renderer) but it does the job and is user-controllable. Works well with the new tone mappers.
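The basic idea behind a post-process bloom is threshold, blur, add. A toy 1D sketch of that idea below - this is illustrative only (my actual version works on the 2D image with progressive downsampling/upsampling, which is much cheaper than one big blur):

```python
import numpy as np

def bloom_1d(hdr, threshold=1.0, sigma=2.0, intensity=0.5):
    # 1. Keep only the energy above the bloom threshold.
    bright = np.maximum(hdr - threshold, 0.0)
    # 2. Blur it with a normalised Gaussian kernel.
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    blurred = np.convolve(bright, kernel, mode="same")
    # 3. Add the blurred highlights back onto the image.
    return hdr + intensity * blurred

scanline = np.zeros(32)
scanline[16] = 10.0          # a single bright "light"
out = bloom_1d(scanline)     # neighbours of the light now glow
```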

55118298208_c21061c61b_o.png

55117223517_68bce4b09a_o.png


I also added a reworked "sheen". It's based on "Practical Multiple-Scattering Sheen Using Linearly Transformed Cosines" (Zeltner, Burley, and Chiang, SIGGRAPH 2022) that introduces a method for real-time, physically-based rendering of fuzzy or dusty materials. It models sheen as a volumetric layer containing fibre-like particles (using the SGGX distribution) and approximates the resulting multiple scattering using an LTC fit, replacing expensive simulation. Good for clothing and things like velvet.

55118302218_0d6724cd25_o.png


In addition, I reworked my transmission code that models light passing through a surface for things like curtains, or strong sunlight shining through leaves. Good for backlighting! Here is an example of a bright light under a piece of fabric.

55118095231_0eb0375040_o.png


And bright lights shining through stained glass:

55118138776_a2f3e675e1_o.png


Another area to update was the coat (clearcoat); this is good for adding a gloss/lacquer layer on top of an existing surface/material. For example, the lacquer layer over car paint, or even moisture/wetness on skin. Or varnish on wood. That sort of thing. I gave the McLaren in the image below a dull Papaya Orange paint layer with high roughness - which meant it appeared diffuse with no highlights - matte orange if you like. I then applied the coat on top which simulates the lacquer clear coat, and suddenly the paintwork has reflections and gloss as you would expect to find in the real world.
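To give a feel for why the coat suddenly adds reflections: a dielectric lacquer (IOR ~1.5) reflects only about 4% of light face-on, rising to 100% at grazing angles, and whatever it reflects never reaches the matte base underneath. A toy scalar sketch using Schlick's Fresnel approximation - purely illustrative, not my actual BRDF code (a real coat also needs a proper specular lobe with roughness):

```python
def schlick_fresnel(cos_theta, f0=0.04):
    """Fresnel reflectance of a dielectric coat (f0 = 0.04 ~ IOR 1.5)."""
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def shade(base_diffuse, cos_theta, coat_weight=1.0):
    """Layer the coat over the base, attenuating the base by the
    energy the coat reflects away (coat treated as a mirror here)."""
    f = coat_weight * schlick_fresnel(cos_theta)
    return (1.0 - f) * base_diffuse + f
```

Face-on the orange paint dominates with a faint 4% sheen; at grazing angles the Fresnel term takes over and you get the bright glossy reflections you'd expect on real car paint.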

55118300188_0b760d3f21_o.png


A woman:

55118298043_618a7a4c07_o.png


A slightly sweaty woman:

55118523775_51fea8e5dc_o.png


Bored yet? :cry:
 

Been following a series of videos on "fixing" Unreal Engine's output with AgX, etc., but there seems to be something fundamentally flawed with how Unreal renders HDR and SDR content on full-range PC monitors, beyond just the tone mapper. Unfortunately I don't know at a technical level what they've missed, but it results in that hazy, flat look where white sits at around RGB 247 instead of 255 - as if Rec.709 is incorrectly being mapped to full-range RGB, but instead of clipping and blowing out the highlights it is reducing contrast.
 

I think Unreal Engine uses ACES by default (I can't recall off the top of my head), but that is what many game engines tend to use out of the box. ACES tends to be preferred for its more cinematic bias, which can produce more pleasing visuals for gaming, whereas AgX is more PBR-neutral and handles a large dynamic range really well before things start to clip. AgX also does a much better job of preserving hue, where ACES can shift colour. AgX can produce a flatter-looking image that might appear to have reduced contrast, but it also represents a better option for workflows due to more predictable exposure and contrast adjustments; it doesn't have the cinematic bias that ACES has. I can't really speak from experience with AgX in Unreal Engine as I haven't tried playing around with any custom tone mapping or LUTs.

I am wondering if it might be a case of levels vs transfer function vs tonemapper confusion. This can produce hazy/low-contrast results where the white-point doesn't look white.

I'm no expert in the colour-theory domain, but when you say white is around RGB 247 (not 255), that suggests the system is:
  • reserving headroom above a nominal white point
  • mapping the output range incorrectly
  • compressing highlights with the tone curve before display mapping
  • treating SDR as video-range instead of full-range
You mention that contrast appears reduced rather than highlights clipping and blowing out - which points to range compression.

The two common SDR RGB encodings are full-range (PC RGB), where 0 = black and 255 = white, and limited/video-range (Rec.709), where the nominal range is 16-235 and 0-15 / 236-255 are reserved for headroom/overshoot. If video range is assumed but the output is full-range (or vice versa), you can end up with:
  • "reduced" whites and "lifted" blacks
  • lower contrast
  • washed-out look
White may appear somewhere around 235-247 instead of 255. Also, tone mappers are not required to map scene 1.0 directly to display 1.0 - ACES deliberately leaves headroom.
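To make the levels mix-up concrete, here's the standard 8-bit limited/full mapping as a sketch. The failure mode is when the conversion is skipped: limited-range data shown as-is on a full-range monitor leaves white stuck at 235 and black lifted to 16, which is exactly the hazy, low-contrast look.

```python
def limited_to_full(y):
    """Expand video-range [16, 235] to full-range [0, 255]."""
    return round((y - 16) * 255 / 219)

def full_to_limited(y):
    """Compress full-range [0, 255] into video-range [16, 235]."""
    return round(y * 219 / 255 + 16)

# Correct handling: limited white (235) expands back to 255 on a PC monitor.
# Skipped handling: 235 is displayed as 235 and 16 as 16 -> washed out.
```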

Without knowing more, the output you are seeing could be plausible if not quite "right" looking.

Unreal probably does something ordered along the lines of:
  1. Internal scene-linear HDR
  2. ACES/AgX/whatever tonemapper
  3. Output transform
  4. sRGB OETF
  5. backbuffer
2 is the tonemapper that compresses HDR dynamic range into display range (still linear at this point!)
3 maps the linear render colour space to display colour space (e.g. Rec.709)
4 this is a fancy term for gamma correction / encoding

Only 2 is the tonemapping bit; the others are for display encoding.
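The tail end of that chain, as a sketch - with plain Reinhard standing in for the tonemapper, purely illustrative and not what Unreal actually does. One thing it shows: Reinhard asymptotes towards 1.0 but never reaches it, so even an extremely bright pixel lands just below 255 after encoding - one perfectly benign way to end up with "white" that isn't full white.

```python
def tonemap(x):
    """Step 2 stand-in: compress HDR into [0, 1) - still linear."""
    return x / (1.0 + x)

def srgb_oetf(x):
    """Step 4: sRGB encoding ("gamma"), linear [0,1] -> encoded [0,1]."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

def encode(hdr):
    """Step 5: quantise to the 8-bit backbuffer."""
    return round(srgb_oetf(tonemap(hdr)) * 255)
```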

Sorry - rambling. :cry: Not sure! :cry: There are a few places where things could be a bit off as there can be quite a bit to it under the hood.

(Sorry, not been much help).
 