ASUS 42" ROG Swift PG42UQ

One more setting that influences image quality a lot: Monitor -> Uniform Brightness. This should be off... otherwise it darkens the image somewhat, by 5% to 10% at my estimate.
 
I went back to brightness 90 and contrast 80, so sites are just a little bit overbright/overexposed; better this way hehe...

I do notice sRGB looks better with HDR off.

So turning HDR off gives the best of both worlds: sRGB/SDR looks good, and HDR also still looks good as far as I can tell! ;)

I will continue trying to develop an HDR shader, but it's difficult: x86/x64 doesn't have 16-bit floating point, except via SSE, which is overkill for now...
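For what it's worth, half-float conversion is available in hardware through the F16C extension (on most x86 CPUs since roughly 2012), so it does go through SSE registers, but it's only one instruction. A minimal sketch in C, assuming an F16C-capable CPU and a compiler flag like gcc -mf16c:

```c
#include <immintrin.h>
#include <stdint.h>
#include <stdio.h>

/* Convert one 32-bit float to a 16-bit half float using the F16C
   VCVTPS2PH instruction (operates on SSE registers). */
static uint16_t float_to_half(float f)
{
    __m128  v = _mm_set_ss(f);                              /* scalar into SSE reg */
    __m128i h = _mm_cvtps_ph(v, _MM_FROUND_TO_NEAREST_INT); /* round to nearest */
    return (uint16_t)_mm_extract_epi16(h, 0);               /* low 16 bits = half */
}

int main(void)
{
    printf("1.0f as half: 0x%04X\n", float_to_half(1.0f)); /* expect 0x3C00 */
    return 0;
}
```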

There are many different HDR formats; quite a mess. Many HDR YouTube videos have distorted colors, and OBS Studio also has many different settings, further complicating things.

So far DCI-P3 seems to be what this monitor supports concerning color gamut/range.

I also came across scRGB, which is some "virtual color" thing of Microsoft's... a bit wacky... I got my OpenGL code working; tomorrow I might work on it some more, trying to get an HDR pixel format working.

Either OpenGL or DirectX has to be used for real-time HDR graphics.
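As a starting point, a minimal sketch of hunting for a deep-color pixel format with plain GDI calls; note this is just enumeration, and drivers may only expose true floating-point formats through the WGL_ARB_pixel_format extensions, so treat it as a sketch (MinGW: link with -lgdi32):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HDC hdc = GetDC(NULL); /* screen DC, just for enumeration */

    /* With a NULL descriptor, DescribePixelFormat returns the highest
       pixel format index supported by this DC. */
    int count = DescribePixelFormat(hdc, 1, sizeof(PIXELFORMATDESCRIPTOR), NULL);

    for (int i = 1; i <= count; i++)
    {
        PIXELFORMATDESCRIPTOR pfd;
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        if (pfd.cColorBits >= 64 && (pfd.dwFlags & PFD_SUPPORT_OPENGL))
            printf("format %d: %d color bits, %d bits per red channel\n",
                   i, pfd.cColorBits, pfd.cRedBits);
    }
    ReleaseDC(NULL, hdc);
    return 0;
}
```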

I also found a site on the internet which can display HDR pictures to test the monitor; that was a little bit interesting. Without Windows 11 HDR on, things cannot be seen:

Here it is. I find it weird how Edge does not automatically switch on HDR for this website:


I am still confused about whether YouTube videos in Windows are automatic HDR or not...

I could try to record this website via OBS Studio and then play it back, but that opened another can of worms: it's difficult to understand which advanced color settings to use in OBS Studio.

Further discovered: consoles only use the range 16 to 235 for color components (the standard limited/video range). I noticed that once in Call of Duty 4: Modern Warfare... through my own video codec/analysis... I thought it was because of changing gamma or something, but I kinda knew it was by design.

This now seems confirmed... OBS Studio has this weird setting called "Partial" for color range, instead of "Full"... this might also explain why my WoWs recordings were a bit meh... Bizarre to see console crap/television crap/strange DC connections enter into the PC world... BIZARRE.
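For anyone curious, the expansion from limited ("partial") to full range is simple arithmetic; a minimal sketch, assuming 8-bit luma where 16 maps to black and 235 to white:

```c
#include <stdint.h>
#include <stdio.h>

/* Expand a limited-range (16..235) value to full range (0..255). */
static uint8_t limited_to_full(uint8_t v)
{
    int full = ((int)v - 16) * 255 / 219; /* 16 -> 0, 235 -> 255 */
    if (full < 0)   full = 0;             /* clamp below-black */
    if (full > 255) full = 255;           /* clamp above-white */
    return (uint8_t)full;
}

int main(void)
{
    printf("16 -> %d, 128 -> %d, 235 -> %d\n",
           limited_to_full(16), limited_to_full(128), limited_to_full(235));
    return 0;
}
```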

Conclusion for now: HDR is a total mess... monitors are a total mess with many different nits and so forth! LOL.

I did find this video, it's one of the best looking videos out there... I thought it might have been an RGB recording, but the author just responded that apparently it's "4:2:2 10-bit". I did not expect that... interesting! ;) =D (Maybe YouTube/the Edge browser automatically switches to HDR playback if it detects HDR content?)
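To put "4:2:2 10-bit" in perspective, a quick back-of-the-envelope calculation, assuming an uncompressed 3840x2160 frame: 4:2:2 keeps luma at full resolution but halves the chroma horizontally, so it averages 20 bits per pixel versus 30 for full 4:4:4/RGB:

```c
#include <stdio.h>

int main(void)
{
    long long pixels   = 3840LL * 2160LL;  /* one UHD frame */
    long long bits_444 = pixels * 3 * 10;  /* Y, Cb, Cr all at full resolution */
    long long bits_422 = pixels * 2 * 10;  /* Cb, Cr carried at half width */
    printf("4:4:4 10-bit frame: ~%lld MB\n", bits_444 / 8 / 1000000); /* ~31 MB */
    printf("4:2:2 10-bit frame: ~%lld MB\n", bits_422 / 8 / 1000000); /* ~20 MB */
    return 0;
}
```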

This video looks great in HDR off and HDR on mode... hard to tell any difference.

 
Installing this International Color Consortium (ICC) profile helps to achieve some HDR effect, even when HDR is off in Windows.

It basically installs a wide color gamut. It does improve the colors somewhat; I think I am going to like this:


DCI-P3-D65.icc

This should be the correct one for monitors.

(Basically this is "old" ICC color profile technology; Windows 11 now has "advanced color profiles"... hmmm)
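For anyone who prefers to script the install instead of clicking through dialogs, a minimal sketch using the Windows color-management API; the profile path is an assumption (file next to the program), link against Mscms.lib, and it may need elevation since it copies into the system color directory:

```c
#include <windows.h>
#include <icm.h>
#include <stdio.h>

int main(void)
{
    /* Copies the profile into the system color directory
       (%SystemRoot%\System32\spool\drivers\color). Associating it
       with the monitor is then done in the Color Management dialog. */
    if (InstallColorProfileW(NULL, L"DCI-P3-D65.icc"))
        printf("Profile installed.\n");
    else
        printf("Install failed, error %lu\n", GetLastError());
    return 0;
}
```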

However, the Krita software shows it has a small color gamut, so it's not that big of a deal; it's actually shifted somewhat... hmm...

In case of trouble installing it, here is a tutorial:


I am now going to watch the same videos as before to see if I can spot slight differences, lol. Probably not, but maybe; it's worth a try.

I absolutely did see a difference on the test page:

Now I can see the W. I could not see it before when HDR was off, but now I can even when HDR is off:


I don't notice any difference for the test image:

"Can your monitor and your browser display a HDR image?"

I do notice differences in the color images, both SDR and HDR, kinda strange hehe. I like what I see in SDR; then again it's kinda weird.

I also find OpenGL weird/confusing... I now have a 64-bit color pixel format, created with the normal GDI functions...

The shading looks different; I can't really tell if it's any better. I don't understand if it's supposed to be 1.0 for maximum color or something else.
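From what I can tell, if the float format is interpreted as scRGB (the Microsoft "virtual color" space mentioned above), 1.0 is not maximum color but SDR reference white at 80 nits, and HDR highlights simply go above 1.0. A minimal sketch of that convention, assuming scRGB semantics:

```c
#include <stdio.h>

/* scRGB convention: 1.0 = SDR reference white (80 nits);
   brighter HDR values exceed 1.0 instead of clipping. */
static float nits_to_scrgb(float nits)
{
    return nits / 80.0f;
}

int main(void)
{
    printf("80 nits   -> %.2f (SDR white)\n",     nits_to_scrgb(80.0f));
    printf("1000 nits -> %.2f (HDR highlight)\n", nits_to_scrgb(1000.0f));
    return 0;
}
```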

I will also experiment with Krita, an HDR-capable painting program.

Strangely enough the NVIDIA control panel allows setting the output color depth to 12 bpc... I wonder why... isn't this monitor supposed to be 10-bit only? Hmmm... I wonder if setting 12 bpc can damage the monitor; for now it seems OK?! I don't think it's safe anymore lol.

After this demo, the monitor <-> graphics card communication started misbehaving; the monitor went quite dark and the screen started blinking:

 
For God's sake, can you please stop posting constant information in here that no-one is asking for; it's just a stream of consciousness at this point!

I understand that you'd like to share info, but please if you could save it all up for one big post in the future that would be great. Thanks.
 