The best 'bright' whites in my experience with this monitor using the 75% Brightness / 75% Contrast settings are with Creator > DCI-P3 with Gamma set to 2.2 and the Custom Colour profile as previously stated. The other profiles all look warmer/yellowish to me but this is probably more realistic since my TV is set to use the Warm colour presets for movies and so on. For games and desktop use though I actually prefer more vibrant colours that make the image "pop" more.
I was playing Assassin's Creed: Mirage yesterday for the first time with HDR enabled - HDR Peak 1000 - and I was shocked at how bright the whites were on the text and how vivid the fire and light sources looked in the game. It was really impressive, especially as the rest of the colour palette was quite subdued and dull, being a desert setting. This is the kind of contrast that I just never saw on my previous IPS monitor despite years of tinkering with the settings. I am not exaggerating when I say that it feels like I am seeing all the games I've played properly for the first time and as the developers intended them to be viewed.
The only issue I have with HDR, and why I haven't left it enabled all the time in Windows, is that when I exit a game the desktop colours are very muted and dull. Turn off HDR and the whole image brightens and the colours become more pleasing. I get that no-one wants bright HDR highlights on their desktop, but I feel this monitor takes it too far and makes the picture look horrible. As such I am stuck with manually enabling and disabling HDR for now, which is disappointing coming from the PS5 and Xbox Series X, where HDR is completely automatic.
Welcome to HDR gaming on PC.
I always toggle it on and off when I'm gaming/not gaming.
It's a pain, but I'm more than used to it now as I've had HDR OLED screens for a while.
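If the manual toggling gets old, it can be scripted. Below is a minimal C++ sketch of my own, assuming Windows 10 1709+ headers and a recent SDK, that flips the same "advanced colour" (HDR) switch the Settings page uses via the public DisplayConfig API. Treat it as an illustration of the API rather than a polished tool; newer Windows 11 builds layer extra HDR-specific structures on top of this, and some people just use Game Bar's Win+Alt+B shortcut instead.

```cpp
// Sketch only (not an official tool): toggles the Windows "advanced colour" / HDR
// state on every active display using the DisplayConfig API.
// Build with MSVC; links against user32.lib.
#include <windows.h>
#include <cstdio>
#include <vector>
#pragma comment(lib, "user32.lib")

int main() {
    UINT32 pathCount = 0, modeCount = 0;
    if (GetDisplayConfigBufferSizes(QDC_ONLY_ACTIVE_PATHS, &pathCount, &modeCount) != ERROR_SUCCESS)
        return 1;

    std::vector<DISPLAYCONFIG_PATH_INFO> paths(pathCount);
    std::vector<DISPLAYCONFIG_MODE_INFO> modes(modeCount);
    if (QueryDisplayConfig(QDC_ONLY_ACTIVE_PATHS, &pathCount, paths.data(),
                           &modeCount, modes.data(), nullptr) != ERROR_SUCCESS)
        return 1;

    for (UINT32 i = 0; i < pathCount; ++i) {
        // Ask whether this output supports advanced colour (HDR) and whether it is currently on.
        DISPLAYCONFIG_GET_ADVANCED_COLOR_INFO info = {};
        info.header.type = DISPLAYCONFIG_DEVICE_INFO_GET_ADVANCED_COLOR_INFO;
        info.header.size = sizeof(info);
        info.header.adapterId = paths[i].targetInfo.adapterId;
        info.header.id = paths[i].targetInfo.id;
        if (DisplayConfigGetDeviceInfo(&info.header) != ERROR_SUCCESS || !info.advancedColorSupported)
            continue;

        // Flip the current state: on -> off, off -> on.
        DISPLAYCONFIG_SET_ADVANCED_COLOR_STATE set = {};
        set.header.type = DISPLAYCONFIG_DEVICE_INFO_SET_ADVANCED_COLOR_STATE;
        set.header.size = sizeof(set);
        set.header.adapterId = paths[i].targetInfo.adapterId;
        set.header.id = paths[i].targetInfo.id;
        set.enableAdvancedColor = info.advancedColorEnabled ? 0 : 1;
        if (DisplayConfigSetDeviceInfo(&set.header) == ERROR_SUCCESS)
            std::printf("Toggled HDR %s on output %u\n", info.advancedColorEnabled ? "off" : "on", i);
    }
    return 0;
}
```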
I'll check later, I've just been revising it too much to fine-tune it.
You guys getting the 32, are they coming with the latest firmware installed already?
Is there any point playing around with HDR settings yet due to the DV bug? I know people have said using HDR400 True Black will limit how much DV affects it... But with HDR1000, there's no point since DV caps the nits at lower than half that? I could be wrong on these points so please correct me if I'm wrong.
With DCI-P3, gamma should be at 2.4 or 2.6 for accurate P3 representation (this is according to the calibration sheet that comes with the monitor).
I've set mine to 2.4 because I feel 2.6 crushes some blacks.
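To put a rough number on the "crushes some blacks" point: with a simple power-law gamma (a simplification of what these presets actually apply, so the figures are illustrative only), output luminance is the input signal raised to the gamma exponent, and raising the exponent from 2.2 to 2.6 drags the darkest steps much closer to zero while barely moving the bright end.

```cpp
// Illustrative only: pure power-law gamma, not the monitor's actual LUT.
// Shows how near-black input levels sink as the gamma exponent rises.
#include <cmath>
#include <cstdio>

int main() {
    const double gammas[]  = {2.2, 2.4, 2.6};
    const double signals[] = {0.05, 0.10, 0.25, 0.50, 0.90}; // normalised input levels
    for (double g : gammas) {
        std::printf("gamma %.1f:", g);
        for (double s : signals)
            std::printf("  %.2f->%.4f", s, std::pow(s, g)); // relative output luminance
        std::printf("\n");
    }
    return 0;
}
```

At a 5% input level, for example, gamma 2.2 leaves about 0.14% of peak luminance while 2.6 leaves about 0.04%, which is the difference people describe as crushed shadows.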
Also, if you want your desktop to look nice in HDR, you need to go to the HDR settings in Windows and adjust the SDR content brightness slider. You can make it look really nice, but I'm sticking to SDR on the desktop because I fear burn-in too much.
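On the muted-desktop issue earlier in the thread: with HDR enabled, Windows composites SDR content at a fixed "SDR white level" that this slider controls, which is why the desktop can look dim compared with the monitor's own SDR mode. The mapping below is only the one commonly reported (about 80 nits at slider 0 rising to roughly 480 nits at 100); I haven't verified it against every Windows build, so treat it as a back-of-envelope sketch rather than a spec.

```cpp
// Back-of-envelope sketch, assuming the commonly reported linear mapping of the
// Windows "SDR content brightness" slider to SDR white level. The real mapping
// may differ between Windows versions.
#include <cstdio>

int main() {
    for (int slider = 0; slider <= 100; slider += 20) {
        double nits = 80.0 + 4.0 * slider; // assumed: 80 nits at 0, ~480 nits at 100
        std::printf("slider %3d -> ~%.0f nits SDR white\n", slider, nits);
    }
    return 0;
}
```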
Is there any point playing around with HDR settings yet due to the DV bug? I know people have said using HDR400 True Black will limit how much DV affects it... But with HDR1000, there's no point since DV caps the nits at lower than half that? I could be wrong with these points so please correct me if i'm wrong
Is there any reason even if the DV bug was resolved to use Trueblack setting over the HDR1000?
@no_1_dave thanks. Also can confirm that the Nvidia driver works a treat, HDR calibration is hitting the full 1000 nits now!
Still need to find some speakers for my PS5 and the Dell that will work but won't cost me a fortune, so annoying not having a 3.5mm jack.
I thought a USB-C-to-3.5mm convertor worked?