New Alienware Q3 2020

I thought things had improved a lot since you wrote that, with Windows 10 bringing out advanced colour support in version 1703, and they have improved on it in every version since?

Well, things did not improve, at least not for my non-HDR wide-gamut monitor. It may be that if you enable HDR in Windows 10, Windows will automatically convert SDR content to the sRGB color space, because that's what "Windows Advanced Color" is about: doing HDR. It's not without its own issues, as even Dell suggests turning HDR mode on only to watch HDR content. https://www.dell.com/support/kbdoc/...ynamic-range-is-enabled-in-windows-10?lang=en
EDIT: nVidia too: https://nvidia.custhelp.com/app/ans...iderations-for-playing-games-with-hdr-enabled


Unfortunately I can't test it myself, as I don't own an HDR monitor. I'm looking to buy one, hence why I'm here :)
BTW, the part you quote is from a manual on how to write HDR apps, so it's definitely not applicable if you have turned HDR off in Windows. https://docs.microsoft.com/en-us/windows/win32/direct3darticles/high-dynamic-range

Are these the settings you guys keep talking about? (I don't really understand what these settings even do.)

Don't know the monitor, but I assume that yes, that's the setting.
 
Well, I have no idea what my screen is doing then.

I have calibrated it to 100% sRGB and, to me, using Windows and browsers and such, all the colours look fine and correct. That's using Windows SDR.

If I then enable HDR in Windows, the brightness and colours on the desktop become more saturated, as though it is now using the wide gamut.

I had always been led to believe that if you switched on Windows HDR (and hence wide gamut) and you were viewing a non-HDR source, then your desktop would become muted and dull.

The opposite happens.
 
Have you checked whether your monitor's HDR mode is switched on or off? I would presume that is an option, but I'm not sure; the lack of an sRGB mode may mean there is no separate HDR on/off mode either. Have a quick check maybe?

Going back to the concern: it is known that a wide-gamut monitor displaying sRGB content will look oversaturated, so it seems like you are duplicating that effect somehow, but I really don't know.
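That oversaturation effect can be illustrated numerically. Using the standard published CIE xy chromaticity coordinates of the sRGB and DCI-P3 red primaries, the distance of each primary from the D65 white point is a crude proxy for how saturated a "full red" pixel value looks on each panel when no colour management remaps it:

```python
import math

# CIE 1931 xy chromaticities (standard published values).
D65 = (0.3127, 0.3290)      # white point shared by sRGB and Display P3
SRGB_RED = (0.640, 0.330)   # sRGB red primary
P3_RED = (0.680, 0.320)     # DCI-P3 / Display P3 red primary

def dist_from_white(xy: tuple[float, float]) -> float:
    """Euclidean distance from D65 in xy space (crude saturation proxy)."""
    return math.hypot(xy[0] - D65[0], xy[1] - D65[1])

# The same pixel value (255, 0, 0) lands on whichever red primary the panel
# physically has, so on an unmanaged wide-gamut panel it sits further from
# the white point, i.e. it looks more saturated:
print(f"sRGB red: {dist_from_white(SRGB_RED):.3f}")
print(f"P3 red:   {dist_from_white(P3_RED):.3f}  <- more saturated")
```

The distance is only a rough illustration (perceptual saturation is not linear in xy space), but it shows why unmanaged sRGB content looks punchier on a wide-gamut panel.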
 
I honestly don't know if the monitor has an HDR setting... will check tonight. I think I left it on auto.

And yeah, a wide-gamut monitor displaying sRGB content should look oversaturated, hence why, uncalibrated, it does in games.

However, if you enable HDR in Windows and are viewing non-HDR sources, e.g. your desktop, then everything should become dull. It's the opposite, which I can't get my head around.

I am going to have to play around with it some more, but at the moment I have a normal-looking calibrated desktop, apps, and browsers, and when I want to play HDR games I just flick it over to HDR in Windows. AC Odyssey looked stunning last night in HDR.
 
Yes, I played The Evil Within 2 on my Samsung TV (a few years old but a great model) and it looked absolutely amazing from my PS4, so I'm super hyped for more HDR!

After all my agonising (and with the new lockdown) I decided that I needed a small monitor for now for home working, so today I went full..... 'enthusiast' (...) and ordered a 360 Hz 1080p potato :o

I’m interested to see how I get on with it for desktop use and then for gaming in the upcoming months - I’ll then reassess the mega monitor situation!
 
Hi everyone!

I have just got my new AW2721D after a two-week wait. And I have noticed the new energy label... class G, which is the same as the current class D! But the question is: is it really this power-inefficient? I am surprised nobody has talked about this; I guess nobody cares about energy consumption!

[Image: l84HWRl.jpg – the monitor's energy label]
 
Huh that is surprisingly bad!

Not really, they have moved the scale massively. Anything that was A+++ on the old scale is now a D on the new scale.

So on the old scale that monitor would have been rated an A.

They needed more headroom, as A+++ was already getting silly.

In fact, I don't think anything has been given better than a C yet on the new scale.

I mean, it's only using 31 W. We used to have a 60 W bulb in every room not that long ago.
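The comparison above works out with some quick arithmetic; the hours of use and the electricity price below are assumptions for illustration only:

```python
# Rough annual energy comparison: a 31 W monitor vs an old 60 W bulb.
# Assumes 8 hours of use per day and an illustrative 0.30 EUR/kWh tariff.
HOURS_PER_DAY = 8
DAYS_PER_YEAR = 365
PRICE_PER_KWH = 0.30  # assumed; varies by country and tariff

def annual_cost(power_watts: float) -> tuple[float, float]:
    """Return (kWh per year, cost per year) for a device on 8 h/day."""
    kwh = power_watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000
    return kwh, kwh * PRICE_PER_KWH

monitor_kwh, monitor_cost = annual_cost(31)
bulb_kwh, bulb_cost = annual_cost(60)
print(f"31 W monitor:  {monitor_kwh:.1f} kWh/year, ~{monitor_cost:.2f} EUR")
print(f"60 W bulb:     {bulb_kwh:.1f} kWh/year, ~{bulb_cost:.2f} EUR")
# Two 31 W monitors (62 W) draw about the same as one 60 W bulb.
```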
 
Oh right, thanks for explaining :)
 
Oh, I misread the scale, so it seems class G is the old B, which is in line with all monitors of this class... OK, not bad then! That was frightening for a second.
The EU is really cracking down on energy-efficiency ratings with this new scheme, eh?
 
Monitors are getting much brighter these days, mainly due to HDR, so this normally causes them to draw more power.

Maybe I am wrong :D as the Dell website says it's only 31 watts.
 

And that 31 W is now a G on the energy scale. Amazing to think you can run two of these monitors on what one single lightbulb used to use.
 
Would you guys mind giving me your other settings?

Genuinely isn't 23% just very dark?

Is it possible I have a damaged panel? Or is my perception for brightness different potentially?

Are we talking about taking this thing out of the box and just setting brightness to 23%? That's all you've done?
 
When you guys talk about 120 cd, are you calibrating it with a hardware calibrator or something?

Yes. 120 cd/m² is meant to be the ideal brightness for a darkened room (which I am in all the time). 120 is quite dim though.

200 cd/m² is for a normally lit room. That is probably around the 30% brightness level most people are using in here.

There are online visual calibration tests you can do to set yours by eye anyway:

https://www.maketecheasier.com/calibrate-monitor-display/

and Windows 10 has one built in:

https://www.laptopmag.com/uk/articles/calibrate-monitor-windows-10

There is no right answer though. I know a guy who runs his monitor at 75% brightness. That would give me headaches, but he claims he is fine with it.
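As a rough illustration of how a target luminance maps to an OSD brightness percentage, here is a sketch that assumes the slider scales luminance linearly between a minimum and maximum; the 50 cd/m² and 350 cd/m² endpoints are made-up example values, not this monitor's specs, which is exactly why a hardware calibrator (or the visual tests linked above) is the real answer:

```python
# Estimate the OSD brightness percentage needed for a target luminance,
# ASSUMING the slider scales luminance linearly between min and max.
# Real panels are rarely this linear; a hardware calibrator measures it.
MIN_NITS = 50.0   # assumed luminance at 0% brightness (example value)
MAX_NITS = 350.0  # assumed luminance at 100% brightness (example value)

def brightness_percent(target_nits: float) -> float:
    """Linear estimate of the slider position for a target cd/m^2."""
    pct = (target_nits - MIN_NITS) / (MAX_NITS - MIN_NITS) * 100
    return max(0.0, min(100.0, pct))  # clamp to the slider's 0-100 range

print(f"120 cd/m^2 (dark room): ~{brightness_percent(120):.0f}%")
print(f"200 cd/m^2 (lit room):  ~{brightness_percent(200):.0f}%")
```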
 