World first QD-OLED monitor from Dell and Samsung (34 inch Ultrawide 175hz)

Feel like this is going to sell out almost instantly. I don't expect to get next-day delivery; maybe by June, which is the US stock due date.
 
Thanks, this is something I'm not too familiar with, but I think a 4:2:2 image only retains half of the chroma samples that a 4:4:4 image does. So for the best colour you can see on the screen you can only go up to 144Hz? Not sure then why anyone would want the extra 31Hz to get up to 175Hz if you are sacrificing image quality; the only thing I can think of is competitive FPS games, but then you would probably want a different monitor anyway.
Yeah, I think it's not worth it. Keep it at 144Hz, set the max frame rate in NVCP (Nvidia Control Panel) to 140fps and you're golden.
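For anyone wanting to sanity-check the "half the chroma samples" claim, here's a quick sketch of the J:a:b subsampling arithmetic (the resolution is just this monitor's as an example; the scheme notation is standard, the function is mine):

```python
# Chroma sample counts per frame for common subsampling schemes.
# In J:a:b notation, "a" is the number of chroma samples in the first
# row of a J-pixel-wide block, and "b" in the second row.
def chroma_samples(width, height, scheme):
    j, a, b = scheme
    # (a + b) chroma samples per 2-row block of J pixels
    return (width // j) * (a + b) * (height // 2)

W, H = 3440, 1440                          # AW3423DW resolution, as an example
full = chroma_samples(W, H, (4, 4, 4))     # 4:4:4 - one sample per pixel
half = chroma_samples(W, H, (4, 2, 2))     # 4:2:2 - half the horizontal samples

print(full, half, half / full)             # 4:2:2 keeps exactly half
```

So the luma (brightness) resolution is untouched, which is why 4:2:2 mostly hurts fine coloured detail like text on coloured backgrounds.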
 
Let’s get some perspective, it’s still over £1k for a monitor so it’s not exactly cheap :p

I get there are loads of similar style LCD monitors for that sort of money but many of those are just a complete rip off ‘cuz gaming init’.

Only £880 if you are a student ;)
 
Students and employer codes did not work on US site so I doubt they will work for us.
Thanks for the tip. I didn't hold out much hope that the discount codes would work, so I won't be too disappointed if they don't work on release day. There are always cashback sites, and I think Amex have a cashback offer.
 
You can also run HDR using 8-bit + dithering, which to my eyes at least gives no discernible difference.

That's correct. 10-bit is always used for HDR10 content, and if that can't be supported monitor-side due to bandwidth (e.g. AW3423DW @ 175Hz), the GPU fills in the gaps very effectively with dithering. I've tested this on a range of monitors I've reviewed, including carefully observing fine gradients you might expect to show some differences, and the GPU dithering provides an extremely similar experience to the monitor handling the signal itself. RTINGS has also performed testing which suggests GPU dithering works very effectively - https://www.rtings.com/monitor/discussions/o_4-KLlEIj71cQq9/8-bit-banding-question. People should feel free to run this monitor, or another which handles the signal in this way, and compare for themselves. They'll most likely see there's absolutely no reason to avoid using 175Hz under HDR (or SDR, unless you specifically create content with a 10-bit workflow).
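A toy illustration of why dithering works: truncating 10-bit to 8-bit throws away the fractional level, but adding noise before quantizing means the *average* output preserves it. (This is generic temporal dithering; it's not a claim about what the Nvidia driver does internally.)

```python
import random

random.seed(0)

def quantize_10_to_8(v10):
    """Truncate a 10-bit value (0-1023) to 8 bits (0-255)."""
    return v10 // 4

def dithered_quantize(v10):
    """Add random noise before quantizing, so the average output
    preserves the fractional level that truncation throws away."""
    return min(255, int(v10 / 4 + random.random()))

v = 514                        # a 10-bit level that falls between 8-bit steps
plain = quantize_10_to_8(v)    # always 128 - the 0.5 LSB is lost
avg = sum(dithered_quantize(v) for _ in range(20000)) / 20000
print(plain, round(avg, 2))    # dithered average sits near 128.5
```

Over time (or over neighbouring pixels) your eye integrates the noise, which is why fine gradients look essentially the same as native 10-bit.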
 
How come? Students always get 20% off Alienware monitors; it worked on their last release.
If my code does not work I won't be bothering. Will just wait for the LG OLED to inevitably drop in price. Can't see why it would not work though. I think in the US they were using the Premier link which is not for consumers as I understand, hence why it may not have worked.
 
Pretty sure this.

Either way, if it doesn't work I'll still go for it; the Amex £100 cashback should still work.
 
With the AW3821DW I will normally switch between 10-bit for single-player games @ 120Hz and 8-bit for online FPS games @ 144Hz.
It's not the best HDR monitor so the difference is not that noticeable, although I try to convince myself the extra 24Hz has some benefit.

I've not experienced any issues with text clarity or it being garbage @ 8-bit 144Hz.

I would expect the AW3423DW would perform similarly.
 
Text would look garbage in 10-bit with chroma subsampling. 8-bit would be full RGB, so it won't affect text clarity. These are two different things; there are basically three options:
175Hz, 8-bit, full RGB - text great, dithering on the GPU side, may affect some HDR effects
175Hz, 10-bit, chroma subsampling - no dithering, text garbage
144Hz, 10-bit, full RGB - fewer Hz, best possible HDR.
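The 144Hz cutoff falls out of DisplayPort 1.4 bandwidth. A rough back-of-the-envelope check (the blanking figures and the ~25.92 Gbit/s HBR3 payload rate are approximations I'm assuming, not numbers from Dell's spec sheet):

```python
# Approximate uncompressed video bandwidth vs. DP 1.4 capacity.
# HBR3: 32.4 Gbit/s raw, ~25.92 Gbit/s after 8b/10b encoding overhead.
DP14_GBPS = 25.92

def needed_gbps(w, h, hz, bits_per_channel, hblank=80, vblank=60):
    # Reduced-blanking totals (rough CVT-RB-style figures, assumed here)
    pixel_clock = (w + hblank) * (h + vblank) * hz
    return pixel_clock * bits_per_channel * 3 / 1e9  # RGB = 3 channels

for hz, bpc in [(175, 10), (175, 8), (144, 10)]:
    g = needed_gbps(3440, 1440, hz, bpc)
    fits = "fits" if g <= DP14_GBPS else "does NOT fit"
    print(f"{hz}Hz {bpc}-bit RGB: {g:.1f} Gbit/s -> {fits}")
```

With these assumptions, 175Hz 10-bit full RGB lands just over the link budget while 175Hz 8-bit and 144Hz 10-bit both fit, which matches the three options above.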
 
They were working when they first went up for order. They stopped them a little later for some reason. I don't know if they ever stopped them working in other countries. Some Oz buyers managed to stack discounts for over 40% off. They're getting ripped off though, so it seemed fair.
 