World's first QD-OLED monitor from Dell and Samsung (34 inch ultrawide, 175Hz)

Does the AW have speakers? I read that it doesn't. If I wanted to use a Fire Stick on it, how would I get sound? Currently using an Acer Predator UW, which has speakers, but would like to upgrade to a UW OLED.

No, it doesn't. It does have an audio in from the graphics card, but this is only to provide output for headphones. I don't know for sure, but it's possible that you could use the headphone socket to drive a pair of speakers?
 
Ah, see, no one had confirmed that until now. That's good news. I was aware that adaptive sync works, but I wasn't sure that HDR with adaptive sync works too. It's definitely the AW3423DW then!

Can't speak for any other AMD card, but I can confirm that the AW3423DW definitely, absolutely, 100% supports adaptive sync & HDR simultaneously when using an MBA RX 6800.

To throw an extra spanner in the works, however, to my untrained eye there is no discernible difference between 8-bit and 10-bit, so I keep mine set to 8-bit 175Hz.
 
Can't speak for any other AMD card, but I can confirm that the AW3423DW definitely, absolutely, 100% supports adaptive sync & HDR simultaneously when using an MBA RX 6800.

To throw an extra spanner in the works, however, to my untrained eye there is no discernible difference between 8-bit and 10-bit, so I keep mine set to 8-bit 175Hz.

I took a look at this, and from what I can see, 8-bit dithered vs 10-bit is irrelevant. As long as the source material is 10-bit, it is indistinguishable from 8-bit with dithering. The misconception is that using 8-bit with dithering reduces you to SDR. It doesn't; the picture is still HDR. The example I was given is as follows: the 10-bit effect is achieved by cycling alternate frames between two values. If your 10-bit source has a pixel at brightness 2, you tell the monitor to display brightness 3 for one frame and brightness 1 for the next, and the overall effect is brightness 2. In that way you can dramatically reduce the data transfer to the monitor. True, it's a bit of a cheat, but apparently the important thing is that as long as your monitor has the dynamic range, you can't tell the difference between 10-bit and 8-bit with dithering.
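
If it helps to see that idea in code, here's a toy sketch in Python (purely illustrative of the concept, not how any driver actually implements FRC): it picks an 8-bit level for each frame so that the running average lands on the in-between 10-bit level, just like the alternate-frames example above.

Code:
# Toy illustration of temporal dithering (FRC): approximate a 10-bit level
# by alternating the two nearest 8-bit levels over successive frames so that
# their average matches the target. Purely conceptual, not a driver algorithm.
def frc_frames(target_10bit, num_frames=8):
    """8-bit value to show on each of num_frames frames for one pixel."""
    exact = target_10bit / 4.0        # 10-bit has 4 steps per 8-bit step
    frames, shown_total = [], 0
    for n in range(1, num_frames + 1):
        # choose the 8-bit level that keeps the running total closest to target
        level = round(exact * n - shown_total)
        level = max(0, min(255, level))
        frames.append(level)
        shown_total += level
    return frames

# A 10-bit level of 2 sits between 8-bit levels 0 and 1:
frames = frc_frames(2)                        # -> [0, 1, 0, 1, 0, 1, 0, 1]
print(sum(frames) / len(frames) * 4)          # -> 2.0, the original 10-bit level

The point is simply that the eye averages the alternating frames, so the monitor only ever needs to receive 8 bits per channel per frame.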

If that's the case, then yes, there really is no problem with the F monitor. And since you say the non-F works with AMD, the choice is really academic.
 
No, it doesn't. It does have an audio in from the graphics card, but this is only to provide output for headphones. I don't know for sure, but it's possible that you could use the headphone socket to drive a pair of speakers?
I have a pair of B&W MM1s that I use for PC sound, connected via USB. Would that work if I connected them to the monitor? Assuming it has USB ports.
 
If that's the case, then yes, there really is no problem with the F monitor. And since you say the non-F works with AMD, the choice is really academic.

Pretty much.

You might get a slightly better experience with the original non-F when using an Nvidia card, but you gain a fan.
You'll get 10Hz less with the F version, but you also lose the fan and save a few ££.

I have a pair of B&W MM1s that I use for PC sound, connected via USB. Would that work if I connected them to the monitor? Assuming it has USB ports.

I believe you'd need to use the 3.5mm connection from the monitor; the monitor's USB is purely a hub and for controlling the AlienFX lighting, so you'd have no way of assigning the output to the speakers' USB "soundcard".
 
I have a pair of B&W MM1s that I use for PC sound, connected via USB. Would that work if I connected them to the monitor? Assuming it has USB ports.
I use Adam A5Xs and simply connect them to my STX II soundcard. Nothing is connected to the AW3423DW.
 
I use Adam A5Xs and simply connect them to my STX II soundcard. Nothing is connected to the AW3423DW.
Thanks for the reply. I was looking at how to get sound if a Fire Stick was plugged into the monitor. I might wait for a UW OLED that has built-in speakers, as I don't want to replace my MM1s and I don't want two sets of speakers.
 
Thanks for the reply. I was looking at how to get sound if a Fire Stick was plugged into the monitor. I might wait for a UW OLED that has built-in speakers, as I don't want to replace my MM1s and I don't want two sets of speakers.

Sound through the monitor's USB ports will only come from the PC, and only when the PC is connected to the monitor via USB too. It doesn't come from the monitor itself, I believe :-)
 
Thanks for the reply. I was looking at how to get sound if a Fire Stick was plugged into the monitor. I might wait for a UW OLED that has built-in speakers, as I don't want to replace my MM1s and I don't want two sets of speakers.

Those speakers look like they have an "Aux In" on the back; all you need is a 3.5mm-to-3.5mm cable from the monitor's headphone port.
 
Thanks for the reply. I was looking at how to get sound if a Fire Stick was plugged into the monitor. I might wait for a UW OLED that has built-in speakers, as I don't want to replace my MM1s and I don't want two sets of speakers.
At least on the G-Sync version there is a line out on the back of the monitor as well as the headphone jack on the underside. You should be able to run that line out on the back to a set of speakers.

Edit: Looking at the specs, the FreeSync model also has a line out as well as a headphone jack.
 
Obviously a sign, as I cannot get my card to work on the Dell site lol.

OK, it finally worked. If the monitor is any less than perfect I shall send it back and try once more before giving up.

Edit: got an email to say there's an issue with payment again, fml.
 
Can't speak for any other AMD card, but I can confirm that the AW3423DW definitely, absolutely, 100% supports adaptive sync & HDR simultaneously when using an MBA RX 6800.

To throw an extra spanner in the works, however, to my untrained eye there is no discernible difference between 8-bit and 10-bit, so I keep mine set to 8-bit 175Hz.
I'm in the same boat about 8-bit and 10-bit. I can't see a difference, but I can very slightly see the small near-black gamma raise that happens when changing from 175Hz to 144Hz (it is very small, but I can at least see the difference; it starts to become more obvious at 120Hz and is in your face at 60Hz :D)

But this topic is showing me that people must see very differently from one another. A lot of people choose to run at 144Hz, so the change from 10-bit to 8-bit must be noticeable and important enough to them to outweigh the small gamma raise and the lower fps. They don't notice the gamma raise; I don't notice the 10-bit to 8-bit change. So if someone recommends 144Hz 10-bit as better PQ, I have to balance it against the view of people sensitive to black levels: in my camp, 175Hz has better fps and is actually the better PQ.
 
I'm in the same boat about 8-bit and 10-bit. I can't see a difference, but I can very slightly see the small near-black gamma raise that happens when changing from 175Hz to 144Hz (it is very small, but I can at least see the difference; it starts to become more obvious at 120Hz and is in your face at 60Hz :D)

But this topic is showing me that people must see very differently from one another. A lot of people choose to run at 144Hz, so the change from 10-bit to 8-bit must be noticeable and important enough to them to outweigh the small gamma raise and the lower fps. They don't notice the gamma raise; I don't notice the 10-bit to 8-bit change. So if someone recommends 144Hz 10-bit as better PQ, I have to balance it against the view of people sensitive to black levels: in my camp, 175Hz has better fps and is actually the better PQ.

By all accounts the F seems to be able to go to 120Hz in 10-bit, and the non-F can go to 144Hz. The FRC I mentioned above is certainly used by NVIDIA, and it is apparently so good that most normal humans can't see the difference between 8-bit and 10-bit. There are, however, small shifts in other things between refresh rates that people are perhaps confusing with differences between 8-bit and 10-bit. I don't know about AMD, but judging by other comments they are the same. I think the real problem here is that people expect 8-bit to carry significantly less dynamic range, but this simply isn't the case; the FRC takes care of that.
Oh, by the way, there are now several confirmations that the F will run at 120Hz 10-bit, just in case there are still some people worrying about it.
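
Those caps roughly line up with a back-of-envelope bandwidth calculation. Here's a rough sketch in Python; it ignores blanking intervals, DSC and exact link overheads, so treat the numbers as ballpark only, and the ~25.92 Gbit/s figure is just the commonly quoted usable DisplayPort 1.4 (HBR3) payload:

Code:
# Ballpark uncompressed video data rate for 3440x1440 at various modes,
# ignoring blanking intervals and link overhead (real requirements are higher).
def gbit_per_s(width, height, refresh_hz, bits_per_channel):
    bits_per_pixel = bits_per_channel * 3          # R, G and B channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP14_PAYLOAD = 25.92    # approx. usable DP 1.4 HBR3 bandwidth in Gbit/s

for hz, bpc in [(175, 8), (175, 10), (144, 10), (120, 10)]:
    rate = gbit_per_s(3440, 1440, hz, bpc)
    verdict = "fits" if rate < DP14_PAYLOAD else "doesn't fit"
    print(f"{hz}Hz {bpc}-bit: ~{rate:.1f} Gbit/s ({verdict})")

Even before blanking is counted, 175Hz at 10-bit (~26 Gbit/s) already overshoots the link, while 175Hz at 8-bit (~20.8 Gbit/s) fits comfortably; add blanking on top and that is presumably why 10-bit tops out at 144Hz on the non-F and 120Hz on the F.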

 
By all accounts the F seems to be able to go to 120Hz in 10-bit, and the non-F can go to 144Hz. The FRC I mentioned above is certainly used by NVIDIA, and it is apparently so good that most normal humans can't see the difference between 8-bit and 10-bit. There are, however, small shifts in other things between refresh rates that people are perhaps confusing with differences between 8-bit and 10-bit. I don't know about AMD, but judging by other comments they are the same. I think the real problem here is that people expect 8-bit to carry significantly less dynamic range, but this simply isn't the case; the FRC takes care of that.
Oh, by the way, there are now several confirmations that the F will run at 120Hz 10-bit, just in case there are still some people worrying about it.

Well, I think it's also a differentiation in terms of use cases? For HDR the GPU uses FRC/dithering anyway, which most reviewers seemed to suggest wouldn't be noticeable to most people. If you wanted to use 10-bit for work purposes that isn't HDR, then I think that's where the issues come in. I remember there being a debate about how AMD normally lets you enable 10-bit dithering in the driver settings but Nvidia doesn't, so you'd need to select 10-bit on the monitor, unless that's actually been changed since. Of course, not many applications actually support 10-bit colour (unless it's for HDR purposes), so the use cases for this are likely very specific and probably limited to things like image editing, video editing and so on. Those are use cases you might not use this monitor for anyway, since I wouldn't say a curved screen is ideal for that kind of work.
 
Order finally went through. I look forward to the forthcoming battles with Dell over QC, the fan noise lottery, the pixel lottery, the bubble-wrap-marks-on-screen lottery and whatever else.
 