Alienware announces the AW2725DF and AW3225QF (world's first 4K 240Hz and world's first 1440p 360Hz QD-OLED monitors, launching January 2024)

Should I use HDMI or DP for the AW3225QF?

Isn't one better than the other?

I've gone with HDMI.

Three reasons:

1. The instructions indicate HDMI.
2. The HDMI cable is the only one with the high-bandwidth sticker.
3. The HDMI port is the one that supports eARC.
 
So to use HDR, we have to enable it every time we want to game and then disable it when we're back in Windows?

Also, are people auto-hiding the Windows taskbar with these monitors?
 
OK, so I set up my AW3225QF today.

The only thing that didn't fit, and that I've left off, was the back plate. I just couldn't get it to snap closed properly with all the wires at the back.

Played Ghost of Tsushima for about 10 minutes as a quick test. Looks good.

 
 
Is the drop in FPS worth it going from 1440p to 4K?

I'm seriously thinking of upgrading my AW3418DW and am torn between the 27-inch and the 32-inch.

1440p at 165Hz felt so smooth when I recently experienced it that I worry I'll not see that kind of FPS at 4K.
 
Is the drop in FPS worth it going from 1440p to 4K?

I'm seriously thinking of upgrading my AW3418DW and am torn between the 27-inch and the 32-inch.

1440p at 165Hz felt so smooth when I recently experienced it that I worry I'll not see that kind of FPS at 4K.
I have a 4080 and went from 3440x1440, playing games maxed out at 120-170fps, to 4K, where I'm getting about 40% lower fps on average. Not worth it, especially when you're used to playing at 120fps and higher, unless you plan to upgrade to a 5080/5090 when they come out. I liked playing at higher fps so much that I've just ordered a 360Hz 27-inch OLED. 120fps is my playable minimum right now.
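For a rough sanity check, the pixel-count ratio alone predicts a drop of about this size: 4K pushes roughly 67% more pixels than 3440x1440. A minimal sketch (the inverse-pixel-count scaling is my simplifying assumption, not a benchmark — real results vary per game and are often CPU-limited at high fps):

```python
# Rough fps estimate when moving from 3440x1440 ultrawide to 4K,
# assuming fps scales inversely with pixel count (a simplification).
uw_pixels = 3440 * 1440    # 4,953,600 pixels
uhd_pixels = 3840 * 2160   # 8,294,400 pixels
ratio = uw_pixels / uhd_pixels  # ~0.60, i.e. a ~40% drop

for fps_uw in (120, 170):
    est_4k = fps_uw * ratio
    print(f"{fps_uw} fps at 3440x1440 -> ~{est_4k:.0f} fps at 4K")
# 120 fps at 3440x1440 -> ~72 fps at 4K
# 170 fps at 3440x1440 -> ~102 fps at 4K
```

That ballpark (72-102fps) lines up with the "~40% lower on average" experience reported above.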
 
I have a 4080 and went from 3440x1440, playing games maxed out at 120-170fps, to 4K, where I'm getting about 40% lower fps on average. Not worth it, especially when you're used to playing at 120fps and higher, unless you plan to upgrade to a 5080/5090 when they come out. I liked playing at higher fps so much that I've just ordered a 360Hz 27-inch OLED. 120fps is my playable minimum right now.

You see, I was quite content with 120Hz until I experienced 165Hz, and the difference is quite big!

Thanks for the feedback.
 
Is the drop in FPS worth it going from 1440p to 4K?

I'm seriously thinking of upgrading my AW3418DW and am torn between the 27-inch and the 32-inch.

1440p at 165Hz felt so smooth when I recently experienced it that I worry I'll not see that kind of FPS at 4K.

Not sure if you have read back a couple of pages, but a few of us have posted our thoughts on that exact move.

In terms of fps and smoothness, I was worried about this the most too, but it's actually not that bad (I also gamed a lot with DLDSR + DLSS Performance at 1440p, so performance was lower than what I get with 4K native DLSS Performance now). And, as mentioned, because of TAA-based methods, even if you're getting slightly lower fps due to the higher resolution, motion looks cleaner and sharper. Two games that run noticeably worse, though, are Alan Wake 2 and Cyberpunk 2077: maxed out with DLSS and frame gen they were playable at 1440p with DLSS Performance, but at 4K with DLSS Performance the fps isn't high enough for me.

As mentioned in my OP, the AW34DW is definitely smoother at lower fps of 40-60 though, less micro-stuttery, probably thanks to the G-SYNC module.

This is with a 3080, btw.

Both the AW32 and AW34 are great monitors; the main difference is really just 16:9 4K vs 21:9 1440p. In a lot of games 21:9 is simply more immersive, but then in a lot of newer games 4K simply looks better. I'm somewhat adapting to the size and tallness of the 32" now, but I'm still missing the extra width and FOV you get with the 34".
 
On the topic of VRR flickering, pcmonitors.info has done a decent post:


I’ve now reviewed or at least used at least a dozen QD-OLED and WOLED models combined, and they’ve all suffered from a degree of VRR flickering which has the potential to bother some people. And none have been as ‘offensive’ in that respect as your average VA LCD in this domain, at least for normal ‘in-game’ fluctuations that might occur. If there is such a thing as a normal in-game fluctuation (of course there isn’t – it very much depends on the game and your system). My overriding general feeling is that, for whatever reason, I tend to find WOLED flickering more noticeable when gaming on my system. The best performer in this respect was actually the first model I tested and one I owned for a few years – the Dell Alienware AW3423DW. This one benefited from a G-SYNC module, which performs some degree of compensation for the gamma fluctuations occurring in a VRR environment. This reduces but does not eliminate VRR flickering.

Also, one game where it is very obvious is Starfield: just on the menu, flickering on the AW32 is very obvious, while the AW34 didn't have this even at 175Hz (with the most recent firmware). So far, outside of Cyberpunk 2077, I haven't faced any issues though.
 
I'll post a link from Reddit which I believe mentions Cyberpunk and "flicker". Interestingly, there's a link further down that specifically mentions "issues" regarding HDR Peak 1000, which can hopefully be fixed in a firmware update (apparently they did so with the DWF). Not too sure if both things are related, I'm a tech noob :cry: Just thought I'd post to see if it was of interest or could help anyone with flickering issues in certain games. I haven't had a chance to try yet.

 