World's first QD-OLED monitor from Dell and Samsung (34 inch Ultrawide 175Hz)

Switch to 175 and compare 175 to 144 and tell me you don't see the difference :p


But yeah, it's not a huge issue, but in Windows it's definitely noticeable that windows etc. aren't quite as snappy.
Either you have something set up incorrectly there or something odd is going on, because when I set the DW to 175 and when it's set to 144, both look identical to me :p
 
I've had mine on 175Hz since I got it early last year, mainly because there aren't that many applications that support 10bit colour unless you're using HDR, from what I saw. Even when using HDR there didn't seem to be that much of a difference between native 10bit and 8bit + frc.
 
It is a bit pointless having it in 175Hz mode tbh, given very few games, even with a 4090, can run at a constant 100+ fps, so I'm not really getting the benefit and I'm losing out on true native 10 bit for nothing really. But with tftcentral, pcmonitors.info and Vincent all stating there is no obvious benefit to 10 bit vs 8 bit + frc, I just keep it there for now for when a game can push 140+ fps; it saves having to switch refresh rate and fps caps.
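
For anyone wondering why it's 175Hz 8-bit or 144Hz 10-bit in the first place, it most likely comes down to DisplayPort 1.4 bandwidth. A rough back-of-envelope sketch below; it assumes roughly 7% blanking overhead and DP 1.4's ~25.92 Gbps usable payload, so treat the numbers as approximate:

```python
# Rough DisplayPort bandwidth check for 3440x1440 (ballpark, not exact timings).
DP14_PAYLOAD_GBPS = 25.92        # DP 1.4 HBR3 usable payload after 8b/10b encoding
BLANKING_OVERHEAD = 1.07         # assumed ~7% overhead for reduced-blanking timings

def required_gbps(width, height, hz, bits_per_channel):
    pixels_per_second = width * height * hz * BLANKING_OVERHEAD
    return pixels_per_second * bits_per_channel * 3 / 1e9   # 3 channels (RGB)

for hz, bpc in [(175, 10), (175, 8), (144, 10)]:
    need = required_gbps(3440, 1440, hz, bpc)
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds"
    print(f"{hz}Hz {bpc}-bit: ~{need:.1f} Gbps -> {verdict} DP 1.4")
```

175Hz at 10-bit comes out around 28 Gbps, which is over the limit, while 175Hz 8-bit and 144Hz 10-bit both fit, which lines up with the choices the monitor actually offers.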
 

Can't you just specify refresh rate in game? I do in counter strike.

Also as I recall we said flicker in dark scenes was less on 144.
 

Not all games allow you to change refresh rate in game, and it also means you have to change the fps cap in NVCP (or whatever you use) before launching the game too, unless you use an in-game cap (which is not recommended and also quite rare).
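
If the refresh-rate juggling is the annoying part, it can at least be scripted instead of going through Windows settings each time. A minimal sketch on Windows using pywin32 (assumes pywin32 is installed and the DW is the primary display):

```python
# Minimal sketch: switch the primary display's refresh rate on Windows.
# Assumes pywin32 is installed (pip install pywin32) and the DW is the primary display.
import sys
import win32api
import win32con

def set_refresh_rate(hz: int) -> None:
    # Start from the current mode so resolution and colour depth are left alone.
    devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
    devmode.DisplayFrequency = hz
    devmode.Fields = win32con.DM_DISPLAYFREQUENCY   # only change the refresh rate
    result = win32api.ChangeDisplaySettings(devmode, 0)
    if result != win32con.DISP_CHANGE_SUCCESSFUL:
        raise RuntimeError(f"Could not switch to {hz}Hz (code {result})")

if __name__ == "__main__":
    # e.g. run with "144" before colour work and "175" before gaming
    set_refresh_rate(int(sys.argv[1]) if len(sys.argv) > 1 else 144)
```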

VRR flicker is definitely less obvious at 144Hz in those dark grey scenes and low fps scenarios though, but I find that's quite rare.
 
Yeah it's the VRR flicker. I also notice it in Photoshop, only at 175Hz, as my UI has a dark grey canvas, and zooming in/out of an image sometimes results in random flickering/stuttering. I just tried it now with the calibrated profile and settings and could not get it to flicker in Photoshop, so maybe calibrating actually solved it, but further testing is needed.

HDR does use 10-bit though, as the expanded colour depth is where all the goodies happen. Windows does 8bit + frc/dithering, and Nvidia's driver does 8bit + dithering too; the driver sets out to reduce the banding that previously affected dithered outputs, and these days it's a non-issue to the point that nobody would be able to tell the difference between 8bit GPU-dithered and 10bit native on an Nvidia card. I said it before, but I just like the peace of mind of being all native anyway, plus in a colour-critical scenario, even if 8bit has no obvious banding across the full RGB range, having the option of native 10bit at 144Hz is too hard not to use :p
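
If it helps to picture what 8bit + FRC is actually doing: each pixel flips between the two nearest 8-bit steps over successive frames so the time-average lands on the intended 10-bit level. A toy sketch of the idea (purely conceptual, not how the actual driver/panel dithering is implemented):

```python
# Toy illustration of 8bit + FRC (temporal dithering): approximate a 10-bit level
# by alternating between the two nearest 8-bit values over many frames.
# Purely conceptual -- real driver/panel dithering is more sophisticated than this.
import random

def frc_average(level_10bit: int, frames: int = 10000) -> float:
    target = level_10bit / 4.0                    # 10-bit level on the 8-bit scale
    low, frac = int(target), target - int(target)
    # Each frame shows either `low` or `low + 1`; over time the eye averages them.
    shown = [low + (1 if random.random() < frac else 0) for _ in range(frames)]
    return sum(shown) / frames

# 10-bit level 513 sits between 8-bit 128 and 129 (513 / 4 = 128.25).
print(frc_average(513))   # ~128.25, even though every individual frame is 8-bit
```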

From a while ago, BenQ of all brands actually had a really good article on what both are and what they mean in the real world; the end line sums it up really:

 


In conclusion just use 10bit 144hz :p
 
Ah yes, you would definitely need the bigger one; I think the MX150 is only rated up to 27"? The MX450 was up to 35" when I got it, but it looks like it's now up to 49"!
Managed to get the MX450 off Invision's eBay store for £33. I was quite happy with the MX150, so hopefully this will be more of the same!
 
Moss, is that you?

:D

 
Anyone running the DWF with a 1080 Ti? Won't be upgrading CPU/GPU till late next year, so wondering if this will tax the GPU too much. Already playing at 1440p on most stuff and happy with current performance, but the extra width might be too much.
 