New 144hz 4k monitor spotted, INSANE PRICES!!!

Not really useful, as he didn't show himself playing any games except right at the end. And again the comment about the fan being noisy.

Something else that's been bothering me: the monitor is bandwidth-limited by DP 1.4, so why didn't they do what Dell did on their 8K monitor and just use two DP 1.4 inputs? That would have solved all the problems. Rumour has it that the next range of GPUs will have HDMI 2.1, which does have sufficient bandwidth, so it might be worth waiting for the refresh with HDMI 2.1.
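To put rough numbers on the bandwidth claim (active pixels only, ignoring blanking overhead, so a lower bound): DP 1.4's HBR3 link is 32.4 Gbps raw, about 25.92 Gbps after 8b/10b encoding, while HDMI 2.1's 48 Gbps FRL link nets roughly 42.7 Gbps after 16b/18b encoding.

```python
def uncompressed_gbps(width, height, refresh_hz, bits_per_channel):
    """Uncompressed RGB video bitrate in Gbps (active pixels only).

    Real links also carry blanking intervals, so the true requirement
    is a few percent higher than this lower bound."""
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

DP14_EFFECTIVE = 32.4 * 8 / 10    # HBR3 after 8b/10b encoding -> 25.92 Gbps
HDMI21_EFFECTIVE = 48 * 16 / 18   # FRL after 16b/18b encoding -> ~42.67 Gbps

for bpc in (8, 10):
    need = uncompressed_gbps(3840, 2160, 144, bpc)
    print(f"4K 144Hz {bpc}-bit RGB needs {need:.1f} Gbps | "
          f"fits DP 1.4: {need <= DP14_EFFECTIVE} | "
          f"fits HDMI 2.1: {need <= HDMI21_EFFECTIVE}")
```

Even 8-bit 4K 144Hz (~28.7 Gbps) is over DP 1.4's effective rate, which is why these monitors drop to 4:2:2 chroma subsampling at 144Hz, and why either a second DP 1.4 input (Dell-style) or HDMI 2.1 would fix it.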
 

Those monitors are meant to be used with the new Nvidia cards with HDMI 2.1, for sure.
 
Pity they have an HDMI 2.0 connection.
That's what I don't understand. It feels like they were waiting to release both graphics cards and monitors at the same time, and now... they are not. Plus the ****** up with HDMI 2.0 instead of 2.1, the fans, the halos...

I wonder if the ultrawide monitors will be better than these ****** up 4K ones. I hope that at least they won't need a fan.
 
Hoping they're smart and hold off until they can implement a 2.1 connection, but yeah, they'll probably have a fan too (or maybe two, as the panel is so big!!) :D
A solid 1440p or 1600p ultrawide would not run into the same bandwidth limitations as these 4K screens do. I would be much more interested in seeing a 1600p ultrawide with all these bells and whistles.
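Running the same back-of-envelope arithmetic (active pixels only, no blanking) mostly supports this, with one caveat: a 3840×1600 ultrawide at 144Hz and 10-bit colour actually lands just over DP 1.4's ~25.92 Gbps effective rate, so only the 3440×1440 size clears it with full 10-bit HDR headroom.

```python
def uncompressed_gbps(width, height, refresh_hz, bits_per_channel):
    # Active pixels only; blanking adds a few percent on top of this.
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

DP14_EFFECTIVE = 25.92  # DP 1.4 HBR3 after 8b/10b encoding, in Gbps

print(uncompressed_gbps(3440, 1440, 144, 10))  # ~21.4 Gbps: fits DP 1.4
print(uncompressed_gbps(3840, 1600, 144, 10))  # ~26.5 Gbps: just over
print(uncompressed_gbps(3840, 1600, 144, 8))   # ~21.2 Gbps: fits at 8-bit
```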
 
A thousand nits surely could be too bright? My SDR monitor has a peak brightness of 350 nits and I find that quite bright at times just browsing the internet, so 1000 nits looks off the charts for me personally. Guru3D did a review of the Samsung C32HG70 (HDR, 600 nits) and noted: "Lots of retina searing brightness kicks in and, in his experience, that's not always better on the eyes. It's quite a lot and he found himself lowering that Cd/m2 value towards 350 nits really fast"
 
A thousand nits, if not implemented correctly, could actually make the picture lose detail. Just look at some of the reviews of the new Samsung TVs that can do over 1200 nits: they're actually so bright that the image is being affected.

I have used an LG 4K HDR TV which had a max of 550 nits, and I thought it was more than enough in a game like Far Cry 5. Everybody is going to be different, though.
 
IMHO this monitor is a bit of a fail, mostly due to price, but also because the main selling point is HDR: the IPS panel is 1000:1, so regardless of the FALD backlight it is not very good for HDR. It would be pretty good for SDR with the FALD, though. Also, it is 27". The 35" with a VA panel would have 3000:1 native contrast, which should be a lot better for HDR, and obviously a better size, though it may have issues with motion blur etc. Overall both of them are stupidly priced.
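A rough illustration of why native contrast still matters even with FALD: local dimming can switch zones off in dark scenes, but when a zone is lit for a bright highlight, the dark detail sharing that zone is floored by the panel's native contrast. This sketch just divides peak brightness by contrast ratio, ignoring zone count and halo effects.

```python
def native_black_level_nits(peak_nits, contrast_ratio):
    # Black level within a backlight zone driven to full peak brightness.
    # FALD can dim other zones, but inside a lit zone this floor applies.
    return peak_nits / contrast_ratio

print(native_black_level_nits(1000, 1000))  # 1000:1 IPS: 1.0 nit blacks
print(native_black_level_nits(1000, 3000))  # 3000:1 VA: ~0.33 nit blacks
```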
 
I will add to my previous comment that it is good that there are FALD monitors, 4K 144Hz, HDR and all of this, but as above I think the price is shocking, and I also think the 35" version with the VA panel would be a lot better as an HDR monitor than the 27" IPS. Hopefully the panel in the 35" improves on some of the problems with previous AMVA panels.
 