New Dell 30" 4k Oled Announced CES 2016

I have the PG348Q

I did have the 144Hz Swift and now have the 100Hz Swift, so I feel I'm in a good position to give my experience of both and the differences. It's not a like-for-like comparison, but it's still a decent observation of them both.

Most will spout what they read on the net and draw their own conclusions from that, and yet others will just pipe up on a thread with nothing to contribute. I'm not sure what "lols" means; I'm older than 14.

I doubt you've ever actually played any games beyond 100 fps. Or rather, it's obvious you haven't.

You can't even figure out which games can be run well beyond 144 fps.

You might want to boot up CS:GO, run it at 100 Hz and then at 144 Hz, and then also compare with a 144 Hz TN panel.
You might also want to read up on Blur Busters, check out TestUFO and read around on various hardware sites.

The problem with people who think "60 Hz is buttery smooth" or "100 Hz is buttery smooth" is that they don't know what to look for.

Simply move the mouse around and look at the trails (i.e. move the mouse pointer in a circle and stare at the middle of the circle), or follow it directly with your eyes, and you will notice a difference: there's less blur and there are more pointers in the trail. Now imagine a huge image moving around; a higher refresh rate with fast pixels means a sharper image during motion.

TestUFO has a great image for this: http://www.testufo.com/#test=photo&photo=quebec.jpg&pps=1440&pursuit=0&height=-1 At 60 Hz this is a big blur; at 100 Hz there's less blur, but it's still hard to follow any details. At 120 Hz there's an improvement, and 144 Hz is yet another improvement; the image gets crisper (depending on whether the pixels can keep up). With ULMB or Blur Reduction (BenQ's version) the image is crisp and you can follow the eyes of the dude.
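For anyone who wants the numbers behind that: on a normal sample-and-hold panel, the smear you see while eye-tracking is roughly the distance the image moves during one refresh. A minimal back-of-the-envelope sketch, assuming full persistence and instant pixel response, at the 1440 px/s pursuit speed used in that TestUFO link:

```python
# Rough eye-tracking motion blur on a sample-and-hold display.
# Assumes full persistence (pixel lit for the whole frame) and instant pixel response.
PURSUIT_SPEED_PPS = 1440  # pixels per second, as in the TestUFO link above

for refresh_hz in (60, 100, 120, 144):
    frame_time_ms = 1000 / refresh_hz
    # While the eye tracks the moving photo, each frame is smeared across
    # the distance the image travels during one refresh interval.
    blur_px = PURSUIT_SPEED_PPS / refresh_hz
    print(f"{refresh_hz:>3} Hz: {frame_time_ms:4.1f} ms/frame -> ~{blur_px:4.1f} px of smear")
```

That works out to roughly 24 px of smear at 60 Hz, 14 px at 100 Hz, 12 px at 120 Hz and 10 px at 144 Hz, which is why the photo sharpens up step by step. ULMB/Blur Reduction strobes the backlight so each frame is only lit for a millisecond or two, cutting the persistence (and the smear) far below what any refresh-rate bump alone can manage.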

Then there's also this: http://www.testufo.com/#test=photo&photo=alien-invasion.png&pps=960&pursuit=0&height=-1 Can you follow the pupils?

And about response times: you claim that you cannot tell the difference between 1 ms and 4 ms.
Well, you can: less blur.
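Rough numbers on that too, since pixel response adds its own smear on top of the persistence blur. A quick illustrative calculation, assuming the quoted figure covers the full transition (real-world response behaviour is messier than a single number):

```python
# Extra smear from pixel response time alone, at the same 1440 px/s pursuit speed.
PURSUIT_SPEED_PPS = 1440  # pixels per second

for response_ms in (1, 4):
    extra_blur_px = PURSUIT_SPEED_PPS * response_ms / 1000
    print(f"{response_ms} ms response -> ~{extra_blur_px:.1f} px of extra smear")
```

About 1.4 px extra at 1 ms versus 5.8 px at 4 ms: small, but visible on fast motion.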
 
Certainly hoping to if we can :)

If you get the opportunity to review the panel, can you check near blacks and lower grey scales? Although I'll never be able to afford it, the LG OLED TVs (including the 2016 models) still suffer problems with both, so it will be interesting to see if this one does too.
 
I go with science on this one.

Also, a lot of people don't realise the effect when you first become acclimated to a higher refresh rate over days/weeks and then drop back. All over the net the discussion is dominated by the "I just use my eyes" people who glance at a 144 Hz panel and write it off instantly.
 
@Baddass Obviously it's early days with plenty of unknowns at this stage, but if you're given the opportunity to review the monitor, would there be scope to come up with some sort of test to stress the monitor's "anti-burn-in" technology and see how effective it actually is (although I know that's more of a long-term test)?
 
I've had two Samsung phones and have since switched to Sony, and I would rather not use any tech where pixels wear down at individual rates; it will burn in to some extent whatever you do, and I'd rather not have to worry that I'm displaying the same content in the same screen area for too long.
Curiously enough, with the Samsung screens it was not actually visible on a white screen, but immediately visible on a grey screen: nice imprints of the Android status bar, faint but obvious.
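If a retention test ever does get put together, a crude version of it can even be scripted: hold a static high-contrast pattern on screen for a long stretch, then switch to a uniform mid-grey field and look for imprints, since (as noted above) grey is where retention shows up first. A minimal, purely illustrative sketch using Python's tkinter; the pattern, durations and grey level are arbitrary choices, not anything Dell specifies:

```python
# Crude image-retention stress sketch (illustrative only): show a static
# high-contrast pattern for a while, then a uniform mid-grey field and
# inspect it by eye for faint imprints of the pattern.
import tkinter as tk

PATTERN_SECONDS = 2 * 60 * 60   # how long to hold the static pattern
CHECK_SECONDS = 60              # how long to show the grey check field

root = tk.Tk()
root.attributes("-fullscreen", True)
canvas = tk.Canvas(root, highlightthickness=0, bg="black")
canvas.pack(fill="both", expand=True)

def draw_pattern():
    # Alternating black/white vertical bars: a worst-case static pattern.
    width = root.winfo_screenwidth()
    height = root.winfo_screenheight()
    bar = 120
    for x in range(0, width, 2 * bar):
        canvas.create_rectangle(x, 0, x + bar, height, fill="white", outline="")
    root.after(PATTERN_SECONDS * 1000, show_grey)

def show_grey():
    # Mid-grey field: retention tends to show here more readily than on white.
    canvas.delete("all")
    canvas.configure(bg="#808080")
    root.after(CHECK_SECONDS * 1000, root.destroy)

root.after(100, draw_pattern)
root.mainloop()
```

A proper long-term test would cycle this for days and photograph the grey field under controlled lighting, but the principle is the same.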
 
Last I heard, someone at Dell told an individual who called them that there are no plans to launch this monitor. :(
 

I wouldn't be surprised if that is indeed the case. This monitor doesn't present much profit opportunity for Dell and requires significant R&D.

I expect the first desktop OLED monitors to come from Samsung or LG, the ones who are investing in building OLED TVs and phone displays. The panel for this Dell was probably made by one of them, but I believe that once the tech is ready, they will release their own products.
 
All the other monitor manufacturers begged them not to release the monitor as it would kill the interest in the LCD monitor market and they aren't done milking us with LCD yet.... :p
 

Especially if you consider the prices of some LCD monitors nowadays.
I have no doubt that it would be better to buy a 50" OLED TV than a high-end monitor at the start of next year.
 

Yup, "high end" monitors are an absolute joke; they are a rip-off.

I was wanting a nice FreeSync 21:9 34" monitor, but I can't bring myself to spend that much on what is now essentially inferior old tech, so sometime next year I am going to get an OLED 4K HDR TV instead.
 
I wouldn't be surprised if that is indeed the case. This monitor doesn't present much profit opportunity for Dell and requires significant R&D.

I expect the first desktop OLED monitors to come from Samsung or LG, the ones who are investing in building OLED TVs and phone displays. The panel for this Dell was probably made by one of them, but I believe that once the tech is ready, they will release their own products.

I'm pretty sure this Dell would have used this Sony panel:

https://pro.sony.com/bbsc/ssr/cat-hdr/cat-hdr1/product-BVMX300/

That's a $20,000 pro monitor. The cost, of course, would be driven down by the significant volume increase with the Dell.

I think Dell could pull a profit at $5,000. It would absolutely destroy every other monitor out there, and people like me would pay for that. The real problem was being Thunderbolt-only for 4K 120 Hz. They could have shelved the monitor until DP 1.3/1.4 TCONs become available sometime this millennium.
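The Thunderbolt/DisplayPort constraint is easy to sanity-check with some rough arithmetic. A minimal sketch, assuming 8-bit RGB and ignoring blanking and protocol overhead (which add a few percent), against the usual 4-lane DisplayPort payload rates:

```python
# Rough bandwidth check for 3840x2160 @ 120 Hz, which is why this panel needed
# Thunderbolt 3 (or DP 1.3+) rather than a single DisplayPort 1.2 link.
WIDTH, HEIGHT, REFRESH_HZ = 3840, 2160, 120
BITS_PER_PIXEL = 24  # 8-bit RGB; 10-bit colour needs 30 and pushes this higher still

active_gbps = WIDTH * HEIGHT * REFRESH_HZ * BITS_PER_PIXEL / 1e9
print(f"Active pixel data: ~{active_gbps:.1f} Gbit/s (blanking adds a few percent more)")

# Approximate payload capacity of the candidate links (4-lane DP after 8b/10b encoding;
# the Thunderbolt 3 figure is the total link rate, which carries embedded DP streams).
links = {
    "DP 1.2 (HBR2)": 17.28,
    "DP 1.3/1.4 (HBR3)": 25.92,
    "Thunderbolt 3 (total link)": 40.0,
}
for name, capacity_gbps in links.items():
    verdict = "enough" if capacity_gbps > active_gbps else "not enough"
    print(f"{name:26s}: {capacity_gbps:5.2f} Gbit/s -> {verdict}")
```

4K at 120 Hz needs roughly 24 Gbit/s of pixel data, which is well over what a single DP 1.2 link can carry but fits (with reduced blanking) within DP 1.3/1.4 or a Thunderbolt 3 link, hence the awkward connector choice until HBR3-capable TCONs ship.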
 
My new PC build has Thunderbolt, so it's becoming more common now :)

There is no way to get the GPU output of your high-end video card out of the Thunderbolt 3 port. That's the whole problem, unless you plan to run 4K @ 120 Hz off an integrated Intel CPU/GPU. :confused:

There are only two known ways I can find to actually output a dedicated GPU signal through Thunderbolt 3: a laptop that is specially designed to do so, or an MSI Vortex small-form-factor desktop that has a laptop motherboard/video cards.
 
Is there an adapter? TB carries DP, doesn't it?

So could you run a dual-link into some I/O box with TB out? Is it possible to multiplex like that? This is assuming the ports on the new GPUs are not in fact ready for 1.4, as some have speculated.
 
There is no way to get the GPU output of your high-end video card out of the Thunderbolt 3 port. That's the whole problem, unless you plan to run 4K @ 120 Hz off an integrated Intel CPU/GPU. :confused:

There are only two known ways I can find to actually output a dedicated GPU signal through Thunderbolt 3: a laptop that is specially designed to do so, or an MSI Vortex small-form-factor desktop that has a laptop motherboard/video cards.

My instructions show you plugging the outputs of your graphics card into the Thunderbolt 3 card's input, which then sends it out on the Thunderbolt output?

https://www.asus.com/uk/Motherboard-Accessory/ThunderboltEX-3/

Or are you saying this doesn't work? I thought that was the whole reason they existed with the ASUS motherboards?
 