ASRock Unveils New Phantom Gaming OLED Series Monitors

Forgive my stupidity; for a 4k 240hz monitor, what difference is there?
Not sure what you're asking, unless you mean the discussion about which version of DisplayPort you need and how that impacts a 4k240Hz screen.

The different versions of DP have different bandwidths (bitrate to be exact). The bandwidth dictates what maximum resolution and framerate combination is supported.
I.e. higher resolution, higher refresh rate and higher colour depth (i.e. HDR) all require more bandwidth.
Higher resolution/refresh combinations than the link can natively carry can still be achieved, but only if supported at both ends, and only by using something called Display Stream Compression (DSC), which compresses the data in a way that is visually lossless (i.e. you can't tell by looking whether DSC is being used or not).
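
To put rough numbers on that, here's a minimal Python sketch (my own illustration, not from any spec) that estimates the uncompressed data rate for a given resolution, refresh rate and colour depth. It ignores blanking intervals, which add a bit more on top in practice:

```python
# Rough uncompressed video bandwidth estimate, ignoring blanking overhead.
def required_gbps(width: int, height: int, refresh_hz: int, bits_per_channel: int = 8) -> float:
    bits_per_pixel = bits_per_channel * 3            # RGB, no chroma subsampling
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4k at a few refresh rates, in SDR (8-bit) and HDR (10-bit)
for hz in (120, 144, 240):
    for bpc in (8, 10):
        print(f"3840x2160 @ {hz}Hz, {bpc}-bit: ~{required_gbps(3840, 2160, hz, bpc):.1f} Gbps")
```

You can see 4k120 at 8-bit lands around 24 Gbps (just under DP1.4's 25.92 Gbps of payload), while 4k240 needs roughly 48-60 Gbps depending on colour depth.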

Even within DP2.0/2.1 there are different bitrate classes, each called UHBR (ultra high bit rate); older versions of DP only had a single bitrate class. There isn't much performance difference between 2.0 and 2.1: the main changes are cable certifications and a clearer definition of the UHBR specifications. So for DP2.1 you need to look at the UHBR class to see what performance is available. It's similar to USB3, where unfortunately there is no single version but several different speed grades; with DP2.1 the performance level is indicated by the UHBR class. Due to the cable specs, the higher-bitrate cables only come in shorter lengths, a maximum of 1.2m at the moment.

DP1.4 (HBR3) - 25.92 Gbps - max 4k@120Hz
DP2.1 (UHBR10) - 38.68 Gbps - max 4k@174Hz
DP2.1 (UHBR13.5) - 52.22 Gbps - max 4k@229Hz
DP2.1 (UHBR20) - 77.37 Gbps - max 4k@323Hz
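
As a rough cross-check of those figures (a sketch under simplified assumptions: 10-bit HDR colour and no blanking overhead, so the exact max refresh rates above will differ slightly), here's how 4k240 compares against each link's payload rate:

```python
# Approximate payload rates (Gbps) for each DisplayPort link class, as listed above.
LINK_RATES_GBPS = {
    "DP1.4 (HBR3)": 25.92,
    "DP2.1 (UHBR10)": 38.68,
    "DP2.1 (UHBR13.5)": 52.22,
    "DP2.1 (UHBR20)": 77.37,
}

def uncompressed_gbps(width, height, refresh_hz, bits_per_channel):
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

need = uncompressed_gbps(3840, 2160, 240, 10)        # 4k240 HDR, ~59.7 Gbps
for name, capacity in LINK_RATES_GBPS.items():
    verdict = "fits natively" if need <= capacity else "needs DSC"
    print(f"{name}: {verdict} ({need:.1f} vs {capacity:.2f} Gbps)")
```

Only UHBR20 has the headroom to carry 4k240 HDR without compression.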

Currently, there is only a single UHBR20 monitor, the Gigabyte Aorus FO32U2P 4k240Hz OLED, and only AMD's professional workstation W7900 series GPUs have a UHBR20 port.
AMD's RX 7000 series GPUs support up to UHBR13.5. Worse still, Nvidia's RTX 4000 series is only DP1.4, so it's not possible to run above 4k120Hz over DisplayPort unless the monitor and cable support DSC. Fortunately for RTX 4000 series owners, HDMI 2.1 is also available, which can carry up to 4k@188Hz, so they can at least use 4k144Hz monitors without DSC.

To answer your question, for full native support of 4k240Hz (without compression), the monitor, GPU and cable all need to support DisplayPort 2.0/2.1 at UHBR20. Visually, there is no difference when using DSC to achieve 4k240Hz.

Hence why folks are hoping for more monitors that support UHBR20, for native 4k240Hz. They should be hoping for UHBR20 GPUs as well, otherwise they still won't be able to gain the benefit of a UHBR20 monitor.
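
As a toy illustration of that point (the numbers and function here are made up for the example, not from any driver API), the link can only run at the highest rate that the monitor, GPU and cable all support, so the weakest component decides whether native 4k240 is possible:

```python
# The negotiated link rate is the minimum of what each part of the chain supports.
# UHBR values are the nominal per-lane rates (10, 13.5 or 20 Gbps) used as labels.
def negotiated_uhbr(monitor: float, gpu: float, cable: float) -> float:
    return min(monitor, gpu, cable)

# Example: UHBR20 monitor and cable, but a GPU that tops out at UHBR13.5
link = negotiated_uhbr(monitor=20, gpu=13.5, cable=20)
print(f"Link runs at UHBR{link:g}:",
      "native 4k240 possible" if link >= 20 else "DSC needed for 4k240")
```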
 
To answer your question, for full native support of 4k240Hz (without compression), the monitor, GPU and cable all need to support DisplayPort 2.0/2.1 at UHBR20. Visually, there is no difference when using DSC to achieve 4k240Hz.
Then I shall continue to act stupid; how does it support 4k 240hz with HDMI 2.1 and Display port 1.4, which max out at 4k 120hz?!?!
 
I think for the 27" screen at 1440P DP1.4 does 240Hz. Is that right? If so, that wouldn't put me off.
Correct, DP1.4 natively supports up to 251Hz at 1440p, with no compression required.
Then I shall continue to act stupid; how does it support 4k 240hz with HDMI 2.1 and Display port 1.4, which max out at 4k 120hz?!?!
You are correct, there is no native support for 4k240 on those display standards.

For HDMI 2.1 and DP1.4, DSC (Display Stream Compression) is required to achieve 240Hz at 4k resolution. Using DSC could in theory introduce minor visual artifacts, since some data is discarded during compression. However, the DSC standard is designed to be visually lossless, so in practice there shouldn't be any noticeable difference when it's in use.
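
As a rough worked example (assuming a typical ~3:1 DSC compression target and ignoring blanking overhead; the exact ratio varies with the mode), this is why 4k240 fits over DP1.4 and HDMI 2.1 only once DSC is applied:

```python
# Does 4k240 (10-bit) fit through these links natively, and with ~3:1 DSC?
# Payload rates are approximate: DP1.4 HBR3 ~25.92 Gbps, HDMI 2.1 FRL ~42.6 Gbps.
LINKS_GBPS = {"DP1.4 (HBR3)": 25.92, "HDMI 2.1 (FRL)": 42.6}
DSC_RATIO = 3.0

need = 3840 * 2160 * 240 * 10 * 3 / 1e9              # ~59.7 Gbps uncompressed
for name, capacity in LINKS_GBPS.items():
    native = "yes" if need <= capacity else "no"
    with_dsc = "yes" if need / DSC_RATIO <= capacity else "no"
    print(f"{name}: native 4k240 = {native}, with DSC = {with_dsc}")
```

Uncompressed it needs roughly 60 Gbps, but with DSC it drops to around 20 Gbps, which comfortably fits both links.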

Monitors Unboxed did a video explaining DP2.1, UHBR20 and DSC; it covers your exact question too.

As explained in the video, some Nvidia features (DSR/DLDSR, for example) are unavailable when DSC is in use.

DSC is better than chroma subsampling, which reduces bandwidth by encoding colour information at a lower resolution than brightness. The two can also be combined to achieve even higher resolution/refresh combinations (only DP can do this; HDMI uses one or the other depending on the target settings).

Example of chroma subsampling:
[Image: common YCbCr chroma subsampling ratios]
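
For a sense of how much bandwidth subsampling alone saves (a rough sketch: the fractions follow from how many chroma samples survive per 2x2 block of pixels):

```python
# Relative data rate of common YCbCr subsampling modes versus full 4:4:4.
# Per 2x2 block: 4:4:4 keeps 4Y+4Cb+4Cr samples, 4:2:2 keeps 4Y+2Cb+2Cr (2/3),
# and 4:2:0 keeps 4Y+1Cb+1Cr (1/2).
SUBSAMPLING_FACTOR = {"4:4:4": 1.0, "4:2:2": 2 / 3, "4:2:0": 1 / 2}

full_rate_gbps = 3840 * 2160 * 240 * 10 * 3 / 1e9    # 4k240 10-bit, ~59.7 Gbps
for mode, factor in SUBSAMPLING_FACTOR.items():
    print(f"{mode}: ~{full_rate_gbps * factor:.1f} Gbps")
```

So 4:2:0 roughly halves the data rate, but unlike DSC the loss of colour resolution can be visible, particularly on fine coloured text.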

I hope that answers your question.
 
Yes. Yes it does. Thank you. Don't intend on changing the 4090 anytime soon (running a 5800X3D, so more likely to go AM5 next), but at least it's not the end of the world by the looks of it. Still, it does seem a bit odd!
 