LG OLED (2019) C9 / E9 / W9 with true HDMI 2.1

Sounds like it might be a good year to upgrade :D

LG Electronics has issued a press release today ahead of CES 2019, detailing the South Korean manufacturer's new 2019 OLED TVs with the Alpha 9 Gen 2 processor, and it appears they will have true HDMI 2.1 with high frame rate (HFR), enhanced audio return channel (eARC), variable refresh rate (VRR) and auto low latency mode (ALLM) support.

LG's 2019 OLED TVs are the 8K 88-inch Z9, and the 4K W9 wallpaper, E9 and C9.

LG has announced that its OLED TV line-up for 2019 will comprise the C9, E9 and W9 ranges with 4K resolution, as well as the B9, which will come a little later in the year.

HDMI 2.1 is the headlining feature. Increased bandwidth now allows for up to 4K at 120 frames per second – also referred to as High Frame Rate (HFR) – inputs via HDMI. Last year’s models also supported HFR but only via streaming input.

The motion handling improvements include an upgraded black frame insertion system called 'OLED Motion Pro' that now operates at 100/120Hz (compared to 50/60Hz last year) and with a shorter black frame cycle (25% vs. 50% last year). LG says the system eliminates flicker and maintains brightness, which were FlatpanelsHD's two main concerns with the BFI system in the 2018 LG OLED models. Other improvements include a separate "smooth gradation" picture setting that no longer reduces resolution.

Speaking of motion, LG confirms that the 2019 OLED models will support HDMI VRR (Variable Refresh Rate), which was first implemented in Samsung TVs and Xbox One S / X last year. It is an adaptive frame rate system that matches frame rate between console and TV in real-time for smoother gaming performance with lower lag. The TVs also support HDMI ALLM (Automatic Low Latency Mode) that automatically switches to the TV’s game mode whenever you load up a game on your Xbox One console. PlayStation 4 does not support VRR and ALLM at this time.

https://www.flatpanelshd.com/news.php?subaction=showfull&id=1546474656
 
It will be interesting to see:
- whether, on the back of the smaller 8K pixel size (?), they open up the PC monitor market
- close-up shots - will the pixel structure change further, or has it stabilised?
- any concessions/guarantees (10yr) on burn-in/panel life; they could emasculate the competitors'/QLED et al's argument.
 
What's the point of 8K TVs when they already have the problem where most 4K movies are just 2K upscaled to 4K :confused:

I think 8K panels are pointless for the home: we won't be able to see the difference over 4K when sitting at normal viewing distances, there is zero content, and there isn't likely to be for many years, if ever. Pointless gimmick, but if people buy them...
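For anyone curious about the viewing-distance claim, here is a rough back-of-the-envelope check. It's a minimal sketch assuming ~1 arcminute of visual acuity (the common 20/20 rule of thumb) and a 65-inch 16:9 panel; the acuity figure and screen size are illustrative assumptions, not anything from the thread.

```python
import math

# Rough sketch: beyond what distance can a panel's pixels no longer be
# resolved? Assumes ~1 arcminute of visual acuity (20/20 rule of thumb)
# and a 65-inch 16:9 panel - both are illustrative assumptions.
ACUITY_RAD = math.radians(1 / 60)  # 1 arcminute in radians

def max_resolvable_distance_m(diagonal_in: float, h_pixels: int) -> float:
    """Distance beyond which individual pixels can't be distinguished."""
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # 16:9 width
    pixel_pitch_m = width_m / h_pixels
    return pixel_pitch_m / ACUITY_RAD  # small-angle approximation

for name, h_res in [("4K", 3840), ("8K", 7680)]:
    print(f"{name} on a 65-inch panel: pixels resolvable within "
          f"~{max_resolvable_distance_m(65, h_res):.1f} m")
# Prints roughly 1.3 m for 4K and 0.6 m for 8K: you would need to sit
# closer than ~1.3 m to a 65-inch screen (where 4K pixels start to be
# resolvable) before the extra 8K pixels could make a visible difference.
```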
 
We've had our B6 55" since Nov 2016, absolutely love it. Since then we've moved house and the lounge is much bigger so the TV looks tiny. People are shocked when I say it's 55", many assume it's 40" :p. I am really tempted to get a 65" C9 when prices "normalise" as I think it would be a solid upgrade. We could accommodate a 77" but it's outside of my budget :( Only thing that puts me off is having to upgrade the AV Receiver when it's less than a year old. I will wait for reviews I think.
 
It's still a bit early in terms of connectivity (AVRs, GPUs, etc.) for HDMI 2.1, but that's alright: the advantages of 8K PPI will be noticeable even with 4K and lower-resolution content, plus there's VRR for PC gamers. For me the upgrade to 8K has been delayed by about two years, since I got an XF90 for cheap just months ago, so the overall need for an upgrade has been greatly diminished, not to mention my sound system was bought just weeks ago, so that's another reason to wait (though this is a lesser concern).

Tbh I'm really excited for this. I think 8K is here to stay for a verrrry long time, so when I do upgrade it's definitely going to be 75" (if not 85") and I'll keep it for more than 5 years. Not sure it's going to be OLED though, since its longevity is short(er) and it would only be superior for movies (in the dark). We'll have to see how things evolve.

As for 8K PC gaming, if you look at the Titan RTX NVLink results it's pretty much there for 60 fps with slight tweaks. That means the next generation will be enough at a lower price, since the main reason you'd want the Titan is its higher VRAM. So say the next 2080 Ti-class card comes in at ~£800; you'd need £1,600 for two. To already be at 8K 60fps this early, that's not a bad price at all. Or if you're not feeling 8K, a single 2080 Ti is already more than respectable at 4K >60fps, and chances are these new TVs will do 4K 120, so you can enjoy high resolution and HFR smoothness (a rough pixel-count comparison is sketched below).

The future looks great! :cool:
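As a rough sanity check on the GPU side of that argument, here is a minimal pixel-count comparison. The ~4x factor is simple arithmetic on resolutions, not a benchmark; real scaling depends on the game, settings and memory bandwidth.

```python
# Rough pixel-count comparison: how much more work 8K asks of a GPU
# than lower resolutions. Ignores real-world factors (CPU limits, VRAM,
# engine behaviour), so treat the ratios as an upper bound on extra load.
RESOLUTIONS = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

base_4k = RESOLUTIONS["4K"][0] * RESOLUTIONS["4K"][1]
for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base_4k:.2f}x the pixels of 4K")
# 8K is ~4x the pixels of 4K, which is why 8K 60fps is currently a
# two-card (NVLink) job while a single 2080 Ti manages 4K 60fps.
```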
 
Given that I don't really play any fast-paced shooters anymore, I could see myself using one of these as my main gaming display (both PC and console).
That would not be a very good idea due to image burn-in if OLEDs are left with static images displayed on them...
Plus they go very dim or switch to a screensaver after a few minutes when a static image is displayed.
 
I've used my B6 as my main gaming display for over 2 years - PC/PS4/Xbox/Switch/Mini SNES

No burn in.

I wouldn't use it as a desktop monitor, but that's not what he said.
 
I wouldn’t upgrade from an earlier model yet - it will be years until devices are using HDMI 2.1 effectively and the latest OLED at that point will have other improvements.
 
I would still hold on a bit yet.

This is taken from a comment by Daniel Taju on Vincent Teoh's HDTVTest YouTube channel:


Beware that the first HDMI 2.1 chipsets will support control-plane signalling features such as VRR and ALLM, but their data-plane bandwidth limit could still be 18Gbps.

Just 4K 120fps doesn't make it a true HDMI 2.1 chipset, as there are two other data-plane variables missing: chroma sub-sampling and color bit depth.

Beware that the data-plane bandwidth requirement of 4K 120fps 8-bit (SDR) 4:2:0 actually fits within the 18Gbps limit of HDMI 2.0.

If a source device such as a PC manages to output 4K 120fps 8-bit (SDR) 4:2:0 via the HDMI port, LG's marketing team could call it 4K 120fps, even though it's just 8-bit SDR and even though the chipset can only do 18Gbps.

Beware of this trick, as it would be just 8-bit SDR. Check the math below for different combinations of the data-plane variables, which result in different bandwidth requirements along the video transmission chain:


4K @ 120fps @ 8bit per color (not HDR)
3840pixel x 2160pixel x (24bit + 12bit)/pixel x 120frame/second =
35831808000 bits/s = 35.8Gbps @ RGB/Y’CbCr 4:4:4
23887872000 bits/s = 23.8Gbps @ Y’CbCr 4:2:2
17915904000 bits/s = 17.9Gbps @ Y’CbCr 4:2:0 <--- (Beware of this trick, it's just 8-bit SDR)

4K @ 120fps @ 10bit per color (HDR)
3840pixel x 2160pixel x (30bit + 15bit)/pixel x 120frame/second =
44789760000 bits/s = 44.8Gbps @ RGB/Y’CbCr 4:4:4
29859840000 bits/s = 29.8Gbps @ Y’CbCr 4:2:2
22394880000 bits/s = 22.4Gbps @ Y’CbCr 4:2:0 <------- Stored Content in HDR 10bit

4K @ 120fps @ 12bit per color (HDR)
3840pixel x 2160pixel x (36bit + 18bit)/pixel x 120frame/second =
53747712000 bits/s = 53.7Gbps @ RGB/Y’CbCr 4:4:4
35831808000 bits/s = 35.8Gbps @ Y’CbCr 4:2:2 <------- What is sent via HDMI, BELOW HDMI2.1 48Gbps limit
26873856000 bits/s = 26.8Gbps @ Y’CbCr 4:2:0
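To make it easy to replay the arithmetic above, here is a minimal sketch that reproduces the commenter's formula: active pixels x (bits per pixel + 50%) x frame rate, with chroma subsampling scaling the payload by 1, 2/3 or 1/2. The 50% overhead term is inferred from the figures quoted (24+12, 30+15, 36+18 bits per pixel), not taken from the HDMI specification itself.

```python
# Sketch reproducing the bandwidth figures quoted above.
# Assumption: the "+50%" term (24+12, 30+15, 36+18 bits/pixel) models
# transmission overhead, as implied by the numbers in the comment.
WIDTH, HEIGHT, FPS = 3840, 2160, 120
CHROMA_FACTOR = {"4:4:4": 1.0, "4:2:2": 2 / 3, "4:2:0": 1 / 2}

def bandwidth_gbps(bits_per_colour: int, chroma: str) -> float:
    bits_per_pixel = 3 * bits_per_colour * 1.5   # e.g. 8-bit -> 24 + 12 = 36
    bits_per_second = WIDTH * HEIGHT * bits_per_pixel * FPS * CHROMA_FACTOR[chroma]
    return bits_per_second / 1e9

for depth, label in [(8, "SDR"), (10, "HDR10"), (12, "12-bit HDR")]:
    for chroma in CHROMA_FACTOR:
        print(f"4K {FPS}fps {depth}-bit {label} {chroma}: "
              f"{bandwidth_gbps(depth, chroma):.1f} Gbps")
# 8-bit 4:2:0 comes out at ~17.9 Gbps, i.e. inside HDMI 2.0's 18 Gbps
# limit - exactly the "trick" the comment warns about.
```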
 