Help! 4k TV

Associate · Joined 14 Dec 2017 · Posts 2,040 · Location Aberdeenshire
I’ve been looking for a new TV for a month or so now, and I think I’m back to square one.

Trying to get my head round HDMI 2.0, 2.0a, 2.0b and 2.1, plus HDCP 2.2. Do these things matter for 4K?

Currently looking at the Samsung RU7400 55” and I have £700 to spend.

It’ll be used in the living room mainly to watch Sky Q, Netflix and Blu-rays.

Suggestions welcome if there are better options. I was opting for the Samsung as I’m away to get a Samsung S10 before long.
 
The HDMI standards are about the hardware: which signal formats and hardware features a device will support. Your sources are a good example of the tiers of signals that the various HDMI standards cater for.

Blu-ray is 1080p resolution, 8-bit colour (so, not HDR then, just regular standard dynamic range 'SDR'), and either 2D or 3D for picture. For this, HDMI 1.4 or anything higher in all your gear will ensure complete compatibility. (For the sake of completeness, HDMI 1.4 will support 3840 x 2160 UHD '4K' resolution, but only at 25 and 30Hz, so it's only useful for things such as up-scaling BD players.)

Sky Q can go as high as 3840 x 2160 UHD res at 50Hz. This means that to run at that resolution and frame rate, your TV and anything that the HDMI signal passes through needs something better than HDMI 1.4. This is where HDMI 2.0 starts to become important. At the moment though, Sky doesn't use any of the other picture features of the UHD signal format, so it's not using HDR in any of the formats available (HLG, HDR/HDR10, HDR10+/HDR Pro, DolbyVision) or the bigger colour range made possible by WCG. In picture terms, all you're getting is some extra resolution.
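To see why UHD at 50Hz pushes past HDMI 1.4, you can do the raw-bandwidth arithmetic yourself. A rough sketch (Python; ignores blanking intervals, assumes 8-bit RGB and HDMI's 10/8 TMDS encoding overhead, so real link rates run a little higher than the payload figures):

```python
# Rough HDMI bandwidth arithmetic: pixels/s x bits/pixel x 10/8 TMDS overhead.
# Ignores blanking intervals, so actual required link rates are somewhat higher.

def tmds_gbps(width, height, fps, bits_per_channel=8):
    bits_per_pixel = bits_per_channel * 3          # RGB, no chroma subsampling
    raw = width * height * fps * bits_per_pixel    # payload bits per second
    return raw * 10 / 8 / 1e9                      # 8b/10b encoding, in Gbit/s

HDMI_1_4_MAX = 10.2   # approx. max TMDS throughput, Gbit/s
HDMI_2_0_MAX = 18.0

print(f"Blu-ray 1080p24: {tmds_gbps(1920, 1080, 24):.2f} Gbit/s")  # ~1.5, easily within HDMI 1.4
print(f"Sky Q UHD 50Hz:  {tmds_gbps(3840, 2160, 50):.2f} Gbit/s")  # ~12.4, over HDMI 1.4's limit
print(f"UHD 60Hz 8-bit:  {tmds_gbps(3840, 2160, 60):.2f} Gbit/s")  # ~14.9, fits HDMI 2.0
```

The 10.2 and 18 Gbit/s ceilings are the commonly quoted maximums for HDMI 1.4 and 2.0 respectively; the point is simply that UHD at 50Hz lands between the two, which is why HDMI 2.0 becomes the entry ticket for Sky Q's UHD output.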

When Sky does add HDR then it will be in the HLG format. At that point your hardware will need to be up to a minimum of the HDMI 2.0b standard to allow the benefit to come through.

Netflix supports up to 3840 x 2160 resolution @ 60Hz, and HDR in HDR10 and DolbyVision formats. It also supports WCG. For this then, you need hardware up to HDMI 2.1.

Any of the higher HDMI standards will support all the previous HDMI standards.


HDCP 2.2 is about how the industry protects its content from cracking and piracy. The ideal combination for your new TV then would be to have HDMI 2.1 with HDCP 2.2.
 
Thanks @lucid for the detailed reply, it has helped me understand.
Seems nobody likes to advertise TVs' true specs nowadays, or are they usually set to a standard?
 
Nothing on Netflix goes above 24p in 4K HDR; you do not need HDMI 2.1 for Netflix.

It's useful to know that, at the moment at least, none of the Netflix content goes above 24p in UHD. If we limit playback to a display with static metadata (HDR/HDR10) then certainly HDMI 2.0a would be all that's required. That's not the same though as saying that Netflix isn't capable of anything higher.

Currently, Netflix carries content that uses dynamic metadata. The HDR format is Dolby Vision. Here's a list of some of the titles. To play it properly requires HDMI 2.1.

This content won't play through on equipment that doesn't support HDMI 2.1. Trying to play it through anything less than an HDMI 2.1 display chain will result in the video being reduced to static metadata (HDR/HDR10).

Netflix also carries test video at up to 60Hz in UHD, so Netflix does support up to 60Hz in UHD resolution. See here for more info: https://www.howtogeek.com/338983/how-much-data-does-netflix-use/
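Since the linked article is about data use, it's worth noting how bitrate translates into data per hour; the conversion is just seconds-per-hour over bits-per-byte. A quick sketch (the ~5 and ~16 Mbit/s figures are rough, commonly quoted Netflix stream rates, not official numbers):

```python
# Quick bitrate -> data-per-hour conversion for streaming video.
# Stream bitrates here are rough, commonly quoted figures, not Netflix specs.

def gb_per_hour(mbps):
    return mbps * 3600 / 8 / 1000   # Mbit/s -> GB per hour (decimal GB)

print(f"HD  (~5 Mbit/s):  {gb_per_hour(5):.2f} GB/hour")
print(f"UHD (~16 Mbit/s): {gb_per_hour(16):.2f} GB/hour")
```

At ~16 Mbit/s a UHD stream works out to roughly 7 GB per hour, which matches the ballpark figure usually given for 4K streaming.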

:)
 
Thanks @lucid for the detailed reply, it has helped me understand.
Seems nobody likes to advertise TVs' true specs nowadays, or are they usually set to a standard?

That's nothing new really, but of late it has become harder to get down to the nitty-gritty of what's supported and what isn't with some TV brands.

There are no set standards per se, but there's definitely a strong relationship between price, features and performance.
 
It's useful to know that, at the moment at least, none of the Netflix content goes above 24p in UHD. If we limit playback to a display with static metadata (HDR/HDR10) then certainly HDMI 2.0a would be all that's required. That's not the same though as saying that Netflix isn't capable of anything higher.

Currently, Netflix carries content that uses dynamic metadata. The HDR format is Dolby Vision. Here's a list of some of the titles. To play it properly requires HDMI 2.1.

This content won't play through on equipment that doesn't support HDMI 2.1. Trying to play it through anything less than an HDMI 2.1 display chain will result in the video being reduced to static metadata (HDR/HDR10).

Netflix also carries test video at up to 60Hz in UHD, so Netflix does support up to 60Hz in UHD resolution. See here for more info: https://www.howtogeek.com/338983/how-much-data-does-netflix-use/

:)
I think you’ve got your wires crossed somewhere. HDMI 2.1 is only supported by a handful of LG TVs, and only as of this year, while TVs have been supporting Dolby Vision since at least 2016. My Apple TV and my UHD player also support Dolby Vision, as does my AVR. None of those devices are HDMI 2.1, and my TV definitely detects and displays them in Dolby Vision.
 
Thanks for that, I had to follow the defence link but it’s still valid till the 31st of this month. Under the specs it says:
HDMI 2.1 - no
HDCP 2.2 - yes
So I take it it would be HDMI 1.6, would that be right? Looks a cracking TV, and it's the next few models up from what I was looking at, for a lower price.
 
Most current TVs are HDMI 2.0b, which handles everything currently available; it just doesn't handle some of the newer stuff like auto low latency mode (ALLM) for gaming, or variable refresh rate (VRR), which is also for gaming.
 
Ok thanks! I don’t think I’ll be gaming on it really, just watching TV and films mostly.
 
I was too late to get that deal on the RU8000, but I have still been looking and have narrowed it down to the Samsung RU8000 and the LG SM8600PLA. Which would you choose if you had to?
 