Pleasant surprise?

Associate · Joined 21 Apr 2015 · Posts: 435 · Location: United Kingdom
I recently bought a Sony XD8305 49" and plugged my Xbox One S into it to try out the 4K and HDR. I did do some research on this TV, as well as others, but couldn't find any info on the bit depth for HDR.

At first I was disappointed, but after getting a better HDMI cable, the Xbox reports it as 10-bit HDR. Is this the cheapest 10-bit TV? As you can imagine I'm happy about this, and Forza Horizon 3 looks amazing.
 
Not really; if a TV doesn't state UHD Premium, then it doesn't really support the standard correctly.

I've not been burned by this and I'm relatively tech-savvy, so I'm an impartial observer with no agenda, but do you really think people should or would know this, especially in the face of language like this in Sony's own specs:

"(Up to 3840x2160/60p 10bit)"

?

This could be interpreted as 10-bit with up to 3840x2160 resolution.

In my opinion the industry is awash with confusing and misleading patter and jargon, as well as incompatibility where there should, by all rights, be compatibility; among consumer markets it's probably second only to the car industry.

There's example after example of this (hi early 4K TVs that don't support HDCP 2.2).

In summary, it's too complicated for the average punter, and the industry gets away with an awful lot.
 
The 4K HDR standards are a total mess.

Yep, and it's getting worse in 2017.

There are the HDR10, Dolby Vision and HLG standards.

Then there's Samsung bringing out the HDR 1500 specification, which will make punters wonder whether they should upgrade for the 1,500-nit brightness.

I'm pleased I waited in 2016; it's all a bit of a farce.
 