Sony XF90

@ubern00b if you have a keyboard, pressing the "o" key while a film is playing brings up playback info, like this...

20201213-162213.jpg


That is within Kodi.

That shows 23.976 fps for the video, and then the same for the refresh rate.
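
For what it's worth, you can also pull that readout out of Kodi remotely over its JSON-RPC interface, if the web server is enabled under Settings > Services > Control. A rough sketch in Python; the host/port are placeholders for your own setup, and the InfoLabel names are the documented Player.Process(...) ones, so treat them as version-dependent:

# Query Kodi's JSON-RPC API for the same playback info the "o" overlay shows.
import json
import urllib.request

KODI_URL = "http://192.168.1.50:8080/jsonrpc"  # hypothetical Kodi box

payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "XBMC.GetInfoLabels",
    "params": {"labels": [
        "Player.Process(videofps)",     # decoded frame rate, e.g. 23.976
        "Player.Process(videowidth)",
        "Player.Process(videoheight)",
    ]},
}

req = urllib.request.Request(
    KODI_URL,
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["result"])

Same caveat as the overlay, mind: this reports the video's frame rate, not what the display output is actually set to.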
 
Yup, thanks. I meant at a display level though. That info shows that the video is 23.976 fps, not that the display is matched at 23.976 Hz.

I know Kodi switches perfectly, as does Plex, but for things like Netflix, Amazon Video and Disney+ it's difficult to know what refresh rate the display is currently set to unless you sit and look for judder or whatnot.

I think I found a way on the Shield though: bring up the side menu and look at the current display settings. Would be far easier if pressing info just showed the current refresh rate though, heh.
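
If the Shield has network debugging enabled, you can also read the current output mode over adb rather than digging through menus. A rough sketch (the Shield's address is a placeholder, and the exact dumpsys line format varies between Android versions, so the filter below is a guess rather than a stable API):

# Dump the Shield's current display state and pick out refresh-rate lines.
import subprocess

SHIELD_IP = "192.168.1.60"  # hypothetical Shield address

subprocess.run(["adb", "connect", SHIELD_IP], check=True)
out = subprocess.run(["adb", "shell", "dumpsys", "display"],
                     capture_output=True, text=True, check=True).stdout

for line in out.splitlines():
    if "refreshrate" in line.lower() or "fps" in line.lower():
        print(line.strip())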
 
Ah, I see. With you noting Kodi in your other post I assumed the issue was all about that, rather than the TV in general and the content you're playing from all sources.
I don't know how to tell what refresh rate it is either. Like you I get the 1080/24p readout in Kodi, but then just the 4K resolution when it switches back.

Just adding....

Not overly impressed since the last update to this TV; these things did not happen before. Sometimes I get a black screen when using HDMI 3, where my Cube is connected. I have now changed the Cube to HDMI 2, as you have noted that might help.

When turning on the TV from standby there is a noticeable pause before it responds to the remote, noticeable when changing from Home to HDMI 3 for our Amazon Cube.
 
Does anyone else find the Dolby Vision very dark on this TV?

If so, has anyone managed a workaround or some settings combo that will brighten up the image without washing out the detail?
 
^^^^ It's always been noted as being dark when watching DV material. I now somewhat dread seeing that logo on a NF movie. They did improve this in the update before last, IIRC, but still not enough to show detail rather than just being too dark.
 
That's a shame. The only way around this is to play media via an external HDD or USB stick.

I need your help, guys. I'm having a nightmare at the moment connecting my PC to this TV via an HDMI cable.

Basically the TV only displays anything from the PC when the "HDMI signal format" is set to "Standard", which means the only output colour format available is YCbCr 4:2:0, so I cannot set it to 10bpc or enable HDR.

What settings are wrong on the PC or TV? I'm confused, as before my PC upgrade (Nvidia 2080 to 3070) it worked fine; I could simply connect the HDMI cable and it would automatically adjust to the right colour format, and 10bpc and HDR worked fine.

Would really appreciate some help please!
 
I just updated mine manually and used the same file to update my old XD80, which was also recently added to the list of supported TVs.
All working great, and I noticed much faster start-up times.
 
@Brumboy did you try a different cable? Though it is a strange issue, and from what you're describing it would seem more like a TV one. It's probably a miscommunication between the device and TV somewhere. And this may be an obvious question, but did you make sure to plug it into the same port on the back? Afaik not all HDMI ports on the XF90 support enhanced format properly.
 
Yeah, it's strange. The original cable has stopped supporting 4K HDR for some reason, because I ordered three (yes, three) different cables and have tested them all. The three new ones are rated at 18Gbps, but only two of them work with my setup.

Still confused as to why it's stopped working with the original cable, but oh well, at least the problem is solved.

Another query: what PC settings are optimal for this TV? 4:4:4 or 4:2:2?
 
For HDR, afaik pretty much everything will actually be sent as 4:2:0 (10-bit) anyway, and if you keep it at RGB or 4:2:2 it just gets converted down regardless, so it doesn't matter too much. I mention this because the new GPUs with HDMI 2.1 don't seem to let us choose 4:2:2 12-bit anymore on HDMI 2.0b displays.

For non-HDR use I'd stick to RGB 8-bit regardless. In fact I've stopped tinkering even for HDR and just keep it on RGB 8-bit, because then I can simply enable HDR in Windows (for certain games/YouTube) and it uses dithering and works its magic anyway. Can't say I've noticed a difference since doing that (as opposed to manually switching to 4:2:2 12-bit, which I can't do anymore).
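
The rough arithmetic behind why 4K60 HDR won't fit at RGB 10-bit over HDMI 2.0, if anyone's curious. This is just a back-of-the-envelope sketch, treating the standard 594 MHz pixel clock and the 8b/10b encoding overhead as round numbers:

# Back-of-the-envelope HDMI 2.0 bandwidth check for 3840x2160 @ 60 Hz.
# Assumes the standard 594 MHz pixel clock (4400x2250 total raster) and
# HDMI 2.0's 18 Gbps link, ~14.4 Gbps effective after 8b/10b encoding.
PIXEL_CLOCK_HZ = 594e6
EFFECTIVE_LINK_BPS = 18e9 * 8 / 10  # ~14.4 Gbps of actual video data

# Bits per pixel on the wire for each output format.
formats = {
    "RGB / 4:4:4  8-bit": 3 * 8,   # 24 bpp
    "RGB / 4:4:4 10-bit": 3 * 10,  # 30 bpp
    "YCbCr 4:2:2 12-bit": 2 * 12,  # Y + alternating Cb/Cr = 24 bpp
    "YCbCr 4:2:0 10-bit": 15,      # chroma shared across 4 pixels
}

for name, bpp in formats.items():
    rate = PIXEL_CLOCK_HZ * bpp
    verdict = "fits" if rate <= EFFECTIVE_LINK_BPS else "exceeds the link"
    print(f"{name}: {rate / 1e9:.2f} Gbps -> {verdict}")

RGB 10-bit comes out at roughly 17.8 Gbps against about 14.4 Gbps of usable bandwidth, which is why the driver only offers 4:2:2, 4:2:0 or 8-bit RGB (with dithering) once HDR is on.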
 
Interesting. So for HDR content use 4:2:0 10-bit, or do what you suggest and leave it at RGB 8-bit and turn HDR on when required.

Do you modify any Nvidia control panel settings as well?
 
I mean, that's what to select in the control panel. I'm on AMD but I assume it's the same (check whether it says 8-bit with dithering for you too when you turn on HDR). Just keep it on RGB (Full) 8-bit and turn HDR on/off in the Windows display settings depending on whether the game requires it or you want to watch YouTube HDR videos. For the TV itself I'm always on Enhanced format.

dWQJ5VU.jpg.png
 
I'd rather just use the built-in YouTube app or my Nvidia Shield than have to mess about with settings constantly.

It's a pity there are so many compatibility issues with older and newer file types and content.

I'm having issues with x265 files and older x264 files not displaying properly unless I mess with settings between them. It gets annoying.
 
What do you mean about the x265 and x264 (mkv) files and the display issues with the TV?
 
The built-in hardware is not powerful enough to use with an Emby server for certain file types and sizes.

Maybe if I plugged in a USB drive locally it would fare better, but it doesn't like it over the network. Possibly the dodgy WiFi the TV has always been plagued with.

So I use a Shield now with the Emby app.
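
If you want to know in advance which files will trip it up, ffprobe (part of ffmpeg) will tell you what a file actually contains, which is usually the difference between the TV direct-playing it and Emby having to transcode. A quick sketch; the path is just a placeholder:

# Inspect the first video stream of a file with ffprobe.
import json
import subprocess

MEDIA_FILE = "/mnt/media/example.mkv"  # hypothetical file

out = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,profile,pix_fmt",
     "-of", "json", MEDIA_FILE],
    capture_output=True, text=True, check=True,
).stdout

print(json.loads(out)["streams"][0])
# Something like hevc / Main 10 / yuv420p10le is the sort of combination
# that strains an older built-in player, where plain h264 / yuv420p is fine.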
 
Ah, I have/had something similar. For that reason I now use a 4TB 2.5" drive connected directly to the TV for x265 UHD material. That helps, using VLC as the player.

Even though Plex can complain when trying to stream from the media server in our loft, Kodi will generally manage it fine.

I do have a Fire TV Cube but needed to use a USB Gigabit Ethernet adaptor for it.

I do wish they would put Gigabit ports on TVs.
 
Does anybody know if it's possible to set the default input on this TV? I have an Apple TV 4K on HDMI 2 and use it for pretty much everything. I'd like to just turn the TV on and have it default to that - is that possible?
 
^^^^^^ I have my Amazon Fire Cube on HDMI 2, and when I switch the TV off it is on that input. That means when I switch the TV on it starts back on HDMI 2. Isn't that what you want?
 
I think that's the default behaviour, i.e. remember the last input. I'm wondering if I can change it to always be HDMI 2?
 
This is what CEC is for. If you wake the telly up by pressing a button on the Apple TV remote, it should turn the telly on and switch to that input. My Shield does this, as does Sky Q and my PS4.
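
For the curious, that behaviour is CEC "One Touch Play": the source device broadcasts an Active Source message and the TV wakes and switches to it. A sketch using cec-client from the cec-utils package, on something like a Raspberry Pi wired to the TV (the Apple TV and Shield do the equivalent internally):

# Make this device the active source: wakes the TV and switches its input.
import subprocess

subprocess.run(
    ["cec-client", "-s", "-d", "1"],  # -s: single command mode, -d 1: quiet logging
    input="as\n", text=True,          # "as" = broadcast Active Source
    check=True,
)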
 