Streaming ripped 4K films

I'm assuming the Windows box is the server and OS X is the client, and I'm assuming you're using SMB shares?

Sounds like OS X buffers the whole file to local disk and then starts reading it while it's still buffering at full whack, instead of reading it directly/only when necessary.

Can you get a disk usage meter in the task manager for Mac? It sounds like your stuttering is caused by excess disk I/O.
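If Activity Monitor's Disk tab doesn't give you enough detail, a few lines like this would log it while the film plays. Just a rough sketch; it assumes Python with the third-party psutil package installed, which is my suggestion, not something from this thread:

```python
# Rough sketch: log whole-system disk I/O once a second while the film
# plays, using the third-party psutil package (an assumption on my part).
import time
import psutil

prev = psutil.disk_io_counters()
for _ in range(30):                                   # sample for 30 seconds
    time.sleep(1)
    cur = psutil.disk_io_counters()
    read_mb = (cur.read_bytes - prev.read_bytes) / 1e6
    write_mb = (cur.write_bytes - prev.write_bytes) / 1e6
    print(f"read {read_mb:6.1f} MB/s   write {write_mb:6.1f} MB/s")
    prev = cur
```

If the write column is busy while you're playing a file off the network, that would back up the buffering-to-disk theory.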

Your assumptions are right.

I've been trying VLC and 5kplayer, both of which exhibit the same issues. The thing is, a 1080p film plays fine. I don't see why a 4K film should be any different apart from the obviously larger file size.

I'll double-check all these figures and provide more info if possible tonight when I get home. I'll also try the file directly on the client computer. Excessive disk I/O shouldn't be a major issue, as the client runs off an SSD, so it should be capable of handling anything major.
 
Pretty much all 4K content, including that on physical UHD Blu-ray discs, is compressed with HEVC (H.265), and while that does mean you need a decoder, I had taken that as a given.

However, compression in this sense does not mean you lose quality.

In all reality, a full-quality 4K feature is probably going to max out somewhere well under 100Mb/s, and probably closer to 50. (108Mb/s is just a theoretical maximum.)

For reference, a standard Blu-ray is usually somewhere between 18 and 25Mb/s with a much less efficient compression codec, so call it roughly 3x for 4K HEVC, meaning somewhere between 56 and 75Mb/s is realistic.
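As a quick sanity check of that rule of thumb (my arithmetic, using the ~3x figure from above):

```python
# Back-of-the-envelope check of the ~3x rule of thumb above: 4K has 4x
# the pixels of 1080p, and HEVC is assumed to claw part of that back.
for bd_mbps in (18, 25):                  # typical 1080p Blu-ray bitrates
    print(f"{bd_mbps} Mb/s 1080p -> ~{bd_mbps * 3} Mb/s 4K HEVC")
# prints ~54 and ~75 Mb/s, in line with the 56-75 Mb/s range above
```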

Yes, Bluray is basically a standard for compression/encoding.

Once video is encoded (such as on a Bluray) it becomes a data stream as opposed to a video stream, which makes the bitrate of uncompressed video irrelevant, because you're transferring encoded data. The bitrate of uncompressed 4K doesn't matter in this case because you're not streaming 4K video over the network; you're streaming a data file, the bitrate of which is calculated by dividing the total size of the file by the total length of the film. The destination will do the decoding of the data to turn it back into an uncompressed video signal, such as HDMI, so it can be displayed on a screen.
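For example (made-up figures, just to show the division):

```python
# Minimal sketch of the calculation described above, with made-up numbers:
# average bitrate = total file size / total running time.
size_bytes = 50 * 1000**3                 # hypothetical 50 GB rip
duration_s = 135 * 60                     # hypothetical 2 h 15 m running time
avg_mbps = size_bytes * 8 / duration_s / 1e6
print(f"average bitrate ~ {avg_mbps:.1f} Mb/s")       # ~49.4 Mb/s
```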

Thanks for the edification. :)
 
Your assumptions are right.

I've been trying VLC and 5kplayer, both of which exhibit the same issues. The thing is, a 1080p film plays fine. I don't see why a 4K film should be any different apart from the obviously larger file size.

I'll double-check all these figures and provide more info if possible tonight when I get home. I'll also try the file directly on the client computer. Excessive disk I/O shouldn't be a major issue, as the client runs off an SSD, so it should be capable of handling anything major.

You're unlikely to have hardware H.265 decoding, so software decoding is used; this will hammer your CPU and affect I/O. That's why 1080p is OK: it will be hardware decoded (H.264).
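A quick way to confirm what the player is actually being asked to decode (a sketch assuming ffprobe from FFmpeg is installed and on the PATH; neither is part of this thread, and the filename is hypothetical):

```python
# Sketch: ask ffprobe (from FFmpeg, assumed installed) which codec the
# video stream uses; "hevc" means H.265 and likely software decoding on
# older hardware, while "h264" is widely hardware-accelerated.
import subprocess

result = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=codec_name,width,height",
     "-of", "default=noprint_wrappers=1",
     "film.mkv"],                          # hypothetical filename
    capture_output=True, text=True, check=True,
)
print(result.stdout)    # e.g. codec_name=hevc / width=3840 / height=2160
```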
 
For reference, a standard Blu-ray is usually somewhere between 18 and 25Mb/s.

It's actually pretty common for good-quality BRs to be much higher than that (I'm getting well over 40Mb/s on stuff like Gladiator and Aliens, for example; watched both of them pretty recently :D).

Would also be interested to know what's being used to create the 4K rip, as I wasn't aware there was anything that could do this at present (so surely it could be a ripping issue also)?
 
Would also be interested to know what's being used to create the 4K rip, as I wasn't aware there was anything that could do this at present (so surely it could be a ripping issue also)?
I believe it's normally one of three sources.

Webrip of the Sony 4k service available in the US.
Webrip of Amazon/Netflix 4k.
Re-encode of UHD Blu-ray using an HDFury.
 
I believe it's normally one of three sources.

Webrip of the Sony 4k service available in the US.
Webrip of Amazon/Netflix 4k.
Re-encode of UHD Blu-ray using an HDFury.

Exactly my point: none of those would exactly be "guaranteed" to get a decent rip in the first place (in comparison to a MakeMKV rip of a standard BR, for example).
 
Did subsequently find some useful 4K/1080p clips at different bandwidths, e.g.
http://jell.yfish.us/media/jellyfish-100-mbps-hd-hevc.mkv
With that 100Mb/s clip, VLC was jittery versus rock-solid MPC-HC, which I use normally.

[All with an older Dell laptop using the Optimus system to push graphics to the Nvidia GPU rather than the Intel iGP, which is notoriously unreliable; and although neither the HD 3000 iGP nor the Nvidia NV4200M gives full H.265 acceleration, they seem to do something?]
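One way to split network from decode is to read the file off the share without playing it at all; if the raw read rate comfortably beats the clip's bitrate, the network is probably fine. A crude sketch (the mount path is a made-up example):

```python
# Crude sketch: read the film off the mounted share in big chunks and
# report sustained throughput, to rule the network in or out.
import time

CHUNK = 8 * 1024 * 1024                    # 8 MB reads
path = "/Volumes/Media/jellyfish-100-mbps-hd-hevc.mkv"   # hypothetical mount

total = 0
start = time.time()
with open(path, "rb") as f:
    while chunk := f.read(CHUNK):
        total += len(chunk)
elapsed = time.time() - start
print(f"{total * 8 / elapsed / 1e6:.1f} Mb/s sustained read")
```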
 
I would be interested to try it out on my home media server and client.

Media server on the gigabit network
Client on AC1200.

Never had an issue with Blu-rays, but I've only tested files up to 25GB for a 2-hour film (I can find out the bitrate if needed; from memory it's something like 25Mb/s for video and 2Mb/s for audio?).
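Quick sanity check on those figures (my arithmetic, not something measured):

```python
# Sanity check: 25 GB over a 2-hour film works out close to the
# 25 + 2 Mb/s remembered above.
size_bits = 25 * 1000**3 * 8               # 25 GB in bits
duration_s = 2 * 3600                      # 2 hours
print(f"~{size_bits / duration_s / 1e6:.0f} Mb/s average")   # ~28 Mb/s
```

So the memory checks out.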

I'm really just waiting for MakeMKV to be able to deal with UHD Blu-ray before I start buying them.
 
I would be interested to try it out on my home media server and client.

Media server on the gigabit network
Client on AC1200.

Never had an issue with Blu-rays, but I've only tested files up to 25GB for a 2-hour film (I can find out the bitrate if needed; from memory it's something like 25Mb/s for video and 2Mb/s for audio?).

I'm really just waiting for MakeMKV to be able to deal with UHD Blu-ray before I start buying them.

Don't forget you get the 1080p BR in the box with the 4K BR, which means it may be worth spending a little extra for future-proofing if you were going to buy the 1080p one alone.

I'm being a little sparing with my purchases though, easy to get carried away :D
 
Yes, Bluray is basically a standard for compression/encoding.
Wrong. The compression standards for Bluray are MPEG-2, VC-1 and H.264. Bluray is the physical medium on which the compressed video, audio, special features, Bluray menu structure etc. are all stored.

the bitrate of which is calculated by dividing the total size of the file by the total length of the film.

Not really. The bitrate is not always fixed. Without going into too much detail, the compression ratio is variable; achieving a fixed rate is done by controlling the quantisation parameter (Qp). However, this only ever produces approximate results. Companies spend vast amounts of time optimising their rate-control algorithms to provide good bitrate control.
Qp control is a major cause of compression artefacts, which are visible when the compressed stream is decoded.
Blu-rays will often have a more variable bitrate to reduce/remove these compression artefacts. So for scenes with little motion or movement the bitrate will be low, while for scenes with higher motion or explosions/random noise (e.g. smoke) the bitrate will increase, as there is more information and less redundancy in the data.
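As a toy illustration of that feedback loop (not any real encoder's rate control, just the principle described above, with made-up numbers):

```python
# Toy sketch of the rate-control loop described above: raise Qp (compress
# harder, risking artefacts) when a frame overshoots its bit budget, and
# lower it when there is headroom. Real encoders are far more elaborate.
def rate_control(frame_complexities, target_bits, qp=30):
    for complexity in frame_complexities:
        frame_bits = complexity / (qp + 1)   # crude model: higher Qp -> fewer bits
        yield qp, frame_bits
        if frame_bits > target_bits:
            qp = min(qp + 2, 51)             # overshoot: compress harder next frame
        elif frame_bits < 0.8 * target_bits:
            qp = max(qp - 1, 0)              # headroom: spend more bits next frame

# A quiet scene followed by an explosion (made-up complexity figures):
for qp, bits in rate_control([1e6, 1e6, 8e6, 9e6, 2e6], target_bits=50_000):
    print(f"Qp={qp:2d}  frame bits = {bits:,.0f}")
```

Notice the bit spend surging for the "explosion" frames while Qp climbs to pull it back toward the target.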
 
Quick update. I never got to the root of the streaming issue... However, I did try the file locally and the same issues occurred in exactly the same places. Could just have been an iffy rip.

I'm currently downloading the files mentioned by jpaul to give them a whirl.
 
LOL, rubbish. The streaming services use roughly 25Mb/s for their UHD content, and the Ultra HD Bluray spec lists three bitrates: 82Mb/s, 108Mb/s and 128Mb/s. A 100Mb Ethernet connection is more than capable of supporting "decent quality 4K video".

I have tried streaming 4K films from my PC to my TV, and it drops frames and stutters as the bitrate was exceeding the 100Mb/s connection on the TV.

The streaming services compress the hell out of the films and TV episodes, so they look rubbish compared to 4K Blu-ray.

You assume that the network stacks on all TVs offer 100% efficiency, but in reality they do not seem to, at least not with the Samsung 6400 & 9000, which both use wired connections yet still stutter with decent-quality material.
 
The context of aim18's comment about bitrate calculation was an uncompressed/raw stream, I thought?

Moreover, in terms of causes of 'glitching'/stuttering, I wondered whether a high-bitrate encode itself (like the 100Mb/s example I gave) is necessarily the most demanding for the hardware decode, or whether a more highly compressed encode (multi-pass, with frames referencing forwards and backwards) could be more demanding.
... so glitching might be a combination of bandwidth for the stream and hardware decode capability (are some hardware decoders more equal than others?)
 
Wrong. The compression standards for Bluray are MPEG-2, VC-1 and H.264. Bluray is the physical medium on which the compressed video, audio, special features, Bluray menu structure etc. are all stored.

You just said the "compression standards" "for Bluray" are MPEG-2, VC-1 and H.264.

You've literally said "Compression standards for bluray are..." AND you've gone on to define them....

...so how on earth can it be wrong to say that Blurays are inherently compressed video, and that the compression used, and thus the specific decoders required, are standardised across all Bluray players? The encoding used is still a defined standard for Bluray, even though there are various combinations to choose from, as well as audio encoders/decoders.

You are getting mixed up between the Blu-ray disc as a data storage medium and the Blu-ray player as a standardised video delivery and playback medium. When you're using it as a video delivery medium there are standards to adhere to, and compression is one of them (as are directory structure, menus, etc., as you mentioned).

Not really. The bitrate is not always fixed.
I've already mentioned this, sorry.

Hmm, that's an average of 66.6 megabits per second for the file. So considering it's variable bitrate, it's not surprising the network usage is surging to 90Mb/s, and to answer your question in the OP, I'd say it was perfectly normal.

As you can see, the keyword here is average, and I am already aware that the actual video bitrate is variable.

Without going into too much detail, the compression ratio is variable; achieving a fixed rate is done by controlling the quantisation parameter (Qp). However, this only ever produces approximate results. Companies spend vast amounts of time optimising their rate-control algorithms to provide good bitrate control.
Qp control is a major cause of compression artefacts, which are visible when the compressed stream is decoded.
Blu-rays will often have a more variable bitrate to reduce/remove these compression artefacts. So for scenes with little motion or movement the bitrate will be low, while for scenes with higher motion or explosions/random noise (e.g. smoke) the bitrate will increase, as there is more information and less redundancy in the data.

Thank you for your explanation of how complex images use up more data than simple images.
 
I have tried streaming 4K films from my PC to my TV, and it drops frames and stutters as the bitrate was exceeding the 100Mb/s connection on the TV.

The streaming services compress the hell out of the films and TV episodes, so they look rubbish compared to 4K Blu-ray.

You assume that the network stacks on all TVs offer 100% efficiency, but in reality they do not seem to, at least not with the Samsung 6400 & 9000, which both use wired connections yet still stutter with decent-quality material.

May I offer a simple explanation?

Do those Samsung TVs actually support hardware HEVC? It is far from automatic that just because the TV can hardware decode HEVC (H.265) via an HDMI port, it can via Ethernet or wireless.

My guess is the TV is software decoding and simply is not powerful enough.
 
May I offer a simple explanation?

Do those Samsung TVs actually support hardware HEVC? It is far from automatic that just because the TV can hardware decode HEVC (H.265) via an HDMI port, it can via Ethernet or wireless.

My guess is the TV is software decoding and simply is not powerful enough.

Actually, you made a slight mistake there, IMO.

If the TV is capable of hardware decoding that type of stream, then it will be capable of doing it via HDMI or Ethernet (or WiFi, but the latter is not wise, as it will be too slow for 4K). The same hardware decoder will be used no matter what the input.

How fast that hardware decoder is fed the stream by the input is another matter entirely.

Obviously, if the TV is doing it in software, the quality will be much worse, and more likely it wouldn't work at all.
 
You just said the "compression standards" "for Bluray" are MPEG-2, VC-1 and H.264.

You've literally said "Compression standards for bluray are..." AND you've gone on to define them....

...so how on earth can it be wrong to say that Blurays are inherently compressed video, and that the compression used, and thus the specific decoders required, are standardised across all Bluray players? The encoding used is still a defined standard for Bluray, even though there are various combinations to choose from, as well as audio encoders/decoders.

You are getting mixed up between the Blu-ray disc as a data storage medium and the Blu-ray player as a standardised video delivery and playback medium. When you're using it as a video delivery medium there are standards to adhere to, and compression is one of them (as are directory structure, menus, etc., as you mentioned).

What are you talking about? You said:

"Yes, Bluray is basically a standard for compression/encoding."

I said "compression standards for Bluray are...", not that Bluray is a compression standard. This is not the same thing :confused:

Blu-rays are physical discs.

"Blu-ray or Blu-ray Disc (BD) is a digital optical disc data storage format. It was designed to supersede the DVD format, in that it is capable of storing high-definition and ultra high-definition video resolution (2160p). The plastic disc is 120 mm in diameter and 1.2 mm thick, the same size as DVDs and CDs."

The video data stored on the disc, once extracted from the physical medium and the many data structures, is in a compressed format, i.e. H.264/MPEG-2/VC-1.
 
FrankJH, Domi, this is getting confusing ;)
If you are feeding the video via HDMI then that is not an HEVC/H.265 stream, just HDMI data at 12Gb/s+ etc.? The HEVC/H.265 decode has been done by the external device (TV box etc.).
But yes, I agree with your comments on s/w decode, and it would be interesting to know whether the Samsung glitching dimesion99 referenced was down to network or decoder capabilities (I suppose you could put the video on an SD card/USB and maybe identify which of the two).
 
FrankJH, Domi, this is getting confusing ;)
If you are feeding the video via HDMI then that is not an HEVC/H.265 stream, just HDMI data at 12Gb/s+ etc.? The HEVC/H.265 decode has been done by the external device (TV box etc.).
But yes, I agree with your comments on s/w decode, and it would be interesting to know whether the Samsung glitching dimesion99 referenced was down to network or decoder capabilities (I suppose you could put the video on an SD card/USB and maybe identify which of the two).

Whatever is playing the video is the thing decoding it. If it's a PC playing it, connected via HDMI out, then the PC is decoding it.
 