Streaming Data - How much does it actually cost?

Given that everyone now charges more for 4K streaming and even HD, does the extra cost actually reflect the increased cost of the data, or is it another excuse to fleece the consumer? As you can only buy a 4K TV these days, that should be the default imo.
 
How many students have a TV? Many people just use their iPads, laptops, computers and phones. Not all of those are worth the extra for a 4K sub.
The point being it should be 4K as standard, not a big upcharge for it ;) If you're watching on a tiny screen you're really not bothered by image quality, as DVD sales attest. Given the quality of TVs around, streaming is still fairly poor, and I see a big difference in picture and sound quality with 4K discs. Even the better streamers are only at something like a 30Mbps bitrate. I was curious to see if there was anyone in the industry who had an idea whether the extra charges are justified, or if it's a case of 4K costing them 50p more than 1080p while we get charged £10 more.
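For a rough sense of the data volumes involved, here's a back-of-envelope sketch; the bitrates are just typical-ish assumptions on my part, not any particular service's figures:

```python
# Rough data volume for a 2-hour film at assumed, typical-ish bitrates.
bitrates_mbps = {
    "1080p stream": 8,
    "4K stream": 30,
    "UHD Blu-ray disc": 80,
}

film_length_s = 2 * 60 * 60  # 2 hours in seconds

for name, mbps in bitrates_mbps.items():
    gigabytes = mbps * film_length_s / 8 / 1000  # megabits -> gigabytes (decimal)
    print(f"{name}: ~{gigabytes:.0f} GB for a 2-hour film")
```

So a 4K stream is in the region of 27 GB per film against roughly 7 GB for 1080p, while a disc can be double that again.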
 
There are certainly more costs on the production side; in simple terms it costs four times more to create than HD, especially when you factor in Dolby Vision and Atmos. On the actual streaming-to-the-consumer side, not so much.
Is it really four times the cost? It's four times the pixels, but do the costs really scale linearly? I would have thought labour costs would be the same. Data costs are higher, of course. To be fair, a lot of the 4K content doesn't include Atmos or DV, or even any dynamic HDR.
 
It's probably an over-simplification, but there are areas where it does scale like that, storage and compute being two of them. There's more work for the humans too, as they have 4x the pixels to work on, so colour grading, QC, editing, graphics, VFX and CGI all take longer (although a lot of the CGI is still done at HD/2K to save costs). Then add more work on top for HDR passes, and again more for Dolby Atmos mixing, and it adds up.
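For what it's worth, the "4x the pixels" bit is easy to check. A quick sketch, looking purely at pixel counts and ignoring codec efficiency and chroma subsampling:

```python
# Pixel counts for common delivery resolutions, to show where the "4x" comes from.
resolutions = {
    "HD 1080p": (1920, 1080),
    "UHD 4K": (3840, 2160),
    "8K": (7680, 4320),
}

base = 1920 * 1080  # 1080p pixel count as the reference
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.0f}x the pixels of 1080p")
```

Which is why 8K is such a jump: it's 16x the pixels of 1080p, not just another doubling.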
Thanks. It sort of puts a nail in the whole idea of 8K, as 4K being universal is obviously at least a decade away, likely much more. As other replies have said, the fragmenting of streaming, rising charges and the introduction of adverts is taking us back to how things used to be with separate TV channels! I'm feeling more vindicated buying physical media now, at least for the good stuff; I stream the casual, watch-it-once viewing. So often I see a film on a streaming service and think I'll watch that, only to find it's extra money, or isn't in 4K, or has lower quality audio. There's a huge amount of rubbish on there too; I spend far too long sifting through the straight-to-video low-budget stuff on Prime trying to find anything worth watching.
 
AFAIK all of this stuff is already done at high resolution (sometimes greater than 4K) for it to go on the big screen. The actual time cost to the artist is minimal as long as they have the computing power.

I thought they worked at the highest resolution possible up to final output, at which point it is scaled down.
The CGI is usually done at 2K because there's a lot more processing/rendering time required at higher resolutions, especially for films like the Marvel series that are very CGI-heavy. I don't know about the other work.
 
How is that possible if they have 4K Blu-ray releases?
They put together the live action at 4K with the CGI scenes at 2K. It's why the CGI often looks softer on a 4K release and it's not a lot better than just buying the Blu-ray. Unless they're cheap I haven't bothered buying any Marvel titles in 4K. Ironically, older films shot on film really benefit from 4K and are often scanned at 8K or higher.
 
The more you know. Thanks.

Is this unique to Marvel or do other VFX-heavy films do this?

So that's why Avatar took so long. They were just waiting for the render to finish. :cry:
I think it's a general thing. I'm sure I read somewhere that there have been a few with 4K CGI, maybe it was the new Avatar. I expect there will be more over time as technology moves forward. You're probably not wrong; it looks like it was almost all CGI, and I bet it was a huge undertaking bearing in mind they started it years ago. I'm by no means an expert, but I've read around about physical media and the whole thing is a minefield once you dig into the different versions of films, different cuts, audio, formats, HDR etc. It's quite feasible you can argue on the internet with someone about a film and have literally had different experiences, both being right :cry:
 
They do film (actual film, which is then scanned, unless you're Tarantino or Nolan) and shoot (digital) at a higher-than-4K resolution in the camera, so the source material is higher, normally around 5 to 8K, which allows greater flexibility in editing. The edit, though, will be working towards a 4K (or HD/2K) output for the master, which is then used for everything else that I mentioned in my previous post.

As the source material is high res they can always go back and remaster for future resolutions. This is why old films shot on 35mm or 70mm look so good on their 4K HDR re-masters.

Funny you mention the big screen, most cinemas still only take a 2K resolution delivery in effectively SDR.
Just to add to this, a lot of the older films are being scanned now because the film stock is degrading and they need to preserve them. On a 4K Blu-ray they look fantastic, especially the 70mm ones like The Ten Commandments and Lawrence of Arabia.
 
Yeah that's right. I know someone who works at a lab; I can't remember what film it was, but they were scanning it at 16K 16-bit, so it was a GB per frame, 24GB of data per second!
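That roughly checks out if you run the numbers. The exact frame dimensions below are my assumptions (it depends on the film gauge and framing), but with 3 colour channels at 16 bits each you land in the right ballpark:

```python
# Back-of-envelope for an uncompressed 16K, 16-bit-per-channel film scan.
# Frame dimensions are assumptions; the real scan size depends on gauge and framing.
scans = {
    "16K, 16:9 framing": (15360, 8640),
    "16K, ~4:3 full aperture": (16384, 12288),
}

bytes_per_pixel = 3 * 2  # 3 colour channels x 16 bits (2 bytes) each
fps = 24

for name, (w, h) in scans.items():
    frame_gb = w * h * bytes_per_pixel / 1e9
    print(f"{name}: ~{frame_gb:.2f} GB per frame, ~{frame_gb * fps:.0f} GB/s at {fps} fps")
```

Depending on the framing that's roughly 0.8 to 1.2 GB per frame and somewhere around 20 to 30 GB per second, so "a GB per frame" is about right.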
I'd heard they were now scanning at 16K but couldn't find much on it. That will be good for a very long time to come.

I think the other thing often forgotten is viewing distance. 4K is no better than 1080p if you sit too far back from the screen.
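If anyone wants rough numbers on that, a common rule of thumb is that 20/20 vision resolves about one arcminute. The screen sizes below are just examples, and the one-arcminute figure is itself an approximation:

```python
import math

# Distance beyond which a single pixel subtends less than ~1 arcminute,
# i.e. roughly where extra resolution stops being visible. Assumes a 16:9 screen.
def max_useful_distance_m(diagonal_in, horizontal_pixels):
    width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # screen width in metres
    pixel_pitch_m = width_m / horizontal_pixels
    return pixel_pitch_m / math.tan(math.radians(1 / 60))

for diag in (55, 65):
    d_1080 = max_useful_distance_m(diag, 1920)
    d_4k = max_useful_distance_m(diag, 3840)
    print(f'{diag}" TV: beyond ~{d_1080:.1f} m even 1080p detail drops below 1 arcminute, '
          f'so 4K adds little; the full 4K benefit needs roughly {d_4k:.1f} m or closer.')
```

So on a 55" set you'd want to be sitting within a couple of metres before 4K makes much visible difference over 1080p.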
 
A couple of points. A lot of the discussion here has focused on the resolution being 4X that of 1080p. IMO, @Mr_Sukebe hit the nail on the head.


4K typically uses a far more efficient codec than 1080p, so in terms of data storage and streaming bandwidth, UHD needs only around 2.5x the bandwidth of 1080p rather than 4x. However, because it's a different codec (H.265 vs H.264), I'm guessing the services also need to store files in both versions, since there will still be some devices around that can't decode H.265.
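To make that concrete with some assumed, typical-ish bitrates (illustrative figures only, not any service's actual numbers):

```python
# Illustrative: 4x the pixels does not mean 4x the bandwidth, because
# H.265/HEVC compresses considerably better than H.264. Bitrates are assumed.
h264_1080p_mbps = 8    # assumed 1080p H.264 stream
h265_4k_mbps = 20      # assumed 4K H.265 stream

def gb_per_hour(mbps):
    return mbps * 3600 / 8 / 1000  # megabits/s over an hour -> gigabytes

print(f"4K/1080p bandwidth ratio: ~{h265_4k_mbps / h264_1080p_mbps:.1f}x (despite 4x the pixels)")
print(f"1080p H.264: ~{gb_per_hour(h264_1080p_mbps):.1f} GB/hour")
print(f"4K H.265:    ~{gb_per_hour(h265_4k_mbps):.1f} GB/hour")
```

On those assumed figures you get the ~2.5x ratio: roughly 9 GB/hour for 4K against 3.6 GB/hour for 1080p.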

Honestly though, I believe that these costs are a minor consideration. The driver here is supply and demand. Services charge what they can. For example, Sky still charges extra for HD at a time when nearly all new TVs are 4K.

Totally agree, hence I won't pay extra for 4K streaming content; at that point I wait and buy the better physical version. If it's just a TV show I'm not that bothered. Sky really take the michael charging for HD; SD is unwatchable to me on a decent 4K TV. I can see a big realignment coming in the streaming space: too many players with too little content, charging too much, and soon to include adverts. I've looked around three separate streaming services recently and struggled to find anything worth watching or that I hadn't seen before.
 