Streaming Data - How much does it actually cost?

Given that everyone is now charging more for 4K streaming, and even for HD, does the extra cost actually reflect the increased cost of the data, or is it another excuse to fleece the consumer? As you can only buy a 4K TV these days, that should be the default imo.
 
How many students have a TV? Many people just use their iPads, laptops, computers or phones. Not all of those are worth the extra for a 4K sub.
The point being it should be 4K as standard, not a big upcharge for it ;) If you're watching on a tiny screen you're really not bothered by image quality, as DVD sales attest. Given the quality of the TVs around, streaming is still fairly poor; I see a big difference in picture and sound quality with 4K discs. Even the better streamers are only at something like a 30Mbps bitrate. I was curious whether there was anyone in the industry who had an idea if the extra charges are justified, or whether it's a case of 4K costing them 50p more than 1080p while we get charged £10 more.
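To put rough numbers on the delivery side (the ~30Mbps figure is from above; the 1080p bitrate and the per-GB delivery price are purely my own assumptions for illustration, not anyone's real CDN rates):

```python
# Rough data-volume estimate for a single 2-hour stream.
# Bitrates are illustrative: ~30 Mbps is about the top end of
# streaming 4K; ~5 Mbps is a typical 1080p stream.
def stream_gb(bitrate_mbps: float, hours: float) -> float:
    """Data transferred in gigabytes (1 GB = 8000 megabits)."""
    return bitrate_mbps * hours * 3600 / 8000

hd_gb = stream_gb(5, 2)    # 4.5 GB for two hours of 1080p
uhd_gb = stream_gb(30, 2)  # 27 GB for two hours of 4K

# At an assumed bulk delivery rate of $0.01/GB, the per-viewing
# cost gap between HD and 4K is pennies, not pounds:
rate = 0.01  # $/GB, an assumption for illustration only
print(f"HD: {hd_gb:.1f} GB, ~${hd_gb * rate:.2f} to deliver")
print(f"4K: {uhd_gb:.1f} GB, ~${uhd_gb * rate:.2f} to deliver")
```

So even at six times the bitrate, the raw delivery cost per stream looks like small change; whatever justifies the upcharge, it probably isn't bandwidth alone.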
 
There are certainly more costs on the production side; in simple terms it costs four times more to create than HD, especially when you factor in Dolby Vision and Atmos. On the actual streaming-to-the-consumer side, not so much.
 
There are certainly more costs on the production side; in simple terms it costs four times more to create than HD, especially when you factor in Dolby Vision and Atmos. On the actual streaming-to-the-consumer side, not so much.
Is it really four times the cost? It's four times the pixels, but do the costs really scale linearly? I would have thought labour costs would be the same; data costs higher, of course. To be fair, a lot of 4K content doesn't include Atmos or DV, or even any dynamic HDR.
 
I’ll always pay more for 4k even though in an ideal world it would be standard. I have an expensive oled tv and I want to do it justice.

My bigger issue is the streaming services that don’t even have 4k available. Paramount plus for example, absolute joke that we don’t get 4k, Atmos and hdr.

The more worrying thing for me is not the move to charge for 4k but how adverts are creeping into paid services…..

Either way this streaming bubble will burst. The market will not sustain a continuing rise in prices, more and more adverts and the sheer amount of streaming services.

We are well into double figures for available paid streaming apps now. How long before every show has its own streaming app lol. At the moment I'm on Disney plus (annual), Paramount plus (annual), Prime (annual), Netflix, Apple TV+ and Now TV. I'm looking to cut down when it comes to renewal; there are far too many and I'm getting fed up of the fragmentation, with some shows having seasons spread across different apps.

Funny how streaming services were making people switch from the likes of sky etc and it’s almost gone full circle. Looking at what sky offer now it’s almost tempting to go back to their full service….
 
I believe Linus Tech Tips did a video on this; I think it was called something along the lines of "Should YouTube charge for 4K?"

From memory, the big cost items were storage of the videos and the bandwidth to send them out. Storage has stagnated in price and is no longer scaling down as it once did.
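As a rough sketch of the storage side, here's what one 2-hour title might occupy across an adaptive-bitrate ladder (the rung bitrates are my assumptions; real services store far more variants across codecs, HDR formats and audio mixes, so treat this as a lower bound):

```python
# Illustrative adaptive-bitrate ladder for one 2-hour title.
# Rung bitrates are assumptions, loosely modelled on typical
# streaming ladders; they are not any service's real numbers.
ladder_mbps = {
    "480p": 1.5,
    "720p": 3.0,
    "1080p": 5.0,
    "4K": 16.0,  # average VBR rate; peaks run higher
}

hours = 2
rung_gb = {name: b * hours * 3600 / 8000 for name, b in ladder_mbps.items()}
total_gb = sum(rung_gb.values())
uhd_share = rung_gb["4K"] / total_gb

print(f"Total stored for one title: {total_gb:.1f} GB")
print(f"4K rung alone: {rung_gb['4K']:.1f} GB ({uhd_share:.0%} of the total)")
```

The point of the sketch: adding a 4K rung can easily more than double the storage for a title, and that cost repeats across every encode variant and every region's CDN caches.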
 
Is it really four times the cost? It's four times the pixels, but do the costs really scale linearly? I would have thought labour costs would be the same; data costs higher, of course. To be fair, a lot of 4K content doesn't include Atmos or DV, or even any dynamic HDR.
It's probably an oversimplification, but there are areas where it scales like that, storage and compute being two. There is more work for the humans too, as they have 4x the pixels to work on, so colour grading, QC, editing, graphics, VFX and CGI (although a lot of that is still HD/2K to save costs) all take longer. Then add more work on top for HDR passes, and again more for Dolby Atmos mixing; it adds up.
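The 4x figure for pixels, at least, is exact arithmetic:

```python
# Pixel counts behind the "4x the pixels" claim.
hd = 1920 * 1080    # 2,073,600 pixels per 1080p frame
uhd = 3840 * 2160   # 8,294,400 pixels per UHD "4K" frame

# Both dimensions double, so the per-frame pixel count is
# exactly four times higher - which is why storage, render
# and compute costs can plausibly scale by about that factor.
print(uhd / hd)  # 4.0
```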
 
It's probably an oversimplification, but there are areas where it scales like that, storage and compute being two. There is more work for the humans too, as they have 4x the pixels to work on, so colour grading, QC, editing, graphics, VFX and CGI (although a lot of that is still HD/2K to save costs) all take longer. Then add more work on top for HDR passes, and again more for Dolby Atmos mixing; it adds up.
Thanks. It sort of puts a nail in the whole idea of 8K, as 4K being universal is obviously at least a decade away, likely much more. As other replies have said, the fragmenting of streaming, rising charges and the introduction of adverts are taking us back to how things used to be with separate TV channels! I'm feeling more vindicated buying physical media now, at least for the good stuff; I stream the casual, watch-it-once viewing. So often I see a film on a streaming service and think I'll watch that, only to find it's extra money to watch, or isn't in 4K, or has lower-quality audio. There is also a huge amount of rubbish on there too; I spend far too long sifting through the straight-to-video, low-budget stuff on Prime trying to find anything worth watching.
 
but there is more work for the humans too as they have 4x the pixels to work on so colour grading, QC, edit, graphics, VFX, CGI (although a lot of that is still HD/2K) to save costs.
AFAIK all of this stuff is already done at high resolution (sometimes greater than 4K) for it to go on the big screen. The actual time cost to the artist is minimal as long as they have the computing power.

I thought they worked at the highest resolution possible up to final output, at which point it is scaled down.
 
AFAIK all of this stuff is already done at high resolution (sometimes greater than 4K) for it to go on the big screen. The actual time cost to the artist is minimal as long as they have the computing power.

I thought they worked at the highest resolution possible up to final output, at which point it is scaled down.
The CGI is usually done at 2K because there's a lot more processing/rendering time required, especially for films like the Marvel series that are very CGI-heavy. I don't know about the other work.
 
How is that possible if they have 4K Blu-ray releases?
They put together the live action at 4K with the CGI scenes at 2K. It's why the CGI often looks softer on a 4K release, and why it's not a lot better than just buying the Blu-ray. Unless they are cheap, I haven't bothered buying any Marvel titles in 4K. Ironically, older films shot on film really benefit from 4K and are often scanned at 8K or higher.
 
They put together the live action at 4K with the CGI scenes at 2K. It's why the CGI often looks softer on a 4K release, and why it's not a lot better than just buying the Blu-ray. Unless they are cheap, I haven't bothered buying any Marvel titles in 4K. Ironically, older films shot on film really benefit from 4K and are often scanned at 8K or higher.
The more you know. Thanks.

Is this unique to marvel or do other Vfx heavy films do this?

So that’s why avatar took so long. They were just waiting for the render to finish. :cry:
 
The more you know. Thanks.

Is this unique to marvel or do other Vfx heavy films do this?

So that’s why avatar took so long. They were just waiting for the render to finish. :cry:
I think it's a general thing. I'm sure I read somewhere that there have been a few with 4K CGI; maybe it was the new Avatar. I expect there will be more over time as technology moves forward. You're probably not wrong, it looks like it was almost all CGI; I bet it was a huge undertaking, bearing in mind they started it years ago. I'm by no means an expert, but I've read around about physical media and the whole thing is a minefield once you dig into the different versions of films, different cuts, audio, formats, HDR etc. It's quite feasible you could argue on the internet with someone about a film and have literally had different experiences, both being right :cry:
 
AFAIK All of this stuff is already done at high resolution (sometimes greater than 4K) for it to go on the big screen. The actual time to the artist is minimal as long as they have the computing power.

I thought they worked at the highest resolution possible upto final output at which point it is scaled down.
They either shoot on actual film, which is then scanned (unless you're Tarantino or Nolan), or shoot digitally at a higher-than-4K resolution in camera, so the source material is higher, normally around 5K to 8K, which allows greater flexibility in editing. The edit, though, will be working towards a 4K (or HD/2K) output for the master, which is then used for everything else I mentioned in my previous post.

As the source material is high-res, they can always go back and remaster for future resolutions. This is why old films shot on 35mm or 70mm look so good in their 4K HDR remasters.

Funny you mention the big screen; most cinemas still only take a 2K-resolution delivery, in effectively SDR.
 
They either shoot on actual film, which is then scanned (unless you're Tarantino or Nolan), or shoot digitally at a higher-than-4K resolution in camera, so the source material is higher, normally around 5K to 8K, which allows greater flexibility in editing. The edit, though, will be working towards a 4K (or HD/2K) output for the master, which is then used for everything else I mentioned in my previous post.

As the source material is high-res, they can always go back and remaster for future resolutions. This is why old films shot on 35mm or 70mm look so good in their 4K HDR remasters.

Funny you mention the big screen; most cinemas still only take a 2K-resolution delivery, in effectively SDR.
Just to add to this: a lot of the older films are being scanned now because the film stock is degrading and they need to preserve them. On 4K Blu-ray they look fantastic, especially the 70mm ones like The Ten Commandments and Lawrence of Arabia.
 