They may start off with the same source file.
How they stream it will vary massively, because different platforms use different methods and codecs. IIRC Netflix has probably the most advanced encoding pipeline of the lot: from memory they've spent a lot of time and money getting it to deliver the best possible image at pretty much whatever combination of resolution, bandwidth and decoding hardware you've got, with multiple sets of encoder settings optimized for different types of film and the ability to switch on the fly. Meanwhile there's a good chance at least some of the other platforms are using more generic tools, or have only optimized their streaming for specific hardware.
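To make the "same master, very different streams" point concrete, here's a rough sketch in Python that just shells out to ffmpeg to encode one clip at a few resolution/quality points, the way a simple bitrate ladder would. The filenames, ladder numbers and x264 settings here are purely illustrative, not anything a real service actually uses:

```python
# Rough sketch: encode the same master several ways and compare sizes.
# Assumes ffmpeg with libx264 is on your PATH; "master.mp4" is just a
# placeholder for any local test clip.
import subprocess
from pathlib import Path

SOURCE = "master.mp4"  # hypothetical input clip

# Toy "ladder": (label, scale filter, CRF). Real services tune these
# per title or even per shot; these numbers are made up.
LADDER = [
    ("1080p_hq", "scale=-2:1080", 20),
    ("1080p_lo", "scale=-2:1080", 27),
    ("720p",     "scale=-2:720",  24),
    ("480p",     "scale=-2:480",  26),
]

for label, scale, crf in LADDER:
    out = Path(f"out_{label}.mp4")
    subprocess.run(
        ["ffmpeg", "-y", "-i", SOURCE,
         "-vf", scale,
         "-c:v", "libx264", "-crf", str(crf), "-preset", "medium",
         "-an", str(out)],  # drop audio, video only for the comparison
        check=True,
    )
    size_mb = out.stat().st_size / 1e6
    print(f"{label}: CRF {crf}, {size_mb:.1f} MB")
```

Per-title or per-shot encoding is roughly this idea plus a quality metric in the loop, so the service can pick different numbers for a grainy war film than for a flat-looking cartoon.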
You can probably save a very significant chunk of bandwidth by letting the codec drop to a lower bitrate and tolerate more "blur" at times. On some content that won't matter, but if it happens on, say, human skin, viewers will pick up on it even if only subconsciously (I seem to remember a Netflix article on their encoders that covered some of this).
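If you want to see where that trade-off starts to hurt, ffmpeg has a built-in SSIM filter you can point at an encode and its source. A minimal sketch, assuming the files from the snippet above and that the encode matches the source's resolution (SSIM needs that):

```python
# Compare an aggressive encode against the source, frame by frame,
# using ffmpeg's ssim filter. Filenames are the ones from the sketch above.
import subprocess

ENCODE = "out_1080p_lo.mp4"  # the lower-quality encode
SOURCE = "master.mp4"        # assumed to also be 1080p

result = subprocess.run(
    ["ffmpeg", "-i", ENCODE, "-i", SOURCE,
     "-lavfi", "ssim", "-f", "null", "-"],
    capture_output=True, text=True,
)
# ffmpeg prints the SSIM summary on stderr, e.g. "... SSIM ... All:0.98 ..."
for line in result.stderr.splitlines():
    if "SSIM" in line:
        print(line)
```

It's a crude average though; a scene-wide score can look fine while one close-up of a face turns to mush, which is exactly the sort of thing the fancier per-shot systems are trying to catch.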
You see something similar with traditional broadcasts: different broadcasters use different bitrates, different encoders, etc., so one channel might look much better showing the same film than another simply because it used a higher bitrate or a better-tuned encoder.
Even DVDs and Blu-rays have this: early discs often looked worse than later ones, even when the early ones were pushing the format's bitrate to the limit, because the encoders got a lot better over time, and some companies simply had better tools or staff who knew how to get the most out of them.