absolute rubbish - there are a whole host of processing and component quality issues between source and output.
according to your logic every vinyl deck, every video player, every cd player and every DVD player would play/look/sound the same as each other as well.
difficult one.
as the source quality increases, the baseline also increases. we arent talking about picking up an analogue recording on vinyl, or any analogue recording, so forget them. cd players have a whole host of different techniques available to them for reading a cd and creating an audio output - some of them oversample (by varying amounts), some of them dont oversample at all. by their very nature they WILL be different, and this is before they output anything. analogue outputs depend a great deal on the quality of the electronics used, and the same can be said for any cable carrying an analogue signal. as far as video goes, again there are hundreds of different approaches to constructing an output from whats read off the disc. some are vastly better at deblocking than others, some have better colours, etc.
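to make the oversampling point concrete - this is just a toy sketch of the general idea, not any actual player's filter design - the basic trick is to zero-stuff the samples up to a higher rate and then low-pass away the spectral images. real DAC filters use far longer kernels than this crude moving average, and that's exactly where players differ:

```python
import math

def oversample_4x(samples):
    """naive 4x oversampling: zero-stuff, then low-pass with a short FIR.
    real players use long, carefully designed filters - this is the concept only."""
    # 1. insert 3 zeros between each sample (scaled by 4 to preserve amplitude)
    stuffed = []
    for s in samples:
        stuffed.extend([s * 4.0, 0.0, 0.0, 0.0])
    # 2. crude 4-tap moving-average low-pass to suppress the images
    taps = [0.25, 0.25, 0.25, 0.25]
    out = []
    for i in range(len(stuffed)):
        acc = 0.0
        for j, t in enumerate(taps):
            if i - j >= 0:
                acc += t * stuffed[i - j]
        out.append(acc)
    return out

# a 1 kHz sine sampled at 44.1 kHz, taken up to 176.4 kHz
src = [math.sin(2 * math.pi * 1000 * n / 44100) for n in range(64)]
up = oversample_4x(src)
print(len(up))  # 4x as many samples as the input
```

the choice of ratio and filter length changes the output before a single analogue component gets involved, which is why no two transports are guaranteed to measure the same.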
when you move over to (much) higher bitrate sources, it becomes that much harder for premium players to set themselves apart. these days everything is digital - your bluray is read and decoded, and doesnt need deblocking or any kind of noise-defeating techniques. it doesnt need anything, basically. the audio is lossless and doesnt need any manipulating when you can output the stream, totally untouched, to an av amp.
Now, apparently, there is a difference between letting, say, the ps3 decode HD audio on a bluray and output it as lpcm, and letting a bd35 bitstream the native TrueHD/dts:ma track to an amplifier. i say apparently because i dont have the inclination to try a standalone with my setup - my ps3 more than suffices.
when you're transporting everything in digital, what it comes down to in the end is jitter. and thats just a bag of roflcopters from start to finish. ill give you an idea: ive had dozens of sound cards over the years and there has been bugger all difference in sound quality between any of them over spdif, apart from the early cards, which all had a bit of a problem with audio in general. im thinking of my old soundstorm-equipped abit nf7-s here. the output was not perfect, be it spdif or (especially) analogue. everything new ive tried has been damn near identical and i cant honestly tell the difference between realtek onboard, an xonar d2, my laptop or my htpc's hdmi outputs when it comes to 44.1khz, 24 bit audio. there's so little in it i honestly cant hear it.
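for anyone wondering what jitter actually does, here's a toy illustration (my own sketch, nothing to do with any particular card): sample a sine at slightly wrong instants and the timing error comes out as noise, and it scales with both the tone's frequency and the amount of jitter. the point being that with modern clocks the numbers involved are tiny:

```python
import math
import random

def jitter_error_rms(freq_hz, fs_hz, jitter_rms_s, n=10000, seed=1):
    """sample a unit sine at nominal vs jittered instants and return the
    rms of the difference - i.e. the jitter-induced noise floor."""
    rng = random.Random(seed)
    err2 = 0.0
    for i in range(n):
        t = i / fs_hz                          # ideal sample instant
        tj = t + rng.gauss(0.0, jitter_rms_s)  # instant with gaussian clock jitter
        ideal = math.sin(2 * math.pi * freq_hz * t)
        actual = math.sin(2 * math.pi * freq_hz * tj)
        err2 += (actual - ideal) ** 2
    return math.sqrt(err2 / n)

# a 10 kHz tone at 44.1 kHz: 1 ns rms jitter vs a (terrible) 100 ns
print(jitter_error_rms(10000, 44100, 1e-9))    # tiny error
print(jitter_error_rms(10000, 44100, 100e-9))  # roughly 100x larger
```

with nanosecond-class clocks the error sits way below anything audible, which squares with my experience of modern outputs all sounding identical.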
video, also. both my laptop and my htpc are capable of playing a bluray, just like my ps3 is, and again there's nothing in it on picture quality or sound quality. theories are all well and good but the proof is in the testing, and ive found it all a bit underwhelming tbh. to improve on any of them i think id need to be aiming for balanced 7.1 outputs into a full set of balanced amplifiers, but i just dont have £10k to spend and i wouldnt want to if i did lol.