Can optical (SPDIF) change the sound?

Someone said to me that one unit's SPDIF output can sound better than another's.

Now this confused me, as I thought that, given the same digital source, all either device would do is transfer the same (unchanged) digital data down the optical cable?

How can either unit make it sound better or worse?
 
SPDIF is not a "jitter"-free interface, so different units CAN affect the performance depending on how good they are. It's a bit relative though, as how noticeable this is will depend on the mating equipment.
 
No, jitter is measurable but you cannot possibly hear it unless you artificially introduce an amount many, many times beyond what any equipment actually has.
 
Some units allow volume control. In these cases, if the unit/device is changing the volume within the binary data, isn't it then changing the binary data? Unless there's a specific data value for volume which it is just adjusting, so the audio data itself remains unchanged?
 
That is simply not true. If the bitstream is changing through jitter, whatever is doing the processing at the other end is having to 'guess'. This guessing is at least likely to affect the shape of the output waveform compared to the original, so to say you cannot possibly hear it is false.

Unlikely is a far better term to use rather than the absolute "cannot possibly".

Run simulations in MATLAB and compare the output waveforms with and without jitter and calculate the SNR. If one unit is adding significant jitter (however that may come about) and the other is not, there is significant potential for an audible difference.
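For anyone wanting to try this without MATLAB, here is a minimal Python/NumPy sketch of the same idea (all numbers are illustrative assumptions, not measurements of any real unit): sample a 1 kHz tone with a Gaussian-jittered sampling clock, treat the difference from the ideally sampled tone as noise, and compute the SNR:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 48_000            # sample rate, Hz
f = 1_000              # test tone frequency, Hz
n = np.arange(fs)      # one second of samples

def tone_snr_db(jitter_rms_s):
    """SNR (dB) of a sine sampled with Gaussian clock jitter vs an ideal clock."""
    t_ideal = n / fs
    t_jittered = t_ideal + rng.normal(0.0, jitter_rms_s, n.size)
    clean = np.sin(2 * np.pi * f * t_ideal)
    dirty = np.sin(2 * np.pi * f * t_jittered)
    noise = dirty - clean
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

for j in (1e-9, 10e-9, 100e-9):   # 1 ns, 10 ns, 100 ns RMS jitter
    print(f"{j * 1e9:5.0f} ns RMS jitter -> SNR ~ {tone_snr_db(j):.0f} dB")
```

In this toy model even 100 ns of RMS clock jitter, far above what typical hardware exhibits, still leaves roughly a 64 dB SNR at 1 kHz, which gives some sense of the scale being argued about. Note this models jitter at the D/A conversion clock itself, which is the only mechanism by which interface jitter could matter at all.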
 
You're going to need to be more specific and not use odd terms like "units."

All S/PDIF interfaces are equal, anything that differs occurs before or after the interface. Anything that adjusts volume has nothing to do with S/PDIF.
 

The bitstream is not changed by jitter, the data is identical and a changed waveform does not mean audibility, it's just a changed waveform of S/PDIF data.

"cannot possible" or "impossible" are perfectly accurate and I already qualified this. There is no equipment that produces an amount of jitter that is audible, or anything close to audible. Therefore.....
 

Err... what?!

If you view the on-the-wire bitstream on a scope, you will see that it looks like /---\__/----\ rather than |----|__|----|, so the clocking intersects the digital waveform at some point on a distinctly analogue-looking slope. If the clocking is good, it won't matter, because the signal will be clearly above the required voltage (let's say 1V for argument's sake) to signal a 1. If you introduce jitter, the signal might be somewhere between 0 and 1V on the stroke of the clock, so it could be read as a 0 when it really ought to be a 1, and vice versa.
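A toy model of that slicing decision in Python (the ramp shape, rise time and threshold are made-up illustrative values, not S/PDIF electrical specs): the edge is a linear 0-to-1 V ramp and the receiver reads the line at a clock instant; a nominal instant lands safely past the ramp and reads correctly, while a jittered instant can land on the slope and flip the bit:

```python
def slice_bit(t_sample, t_edge, rise=10e-9, v_thresh=0.5):
    """Slice a 0 -> 1 V rising edge (linear ramp over `rise` seconds) at t_sample."""
    v = min(1.0, max(0.0, (t_sample - t_edge) / rise))  # voltage on the ramp
    return 1 if v >= v_thresh else 0

t_edge = 0.0
print(slice_bit(t_edge + 20e-9, t_edge))  # nominal instant, well past the edge: reads 1
print(slice_bit(t_edge + 2e-9, t_edge))   # jittered instant, still low on the slope: reads 0
```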

A parity check will flag this should a word not have the expected parity (but what if two bits are changed, so the parity comes out right while the word is wrong?). Parity can only detect such an error, not correct it, so the receiver has to conceal the flagged sample somehow, and as soon as that concealment kicks in there is a mathematical loss occurring; that is why jitter matters.
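A quick Python illustration of that parity limitation (generic even parity over an arbitrary word, not the actual S/PDIF subframe layout): a single flipped bit changes the parity and is detected, while two flipped bits cancel out and sail through unnoticed:

```python
def parity_bit(bits):
    """Even-parity bit over a sequence of 0/1 values."""
    return sum(bits) % 2

word = [1, 0, 1, 1, 0, 0, 1, 0]
p = parity_bit(word)                  # parity transmitted alongside the word

single = list(word)
single[2] ^= 1                        # one bit flipped in transit
assert parity_bit(single) != p        # parity mismatch: error detected

double = list(word)
double[2] ^= 1
double[5] ^= 1                        # two bits flipped in transit
assert parity_bit(double) == p        # parity still matches: error missed
```

And even when a mismatch is detected, the parity bit carries no information about which bit flipped, so detection is the best it can do.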
 
All types of jitter can be engineered out. Focus jitter, frequency jitter, aperture jitter, level errors and timing errors are all by-products of the technology used, and how much 'jitter-related distortion' you get depends on how aggressively you wish to correct these problems.

So yes, one SPDIF interface can be better than another, just as one example of the same make and model of car can be better than another: the tolerances may be better on one than the other.

We are talking in terms of 99.999%'s though.
 
So we're now arguing that "all digital devices sound the same"... so all transports, PC outputs and the rest will have no effect, be it down to jitter or anything else?

So jitter has no effect on sound quality is the claim by some!?! :eek: It's like rolling back the clock 30 years... ROFL
 
As someone who spent some time playing with PCs, Macs, DVD players, LD players and CD transports, comparing them all into older DACs, both naked and reclocked, cleaning up jitter and so on, I do not believe all digital sources sound the same or are equal.

Digital output circuits are certainly not all manufactured to the same standards, optical in particular. And I am still unaware of any technical investigation into the detrimental effect of PC power supplies and other PC circuits on audio; I am sure all we still have is opinion on such.

Quite simply some of the worst sounding results I have had were from PC digital, with spdif direct from motherboard, about as clean and straight a line as could be achieved, yet easily made better sounding by utilising a M-Audio sound card instead.

As for jitter, we have correlated and uncorrelated jitter, at different levels with different equipment and, to some, no real correlation with perceived sound quality. Unless you measure that jitter over more than the narrow band manufacturers specify, and listen to the music with that jitter addressed, well, it's only guesswork; there are a hell of a lot of variables and other factors that affect PC sound quality.

I find removing the high jitter levels from my PC and sorting out the clocking errors to give a worthwhile improvement with some media players and decent quality music.

Now this is not saying one unit's spdif is going to sound noticeably different from another's; two identical boards will not be differentiated in listening tests, in my opinion. A poorly specified digital cable may have some detrimental effect that may be noticeable, and one motherboard's digital output may vary from another's. A Lynx soundcard will sound like a Lynx soundcard; it will only be affected by the drivers, source data transmission, and media player. No one seems capable of measuring the different effects of different circuits, OS-run background apps and all the other internal goings-on within a PC to justify any real performance loss, yet most audiophile PCs rely heavily on modded OSs and hardware. Some radio users have found PC power supplies to affect their equipment, and it does seem worth investigating: Naim found their own PSU to affect their first audio server.
 

Well, that is exactly what modern DAC manufacturers' goals seem to have been: being less transport dependent, for instance, makes critical transport choice somewhat redundant.
 
Oh, the silliness is in full swing today. Unless there's a fault, then when using S/PDIF everything sent is received bit-for-bit identical. 'Jitter' affects the timing of receipt, and the claim is that it can have an effect on what you later hear after the whole D/A stage. It simply cannot. On even the most 'jittery' equipment it is completely beyond human perception.

Jitter is a legitimate topic, but it is not an audio concern; unfortunately audiophilia/ignorance have made it one.

Incidentally, HDMI has a lot more jitter than S/PDIF as I recall. I wonder if this affects PQ? ;)
 

Just out of interest, what qualifies you to make such sweeping and bold statements?
As I think there are many companies, engineers and R&D departments that have been working at reducing this "jitter" that you suggest is inaudible...
 

I think I made it quite clear that I'm not suggesting anything; I'm stating this as a fact. If you're going to claim that barely measurable jitter pre-D/A stage can have audible effects, then the onus is on you.

The only area concerned with this is the audiophile one. So these "companies, engineers and R&D departments", which are often one and the same man across many operations, are just working hard to further the myth. Not that they actually do a great deal anyway.

Hell, if you like you can go to an audiophile forum and you'll find 'jitter' repeatedly debunked not that it stops the crazies or greedy individuals and businesses.
 

The magazines have already produced reviews which subjectively show that audio via HDMI from low-jitter Blu-ray players, such as Pioneer's, sounded better in blind listening tests.

The problems caused have been documented in a few magazines as clearly affecting sound quality. Companies such as Benchmark and Weiss designed DACs to combat such problems because it was a concern, not because they found all digital feeds identical.

It can be heard, but you usually have to listen; whether there is enough jitter to make enough difference is subjective. SPDIF is not perfect, and neither is Toslink; there is plenty to google and many well-educated discussions regarding such. The fact is that most DAC manufacturers addressed it, so it is less of a problem for many.
I reclock my digital stream, and it is also buffered in memory with metadata removed before playback. In certain systems with certain equipment it makes a worthwhile improvement.

To be honest, if jitter were not a problem, DACs such as the Benchmark, Weiss or Ayre Acoustics QB-9 asynchronous USB DAC, which all address it, would not exist. The high-end sound cards would not be so well designed.

Edit: in the real world, in some pro audio circles and studios, you may find master clocks and other hardware, some of which measure and display errors, jitter and clock inaccuracy as they do their job.
My own shows the PC to have the highest clock inaccuracy in ppb (parts per billion) when connected directly with spdif from the motherboard header.
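For reference, the ppb figure such hardware reports is just the fractional frequency error scaled by a billion; a hypothetical helper to show the arithmetic:

```python
def clock_error_ppb(f_measured, f_nominal):
    """Fractional frequency error expressed in parts per billion."""
    return (f_measured - f_nominal) / f_nominal * 1e9

# A 48 kHz clock measured 0.24 Hz high is off by 5 ppm, i.e. about 5000 ppb
print(clock_error_ppb(48_000.24, 48_000.0))
```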

I am also sure that if you google and investigate enough, you will find there is no simple answer, and most arguments come from simplistic naysayers on both sides with narrow-minded opinions based on what they have read.
 
You have to be careful when you use that 'M'-word - that's normally synonymous with 'you can happily disregard everything you are about to read in this post after the 'M'-word'.

...so, what magazines did those tests? Were they properly controlled ABX tests? Did they detail how they conducted the tests at all?
 