Fake QED Reference Cable?

The guys in my local hi-fi shop come across as genuine people, but they too have been fooled by being given free cables to take home, and now they effectively try to convince me whenever I go in. The fact that I bought a DAC and didn't want a £50 USB cable really confused them lol.
 
I think they are thinking more in relation to sound. Everybody says it's 1s and 0s, but the actual signal is still analogue - a very fast analogue signal - which is timed at only one end and has only so many cycles. That leads to errors in timing, which is the cause of jitter, and HDMI suffers badly from it.

I think that is correct
 
Copy-pasta I've used before:

I'll try to explain nontechnically.

Digital is 1s and 0s -- in concept. But digital exists in an analog medium. Cables can suffer interference, changing 0s to 1s and 1s to 0s. Uh oh! That's bad! Fortunately, this problem has been solved for decades and dealing with it is pretty ordinary. The answer, in short? Error detection and correction.

First, let's start with a little "error detection 101". Let's say we want to send someone a series of numbers over a noisy channel, and we want to be 100% sure they get there intact, or not at all. Here's an idea: in addition to sending the numbers, add them all up and send the sum too at the end, so the receiver can verify. Example:

Send: 5 2 9 3 8 2 4 (33) <-- sum
Receive: 5 2 9 4 8 2 4 (33)

Uh oh. I received something that doesn't add up! But that's good... we've successfully detected an error! There's no uncertainty here at all* because we know when errors occur! Hmm... but really, no uncertainty at all?

*What is the probability that a random distortion changes the numbers and the error goes undetected? Well, you'd have to change one number upwards and another number downwards by just the right amount so the sum still adds up. Or you'd have to change one number and change the sum too. The odds of that happening depend on the range of numbers, the type of distortion possible, etc... but it's extremely unlikely. Still, it's not unlikely enough: one in a billion can still happen. So what next?
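(For anyone who wants to see the toy scheme spelled out, here it is in a few lines of Python. It's just the example above, nothing any real link actually uses.)

```python
# Toy checksum scheme from the example above: send the numbers plus their
# sum, and have the receiver re-add them to spot corruption.

def make_packet(numbers):
    """Append a simple checksum (the sum) to the data."""
    return numbers + [sum(numbers)]

def verify_packet(packet):
    """Return (data, ok); ok is False when the checksum doesn't match."""
    *data, checksum = packet
    return data, sum(data) == checksum

sent = make_packet([5, 2, 9, 3, 8, 2, 4])   # [5, 2, 9, 3, 8, 2, 4, 33]
received = sent.copy()
received[3] = 4                             # noise turns the 3 into a 4

data, ok = verify_packet(received)
print(ok)   # False -- the error is detected, so we know not to trust the data
```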

Would you believe it if I told you... this is a stupidly simplified example? In reality, error detection codes are much more complex than this (not usually just a simple sum) and incredibly robust to the point where it's probably more likely that the sun spontaneously explodes than an error gets through undetected (that is, if we're actually trying to detect them -- and we almost always do -- certainly in HDMI we do).
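To give a rough feel for what "much more complex than a simple sum" means, here's a sketch using CRC-32 from Python's standard library. I'm not claiming this is the exact check HDMI uses; it's just an everyday example of a code that is far harder to fool than a plain sum.

```python
import zlib

payload = bytes([5, 2, 9, 3, 8, 2, 4])
crc = zlib.crc32(payload)                 # 32-bit check value sent alongside the data

corrupted = bytes([5, 2, 9, 4, 8, 2, 4])  # the same single-value error as before
print(zlib.crc32(corrupted) == crc)       # False -- detected

# Unlike a plain sum, a CRC also catches "compensating" errors, e.g. one
# value bumped up and a neighbouring value bumped down by the same amount:
compensating = bytes([5, 2, 9, 4, 7, 2, 4])
print(sum(compensating) == sum(payload))  # True  -- the simple sum is fooled
print(zlib.crc32(compensating) == crc)    # False -- the CRC is not
```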

Now my point in saying all this is simple. We can build a machine (an HDMI TV) that can literally count the number of errors where the signal deviates from the bit-perfect original (and know exactly which pixels are affected). We can identify how many and which incorrect pixels were received with astounding certainty - I'm talking about certainty so high that it would be more likely to win the lottery a million times in a row than for an error to be misidentified.

Now, here's an exercise for the reader: for regular HDMI cable lengths, can you guess how often such errors occur? Exercise #2: when they do occur, is it possible for such errors to affect contrast/sharpness? (Hint: not a chance.)

...

I don't think HDMI has any protocol to re-send missing pixels, simply because it's a pointless waste of time to do so. If a pixel is missing (a very rare event) the display can fill it in based on a blend of its neighbors. Blu-ray 1080p uses MPEG-2 or MPEG-4 AVC, neither of which comes anywhere near the quality level at which a single blurred pixel would be detectable, even if that blurred pixel remained blurred for a long time (it doesn't). This is fairly easily proven mathematically via entropy, mean-squared error, or any other psychovisual error metric.

As for the frequency of such errors, HDMI requires a bit error rate BETTER than 10^-9 as a bare minimum requirement.

http://www.comlsi.com/Cat5_BER_EQ2.pdf

In other words, an HDMI-spec-compliant cable will lose no more than 1 bit of data per billion bits on average. As that article states, this amounts to one incorrect pixel for 1/60th of a second occurring at MOST every ~8 seconds (again keep in mind this is the absolute worst case of the cheapest poorest quality standards-compliant cable you could buy).

Now keep in mind that when a pixel is detected as missing, the surrounding pixels are used to predict what that pixel would have been. It would be like choosing a random pixel on an image, erasing it, then filling it in again based on a blend of the color of its neighboring pixels. It is impossible that a human could detect such an error even by staring at a 1080p still-frame image for hours, let alone be able to perceive the error during a 1/60th of a second window. It is impossible to detect because the compression codec itself introduces a level of noise and information loss which is FAR worse than a single interpolated pixel.
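For the curious, here's a toy sketch of that "blend of its neighbors" repair. It's purely illustrative - real receivers and displays have their own concealment logic - but it shows why the repair is invisible.

```python
# Toy error concealment: a lost pixel is replaced by the average of its
# valid neighbors (illustrative only, not any particular TV's algorithm).

def conceal(frame, x, y):
    """Estimate the missing pixel at (x, y) from its four neighbors."""
    neighbors = []
    for dx, dy in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nx, ny = x + dx, y + dy
        if 0 <= ny < len(frame) and 0 <= nx < len(frame[0]):
            neighbors.append(frame[ny][nx])
    return sum(neighbors) // len(neighbors)

# 3x3 patch of 8-bit luma values with the centre pixel lost in transit.
frame = [
    [120, 122, 121],
    [119, None, 123],
    [121, 120, 122],
]
frame[1][1] = conceal(frame, 1, 1)
print(frame[1][1])   # 121 -- indistinguishable from its surroundings
```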

Most claims about HDMI cables are that they improve clarity, contrast, etc. There is no way HDMI errors could result in loss of contrast or clarity, even if a missing pixel for 1/60th of a second were perceptible (it's not -- once again, this can be mathematically proven via information theory). If you can detect a single missing, interpolated pixel in a still frame from even an H.264-compressed source, then please, by all means, contact your local university, because you have just transcended informational entropy.
 
I think they are thinking more in relation to sound. Everybody says it's 1s and 0s, but the actual signal is still analogue - a very fast analogue signal - which is timed at only one end and has only so many cycles. That leads to errors in timing, which is the cause of jitter, and HDMI suffers badly from it.

I think that is correct

Ah, in that case they can try the same BS that people already do for audio over S/PDIF. Jitter is inaudible, by the way; well, not strictly, but even the most 'jittery' devices are many, many times below the audible threshold.

It's a digital signal, not a "very fast analogue signal."
 
Off the record, QED are working on a way of testing and proving differences in HDMI cables - obviously not easy, but it's in their interest.

This will cause huge debate now, and again when it happens - however, it will be very interesting if they can do it.

Off the record on a public forum? :p

Also there will be a difference. There is a difference between copper and gold and the quality of shielding will yield a difference too. These differences will be negligible over 5 meters though.
 
Ah, in that case they can try the same BS that people already do for audio over S/PDIF. Jitter is inaudible, by the way; well, not strictly, but even the most 'jittery' devices are many, many times below the audible threshold.

It's a digital signal, not a "very fast analogue signal."

It's not as simple as sending no voltage and then sending 5 volts, though. Logic gates can be switched into the on position at 3.6 V, and if a cable is of poor quality and a positive 5 V occasionally degrades into, say, 3 V, then the logic gate won't open. There is a lot of error correction involved to make it work as often as possible, and as I've already said, over 5 meters a cheap cable is unlikely to be worse than an expensive cable (from the user's perspective); a long cable (>50 m), though, could be affected severely.
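To put some toy numbers on that in code (illustrative only - real HDMI signalling is actually low-swing differential, not 0 V / 5 V logic, but the threshold idea is the same):

```python
# Toy receiver decision: attenuate each transmitted "high" and compare it
# against the receiving gate's switching threshold. Voltages are made up.

THRESHOLD_V = 3.6   # assumed switching threshold of the receiving gate

def received_bits(sent_bits, high_v=5.0, loss_fraction=0.0):
    """Attenuate each transmitted 'high' and decide it against the threshold."""
    out = []
    for bit in sent_bits:
        level = high_v * (1.0 - loss_fraction) if bit else 0.0
        out.append(1 if level >= THRESHOLD_V else 0)
    return out

sent = [1, 0, 1, 1, 0, 1]
print(received_bits(sent, loss_fraction=0.2))   # 5 V -> 4.0 V: still read correctly
print(received_bits(sent, loss_fraction=0.4))   # 5 V -> 3.0 V: every 'high' read as 0
```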
 
It's not as simple as sending no voltage and then sending 5 volts, though. Logic gates can be switched into the on position at 3.6 V, and if a cable is of poor quality and a positive 5 V occasionally degrades into, say, 3 V, then the logic gate won't open. There is a lot of error correction involved to make it work as often as possible, and as I've already said, over 5 meters a cheap cable is unlikely to be worse than an expensive cable (from the user's perspective); a long cable (>50 m), though, could be affected severely.

There is no error correction involved in S/PDIF or HDMI.

What does cheap have to do with performance and longer runs? There are tests out there showing that there's no benefit to expensive cables in this situation, and if you take them apart you'll usually find the same construction.
 
There is no error correction involved in S/PDIF or HDMI.

What does cheap have to do with performance and longer runs? There are tests out there showing that there's no benefit to expensive cables in this situation, and if you take them apart you'll usually find the same construction.

Well, I was correlating lower price with lower quality. Your argument that cheaper cables are exactly the same as expensive ones is different from mine. I'm saying poor materials will result in a weaker signal. There will be information somewhere specific to the cable that states what the dB loss per meter is, and the simple fact is some cables will have a higher dB loss than others. The difference in dB loss over a short length will be negligible, as I keep saying, but if you're running an HDMI cable from a PC stored in your loft to the TV in your living room, then the difference in loss between two different cables could end up affecting things - although, again, the losses involved may be so minimal that it really doesn't matter. To simply say a lower-quality cable is as good as a high-quality cable, though, is simply wrong.
 
Well, I don't think I said that, and if I did it was a mistake. And it's rather a semantic point, since very few cables won't actually function properly - or they'd not be sold, or would not be sold for long.
"A poorly constructed cable that has not been properly tested may not be equal to a better constructed cable."
 
On a popular auction site someone is selling a QED Reference HDMI 5m cable. But the QED website states they do not make a 5m cable.

It's possible that QED did make a 5m cable and no longer make it, so that would explain why it's not listed on the website.

That said, even if it is fake it should still work:

From wikipedia:"Although no maximum length for an HDMI cable is specified, signal attenuation (dependent on the cable's construction quality and conducting materials) limits usable lengths in practice.[67] HDMI 1.3 defines two cable categories: Category 1-certified cables, which have been tested at 74.5 MHz (which would include resolutions such as 720p60 and 1080i60), and Category 2-certified cables, which have been tested at 340 MHz (which would include resolutions such as 1080p60 and 2160p30).[62][68][69] Category 1 HDMI cables are marketed as "Standard" and Category 2 HDMI cables as "High Speed".[1] This labeling guideline for HDMI cables went into effect on October 17, 2008.[70][71] Category 1 and 2 cables can either meet the required parameter specifications for interpair skew, far-end crosstalk, attenuation and differential impedance, or they can meet the required nonequalized/equalized eye diagram requirements.[68] A cable of about 5 meters (16 ft) can be manufactured to Category 1 specifications easily and inexpensively by using 28 AWG (0.081 mm²) conductors.[67] With better quality construction and materials, including 24 AWG (0.205 mm²) conductors, an HDMI cable can reach lengths of up to 15 meters (49 ft).[67] Many HDMI cables under 5 meters of length that were made before the HDMI 1.3 specification can work as Category 2 cables, but only Category 2-tested cables are guaranteed to work.[72]"

If you were getting a longer cable then it's worth getting a better quality one, but at 5 metres you should be ok with almost any brand. :)
 
It's a digital signal, not a "very fast analogue signal."
Nope. The signal might be digital in terms of its structure of high and low values, but when it comes to sending that down a bit of cable, it's effectively an analogue square wave at such a high frequency that the switching point becomes skewed, because the time taken for the driver circuit to go from low to high or vice versa becomes significant.

In other words, it's why any eye diagram for HDMI cable signals shows hexagonal shapes rather than the square waveform you'd associate with what we think of as digital switching. See the diagram at the bottom, which shows HDMI signal switching.

[Image: 4218Fig03.gif - eye diagram showing HDMI signal switching]
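As a very rough illustration of why that switching time matters, here's a toy first-order model (made-up numbers, nothing to do with an actual HDMI driver): when the bit period is long compared with the edge's time constant the line settles to clean highs and lows, and when it isn't, the levels bunch up towards the middle - which is exactly the closing-up you see in an eye diagram.

```python
# Toy model: a driver with an RC-like (first-order) edge driving a bit
# pattern. Reports the level reached at the end of each bit period.
# All numbers are illustrative.

def drive(bits, samples_per_bit, tau_in_samples):
    """Return the level reached at the end of each bit period."""
    level, ends = 0.0, []
    for bit in bits:
        target = 1.0 if bit else 0.0
        for _ in range(samples_per_bit):
            level += (target - level) / tau_in_samples   # first-order step response
        ends.append(round(level, 2))
    return ends

pattern = [0, 1, 0, 1, 1, 0]
print(drive(pattern, samples_per_bit=20, tau_in_samples=4))  # [0.0, 1.0, 0.0, 1.0, 1.0, 0.0] -- edges settle
print(drive(pattern, samples_per_bit=3,  tau_in_samples=4))  # levels crowd toward the middle -- the eye closes up
```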
 
The quality of the cable could be defined as the quality of the connection, i.e. a cheap cable has a cheaply built HDMI plug while a good-quality one will last longer and be able to withstand lots of unplugging.
 
The only thing that is of any concern, even in the professional industry, is frequency-related attenuation per meter. I have dealt with cable lengths of around 500-700 m carrying digital signals. SDI is particularly sensitive to the type of cable used, as it is both video and 4-channel audio going down BNC-terminated coax. The old MV3333C cables that were used before cannot handle the frequency (1.5 GHz) over any considerable length without serious attenuation. Cable impedance also plays a crucial role, as it needs to be 75 ohms for both digital signals and old-fashioned analogue video.

Using SDI as an example...

All digital data is derived and managed by a repetitive pulse train called a clock. Without it, data transitions cannot be identified coherently. The digital data can either contain the clock information embedded within it, or the clock signal can accompany the data separately. Since SDI is a single-wire transmission scheme, the clock is embedded. (AFAIK the clock is embedded in HDMI too.) Therefore, not only does cable attenuation affect recovery of the data, it seriously affects the receiver's ability to recover the clock signal so that the system can stay synchronized. This is where cable attenuation comes in. The maximum cable distance is governed by the receiver's ability to recover clock and data reliably.
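As an aside, here's a sketch of what "the clock is embedded in the data" can look like, using Manchester coding as the simplest example. SDI and HDMI actually use different line codes (scrambled NRZI and TMDS respectively), but the principle - the receiver recovers timing from transitions in the data itself - is the same.

```python
# Manchester coding sketch: every bit is sent as a transition, so a
# receiver can recover the clock from the data stream itself. SDI/HDMI
# use different line codes, but rely on the same idea of guaranteed
# transitions for clock recovery.

def manchester_encode(bits):
    """Each bit becomes two half-bit symbols: 1 -> (1, 0), 0 -> (0, 1)."""
    symbols = []
    for bit in bits:
        symbols += [1, 0] if bit else [0, 1]
    return symbols

def manchester_decode(symbols):
    """Invert the encoding by looking at the first half of each bit cell."""
    return [1 if symbols[i] == 1 else 0 for i in range(0, len(symbols), 2)]

data = [1, 0, 1, 1, 0, 0, 1]
line = manchester_encode(data)
print(line)                              # a transition in the middle of every bit cell
print(manchester_decode(line) == data)   # True
```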

Cable loss affects the amplitude of the SDI signal while jitter affects the zero crossing point of the data edges. The data edges appear to dance back and forth with random uncertainty. There is a jitter budget allowance, but since noise and jitter effects can become generally random, bit error rate can creep up periodically and cause lost data. If the jitter budget is exceeded, data cannot be recovered at all.

As with analogue signals, once you have noise in the signal it is extremely difficult and costly to remove. Jitter caused by induced noise effects, unstable signal sources, or poor re-clocking systems will ruin digital signals. Sometimes basic signal attenuation effects are mistaken for signal jitter. SDI signals contain a range of low to high frequencies, like analogue signals, and cable attenuation still affects the high frequencies most. When looking at an eye pattern, the data zero-crossing point (the risetime/falltime area) appears wider than normal. The eye pattern is typically used to evaluate signal quality, including jitter. Attenuation appears to smear the data edges and makes it look as though large amounts of jitter are present (the thick green X-shaped sections on lucid's image show the jitter range; the lower the jitter, the thinner these sections get), when, in fact, measurement with SDI measurement equipment may show the signal well within jitter specifications.

Also note that what we commonly refer to as a square wave isn't technically possible, as rise/fall times from ICs simply aren't fast enough - a true square wave is instantaneous from off to on and vice versa. (Note that the divisions on the diagram in lucid's post show that each division is 100 picoseconds. Eye patterns for S/PDIF are no different.) If you look at the datasheets for various op-amps, particularly those designed for video, they show how they handle rapid rise and fall times and how long they take to settle. Depending upon the impedance load they are working into and their power supply, some op-amps won't actually settle and will oscillate. This can also happen with audio amplifiers, and is quite often part of the reason why the amplifier input is frequency filtered. Too high a frequency through some amps can cause parasitic oscillations, which can lead quite rapidly to their destruction (and can take the speakers with them too).
 
What happens when an SDI cable is too long or the cable attenuation is too high? If you get some data loss does the picture start to break up in a similar way to a bad satellite signal or does it just go off completely?

I use SDI cables quite a bit at work and, touch wood, I've had very few issues with them - if only I could say the same for genlock cables - so it's nice to know of any pitfalls before they happen. However, I'm not exactly pushing the envelope; a single SD video feed with no audio is all I need.
 
If you are lucky you might get some garbled artefacts on screen, but usually nothing. Hence why they call it a cliff: if you get errors, you are very close to complete loss. A single iffy coax cable around 5 m in length caused the loss of signal in a huge cable run we had a few weeks back. Swapped the cable out for another and it all came good. We were worried about using SDI as the building is all pre-wired with triax, but it's of an unknown length with no way of repeating. (We still use SD as well, and it's known that 300-400 m is close to the limit; we were estimating 500-700 m.) We couldn't use analogue particularly easily as the output from the unit wasn't working properly (we couldn't get hold of another at short notice either), so we had come prepared with an SDI-to-analogue converter just in case. Luckily a cable swap fixed it; the outer braids looked frayed on the dodgy cable, probably caused by the lack of strain-relief boots.
 
Got my HDMI cable from a pound store... works perfectly!

I would never buy expensive HDMI cables; digital either works or it doesn't!
 
I see, that makes sense. I must admit that 99% of the time SDI works perfectly for me, or not at all, so that would explain the cliff effect. Considering how much abuse some of our cables take I'm amazed they still work, but as it's mostly studio stuff none of our runs are longer than 100 m, so cable attenuation shouldn't be much of an issue for me.
 