Testing differently priced HDMI cables. Suggestions welcome; results will be published here.

Or maybe they were beaming pixie dust into my eyes and altering my brain waves as I was not wearing my tin foil hat :D. Sorry, couldn't resist, but the fact is that only one cable was provided. It was HDMI, and the projector was right next to me so I could see him plugging it in and out. I was told this cable would be good enough for most people at this level, done deal. I specifically asked for a comparison with a more expensive cable (and we are not talking hundreds of pounds expensive here) and saw a difference in quality. I believe the more expensive cable is HDMI 1.4 (or whatever they changed the name to) and the other cable was probably not supporting deep colour or able to maintain the higher bitrate at that length. If the retailer had offered the choice of the two cables and both were HDMI 1.4 then I would agree that there is a chance there was some foul play going on, but that was not the way it happened. The retailer thought it was all a done deal with the cheaper cable. Again, the 'ripped off' crowd (not including you here Kyle) don't know the full picture and just jump to the 'you're an idiot and got ripped off' mantra.
The only real explanation was that it was a rigged demo to get you to buy their expensive cables. Monster (for example) are known for doing just this, where they compared composite cables carrying SD feeds to HDMI cables carrying HD feeds, except they told the people viewing the demos that they were comparing their expensive pixie-dust-infused cables to "standard, low quality" cables.



So cables with different HDMI specifications (1.2 vs 1.4 for example) will not affect the picture quality?
Correct, it's not the cable that determines the HDMI spec, it's the devices on either side of the cable. There are two types of HDMI cable: a single link one (same as a single link DVI cable) and a dual link one (now known as high speed/high bandwidth). Neither one affects the image quality in the ways you saw. "Deep colour" isn't a cable spec either, it's more a display spec, with 10-bit image processing, which isn't a feature of Blu-rays.

You seem really hung up on image quality, and I get the feeling you're ignoring how a system behaves when an inadequate HDMI cable is used. You seem to think it'll result in degraded image quality in the form of reduced sharpness, saturation and contrast, when it's actually artifacts: the worse the problem, the more artifacts will be displayed, until there's no image left at all.


Yes, and I agree. What people do not seem to get past is that the cables may be of different HDMI standards. One may have supported deep colour and the other may not. The one supporting it should have exhibited better colour saturation, should it not? It is therefore like comparing a USB 1.1 cable with a USB 2.0 cable, not a USB 2.0 cable with a more expensive USB 2.0 cable.

RB
USB 1, 1.1 and 2.0 all use the same cables; it's the ports and devices that determine the standard. Everything from USB 1 to USB 2.0 uses the same setup: two data wires and two power wires (USB 3 uses a completely different cable with different wiring). So, as I keep saying, the cables for the most part are exactly the same; the cables themselves aren't what support the standards.
 
Just to highlight that there are different HDMI standards (versions) and categories....

Taken from www.hdmi.org.


So it would be reasonable to assume the cheaper cable I saw was a category 1 and the more expensive was a category 2, or do people still believe that there is no difference in picture between cables manufactured to these two different categories?

RB

It wouldn't be reasonable to assume they'd affect the image quality. The difference between those cables, as I said in my last post, is that one's single link and the other is dual link. It's not about image quality, it's about bandwidth. The higher bandwidth cables support features such as two-way audio, much higher resolutions, Ethernet over HDMI and various refresh rates at higher resolutions that all need more bandwidth than a single link cable can provide; it's that simple. Think about it: compare Gigabit networks to 100 Megabit networks. Gigabit switches can use CAT5e perfectly fine at gigabit speeds, even though "CAT6e" is recommended (it's better for longer runs), and that's because the standard is set by the hardware ports, not the cables themselves. Look at things like PCI-E slots and you'll see that while they've kept the same size, shape and pin configuration, there's a huge difference in bandwidth between the very first PCI-E spec and the current one (I am aware that PCI-E 1.1 can supply more power, so newer graphics cards won't work in some older motherboards, but the data connections are the same).
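To put rough numbers on the bandwidth side of this, here's a back-of-the-envelope sketch. The timing and certification figures are the publicly quoted ones as I understand them, so treat it as illustrative rather than gospel:

[code]
# Back-of-the-envelope HDMI bandwidth figures. Numbers below are the publicly
# quoted spec values as I understand them -- illustrative, not authoritative.

def tmds_clock_mhz(h_total, v_total, frames_per_sec):
    """Pixel (TMDS) clock for a given total raster and frame rate."""
    return h_total * v_total * frames_per_sec / 1e6

# The standard CEA 1080 raster is 2200 x 1125 pixels including blanking.
clk_1080i = tmds_clock_mhz(2200, 1125, 30)   # 1080i60 = 30 full rasters/s
clk_1080p = tmds_clock_mhz(2200, 1125, 60)   # 1080p60

# hdmi.org cable categories are certified at these pixel clocks:
CAT1_STANDARD_MHZ = 74.25    # "Standard" cable, tested at 1080i/720p rates
CAT2_HIGH_SPEED_MHZ = 340.0  # "High Speed" cable

modes = [("1080i60", clk_1080i),
         ("1080p60", clk_1080p),
         ("1080p60 + 30-bit Deep Colour", clk_1080p * 1.25)]

for name, clk in modes:
    gbps = clk * 10 * 3 / 1000   # TMDS: 10 bits per pixel clock on 3 data channels
    print(f"{name:30s} {clk:7.2f} MHz, ~{gbps:.2f} Gbit/s, "
          f"within Cat 1 certification: {clk <= CAT1_STANDARD_MHZ}")
[/code]

So 1080i/720p sits right at the Category 1 ("Standard") certification point, while 1080p60 and Deep Colour need the higher clock that a Category 2 ("High Speed") cable is certified for; in practice plenty of short Category 1 cables carry 1080p without complaint, which is really the crux of the disagreement.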
 
The only real explanation was that it was a rigged demo to get you to buy their expensive cables. Monster (for example) are known for doing just this, where they compared composite cables carrying SD feeds to HDMI cables carrying HD feeds, except they told the people viewing the demos that they were comparing their expensive pixie-dust-infused cables to "standard, low quality" cables.

Again, whilst I understand your point, I do not see how this could be the case, as the two cables were not both being suggested; only one was offered. Likewise, switching between the short and long cable was just a case of unplugging one and plugging in the other. Both were HDMI cables.

Correct, it's not the cable that determines the HDMI spec, it's the devices on either side of the cable. There are two types of HDMI cable: a single link one (same as a single link DVI cable) and a dual link one (now known as high speed/high bandwidth). Neither one affects the image quality in the ways you saw. "Deep colour" isn't a cable spec either, it's more a display spec, with 10-bit image processing, which isn't a feature of Blu-rays.

So a category 1 cable, only able to manage 1080i, would produce the same picture quality in a given setup as a category 2 cable put in the same setup, which is able to carry 1080p? If the cable is only spec'd to category 1, then how can it transfer enough data to match the picture quality of a category 2 cable? By category definition it cannot, and if it cannot, then how can the resulting picture be the same?

USB 1, 1.1 and 2.0 all use the same cables; it's the ports and devices that determine the standard. Everything from USB 1 to USB 2.0 uses the same setup: two data wires and two power wires (USB 3 uses a completely different cable with different wiring). So, as I keep saying, the cables for the most part are exactly the same; the cables themselves aren't what support the standards.

And yet when USB 2.0 came out, a lot of USB 1.1 cables could not achieve the transfer speeds that USB 2.0 was able to run at.

RB
 
It wouldn't be reasonable to assume they'd affect the image quality. The difference between those cables, as I said in my last post, is that one's single link and the other is dual link. It's not about image quality, it's about bandwidth. The higher bandwidth cables support features such as two-way audio, much higher resolutions, Ethernet over HDMI and various refresh rates at higher resolutions that all need more bandwidth than a single link cable can provide; it's that simple. Think about it: compare Gigabit networks to 100 Megabit networks. Gigabit switches can use CAT5e perfectly fine at gigabit speeds, even though "CAT6e" is recommended (it's better for longer runs), and that's because the standard is set by the hardware ports, not the cables themselves. Look at things like PCI-E slots and you'll see that while they've kept the same size, shape and pin configuration, there's a huge difference in bandwidth between the very first PCI-E spec and the current one (I am aware that PCI-E 1.1 can supply more power, so newer graphics cards won't work in some older motherboards, but the data connections are the same).

Maybe we are agreeing but are coming from different angles....

If the standards are only set by the ports, then why can I not use a cat 4 cable in a cat 6 port and get the same gigabit speeds? Why can I not put a cat 5e cable into a cat 6 setup and get 10 Gbit speeds over long runs? Both ports support it. Cat 5e is spec'd up to 1 Gbit, so using a cat 5e cable in a cat 6 port you would expect to be able to get up to 1 Gbit.

If I can only supply enough data for a 1080i picture using a single link cable, how can I expect the picture to be the same quality as a 1080p picture, which only a dual link (category 2) cable can support? Comparing one single link cable with another should show no difference regardless of quality and price (give or take sparkles etc. produced by transmission failures over length). The same is true when testing one dual link cable against another. I do not believe that a single link cable supplying a feed to a device able to show 1080p pictures can produce the same picture as a dual link cable supplying data to the same device. The projector taking the 1080i feed and either scaling it up to a 1080p picture or dropping it to 720p is likely to suffer quality loss, as it does not have the pixel data to match what is being transmitted over the 1080p dual link cable.

This is unless you believe a 100" image fed from a 720p, a 1080i and a 1080p resolution source will all look identical.

RB
 
Again, whilst I understand your point, I do not see how this could be the case, as the two cables were not both being suggested; only one was offered. Likewise, switching between the short and long cable was just a case of unplugging one and plugging in the other. Both were HDMI cables.
Yes, but the only explanation for the difference in image quality is that it was rigged, because HDMI doesn't work like that.



So a category 1 cable, only able to manage 1080i, would produce the same picture quality in a given setup as a category 2 cable put in the same setup, which is able to carry 1080p? If the cable is only spec'd to category 1, then how can it transfer enough data to match the picture quality of a category 2 cable? By category definition it cannot, and if it cannot, then how can the resulting picture be the same?
Well now you're suggesting different resolutions. :confused: Of course 1080i wouldn't look identical to 1080p. That would also fall into the category of "inadequate": you wouldn't use a cable with less bandwidth than you need, would you? And this isn't what we've all been discussing. What you're saying implies I believe there is no difference in image quality between different resolutions simply because it's using HDMI.



And yet when USB 2.0 came out, a lot of USB 1.1 cables could not achieve the transfer speeds that USB 2.0 was able to run at.

RB

In which case, they didn't meet the specs of USB1.1 in the first place, never mind 2.0. If a cable meets the specs of USB1.1, then it's already good to go on USB 2.0, as they use the same specifications.
 
Maybe we are agreeing but are coming from different angles....

If the standards are only set by the ports, then why can I not use a cat 4 cable in a cat 6 port and get the same gigabit speeds? Why can I not put a cat 5e cable into a cat 6 setup and get 10 Gbit speeds over long runs? Both ports support it. Cat 5e is spec'd up to 1 Gbit, so using a cat 5e cable in a cat 6 port you would expect to be able to get up to 1 Gbit.
You're not talking about differently specced cables.

If I can only supply enough data for a 1080i picture using a single link cable, how can I expect the picture to be the same quality as a 1080p picture, which only a dual link (category 2) cable can support? Comparing one single link cable with another should show no difference regardless of quality and price (give or take sparkles etc. produced by transmission failures over length). The same is true when testing one dual link cable against another. I do not believe that a single link cable supplying a feed to a device able to show 1080p pictures can produce the same picture as a dual link cable supplying data to the same device. The projector taking the 1080i feed and either scaling it up to a 1080p picture or dropping it to 720p is likely to suffer quality loss, as it does not have the pixel data to match what is being transmitted over the 1080p dual link cable.
Why are you suggesting people are claiming 1080p will look identical to 1080i? They're effectively different resolutions, so of course there will be differences. The argument is that the cable itself doesn't change the image quality of the feed you're running through it; it doesn't matter what resolution it is, providing you're comparing cables at the same resolution. I'm not sure where you're getting the idea that a dual link cable MUST produce better image quality than a single link cable; it makes no sense. If both cables support the bandwidth required for 1080p, then why would the images be different? Your misunderstanding of how it works is giving you strange ideas of how other things work. A dual link cable is for higher bandwidth, it's not about quality. What you can't do with a single link HDMI cable is run quad HD, whereas you can with a dual link, and native 1080p video will of course be lesser quality than native 2160p, but that's about as far as it goes, because then you're comparing different resolutions, which is pointless.

This is unless you believe a 100" image fed from a 720p, a 1080i and a 1080p resolution source will all look identical.

RB
I get the feeling your argument is conveniently changing to another subject...

In simple terms, passing 1080 over a single link and a dual link cable will produce the same image quality, and it's futile comparing a single link cable's theoretical maximum bandwidth to a dual link cable's, because a single link cable simply wouldn't display quad HD material anyway.
 
Yes, but the only explanation for the difference in image quality is that it was rigged, because HDMI doesn't work like that.



Well now you're suggesting different resolutions. :confused: Of course 1080i wouldn't look identical to 1080p. That would also fall into the category of "inadequate": you wouldn't use a cable with less bandwidth than you need, would you? And this isn't what we've all been discussing. What you're saying implies I believe there is no difference in image quality between different resolutions simply because it's using HDMI.

No, I have been suggesting for quite a while (from the start, and again in post #36) that the cables may have been of different specs. Maybe 'specs' was a poor choice of words; of different categories (dual or single link), rather. The suggested cheaper cable may have been single link and so the picture quality took a hit. I then asked to test a more expensive cable, which was dual link, and the quality improved back to the level of the short cable the initial demo was done with.

I chose the more expensive cable as it was dual link, rather than the cheaper cable, which may have been single link. The retailer seemed to think a single link cable would be adequate, as I had made it clear price was a major concern, so I surprised him when I chose to purchase the more expensive dual link cable.

Without having the cable in my hands to test I cannot confirm this though.

I appreciate it is very late for you so thanks for continuing to discuss.

RB
 
No, I have been suggesting for quite a while (from the start, and again in post #36) that the cables may have been of different specs. Maybe 'specs' was a poor choice of words; of different categories (dual or single link), rather. The suggested cheaper cable may have been single link and so the picture quality took a hit. I then asked to test a more expensive cable, which was dual link, and the quality improved back to the level of the short cable the initial demo was done with.
I understand what you're assuming, but it's just not how HDMI works. What it seems like you're theorising is that what you saw as lower quality was because the cable didn't have enough bandwidth to display the picture "completely", as if it was being compressed to squeeze through it. Again, that's not how HDMI works, and if the cable didn't have the bandwidth to pass a 1080p signal, fuzzy bushes isn't how it would come across: it'd either be absolutely full of sparklies, or you simply wouldn't have got any image, or the device (PS3) wouldn't have let you choose 1080p in the first place.

I chose the more expensive cable as it was dual link, rather than the cheaper cable, which may have been single link. The retailer seemed to think a single link cable would be adequate, as I had made it clear price was a major concern, so I surprised him when I chose to purchase the more expensive dual link cable.

Single link cables have more than enough bandwidth to support 1080p; it's not a problem. A dual link cable will fare exactly the same, as it'll use exactly the same amount of bandwidth as a single link; there will be no difference, and that's a matter of fact.


Without having the cable in my hands to test I cannot confirm this though.

I appreciate it is very late for you so thanks for continuing to discuss.

RB

Regardless of its standard, if its specs meet the requirements for 1080p, it'll be no different.
 
You're not talking about differently specced cables.

Yes I am: category 1 and category 2 (single and dual link / standard and high speed).

Why are you suggesting people are claiming 1080p will look identical to 1080i?

Because you are saying single link cables will produce the same picture as dual link cables when single link can only go to 1080i and dual link can go to 1080p.

They're effectively different resolutions, so of course there will be differences. The argument is that the cable itself doesn't change the image quality of the feed you're running through it; it doesn't matter what resolution it is, providing you're comparing cables at the same resolution.

Yes, but as per the HDMI categorisation of cables, a cat 1 cable is spec'd up to 1080i and not 1080p, as posted in the chunk taken from their site in post #40. If a cat 1 cable could support 1080p then yes, the picture would be the same. Just like if a cat 2 network cable could meet the specs for cat 6 it would be able to transfer 10 Gb/s, but then it would be called a cat 6 cable.....

I get the feeling your argument is conveniently changing to another subject...

Nope, this is the point I have been trying to make all along, however badly.

RB
 
I understand what you're assuming, but it's just not how HDMI works. What it seems like you're theorising is that what you saw as lower quality was because the cable didn't have enough bandwidth to display the picture "completely", as if it was being compressed to squeeze through it. Again, that's not how HDMI works, and if the cable didn't have the bandwidth to pass a 1080p signal, fuzzy bushes isn't how it would come across: it'd either be absolutely full of sparklies, or you simply wouldn't have got any image, or the device (PS3) wouldn't have let you choose 1080p in the first place.

No, what I am saying is that it may be possible the single link cable was not able to pump enough data for a 1080p signal. If the PS3 sees that it is in fact a single link cable, it drops the sending resolution down to 1080i and sends at that resolution instead, thus resulting in a picture that looks worse.

Single link cables have more than enough bandwidth to support 1080p; it's not a problem. A dual link cable will fare exactly the same, as it'll use exactly the same amount of bandwidth as a single link; there will be no difference, and that's a matter of fact.

Not according to the HDMI standards. In order to meet the HDMI standard for a category 1 cable, it needs only to be able to pass a 1080i signal. I am not denying that some cables may surpass this standard; what I am suggesting is that the cable I first demoed with didn't surpass it, it just met the standard.

Regardless of its standard, if its specs meet the requirements for 1080p, it'll be no different.

Agreed. I am not talking about signal loss in the cable etc.; I am talking about the cable's effect on the whole chain from the source to the projected image. If the cable just meets the cat 1 HDMI cable spec and forces a resolution drop on the source component, then would this not produce a difference in picture? If a cat 1 HDMI cable does not meet the specs for 1080p (which it is under no obligation to do), then why would the resulting picture be the same as one using a cable capable of transmitting at 1080p? Trying and failing would produce the artifacts you describe, but surely most equipment will switch down to a lower resolution the cable can handle without issue.

I did not confirm the resolution when demoing as TBH it is immaterial to me (and I didn't think of it at the time). If the cheaper cable forces a res drop then I would prefer the more expensive cable that does not as long as it is within my budget.

RB
 
Are you doing it blond?

As in, someone else numbers them and plugs them in for you, so there's no placebo effect from knowing which cable it is.
 
I wouldn't entertain an expensive HDMI cable, but the length (16M) is almost beyond the limits of HDMI, and it will help in this instance.

However, get some CAT5/6 converters off eBay, buy a decently built cheap HDMI cable from Neet, and be done with it. I'm running a whole-house HD video matrix using these and haven't noticed any drop in PQ.

My test equipment includes:
Pioneer Kuro LX6090 60" plasma TV*
PS3 slim
HD over CAT6 extenders
CAT 6 Low smoke zero halogen cable
Neet HDMI 1.4 cable

*TV calibrated professionally to ISF standards by the same guy who's done work calibrating equipment for pros for years. When I saw him he'd just got back from doing the LittleBigPlanet studios.

I also have access to a set of QED SR-1 reference cables, and the purple QED cables (one down from the "reference" cable), if you want me to do this.

Camera: Canon EOS 1D MK IIn
 
Let me clarify some confusion in here.

An HDMI 1.4 spec cable has the same number of wires in it as a 1.2 spec cable.

There is NOTHING majorly different between the differently spec'd cables, except maybe the additional oxygen-free, gold-plated insulation crap you find advertised on expensive cable packaging.

Any 1 m HDMI cable is capable of 1.4 spec duties; so is any 5 m cable.

The reason they sell different spec cables is because longer cable runs require a better signal. This problem doesn't really exist at 7 metres or less, so it's pointless having that super-duper cable on short cable runs. Again, there is nothing extra in a 1.4 spec cable compared with a 1.2 spec one, just better insulation etc., which is not needed.

If you turn on Deep Colour or other features that require higher bandwidth, like lossless audio, on a cheapo 20 metre cable, you may see the sparklies effect and will INSTANTLY know there is a problem. That's when the cable cannot handle the higher HDMI spec bandwidth. You will not get slight skews in colours etc.; it just doesn't work like that, it's impossible.
 
Are you doing it blond?

Nope, I have brown hair and will not be changing the colour.

Sorry Raymond, but the spellcheck mis-correction (?) brought a smile.

As in, someone else numbers them and plugs them in for you, so there's no placebo effect from knowing which cable it is.

Not really. I have no real vested interest in the outcome one way or another; I am happy whether there is a change or not, I would just like to know either way. As mentioned, I currently suspect that the cheaper cable prompted the PS3 to step down a resolution for some reason, hence the difference in picture quality. The reason for the pictures is to take as much subjectivity out of it as possible and to be able to see two pics side by side (and make them available to others in unedited CR2 format).
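That said, if anyone does want to run it blind, a randomised ordering is easy enough to generate; a rough sketch (the cable labels below are just placeholders):

[code]
# Rough sketch of a blind ordering: a helper keeps the key, the viewer only
# ever sees trial numbers.
import random

cables = ["cheap 16m", "expensive 16m", "short reference"]  # placeholder labels
trials = cables * 3                 # view each cable three times
random.shuffle(trials)

key = {f"trial {i + 1}": cable for i, cable in enumerate(trials)}
for trial in key:
    print(trial)                    # this list is all the viewer gets to see
# the helper keeps 'key' to hand until scoring is finished
[/code]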

On this note, looking at the captures I did with my cable and a single frame over a period of 30 minutes, the images came out well. Two things I noticed were that the resolution is good enough to see the individual pixels of the projected image at 1:1, and that a remote release is definitely required (I have not dug it out of my boxes yet as we have just moved to this new apartment).

I will put one pic up later as a reference so people can see the expected quality from the camera.

Cheers
RB
 
I wouldn't entertain an expensive HDMI cable, but the length (16M) is almost beyond the limits of HDMI, and it will help in this instance.

Yes, I have heard about cables failing, especially cheap cables on longer runs, and as I am sure you can appreciate, considering your home build thread, I would rather not have to pull a cable out of a wall six months after putting it there and decorating, because it was starting to show problems.

However, get some CAT5/6 converters off eBay, buy a decently built cheap HDMI cable from Neet, and be done with it. I'm running a whole-house HD video matrix using these and haven't noticed any drop in PQ.

My test equipment includes:
Pioneer Kuro LX6090 60" plasma TV*
PS3 slim
HD over CAT6 extenders
CAT 6 Low smoke zero halogen cable
Neet HDMI 1.4 cable

*TV calibrated professionally to ISF standards by the same guy who's done work calibrating equipment for pros for years. When I saw him he'd just got back from doing the LittleBigPlanet studios.

I also have access to a set of QED SR-1 reference cables, and the purple QED cables (one down from the "reference" cable), if you want me to do this.

Thanks EVH, it may be interesting to add those if you are happy to do it. At worst it will show no difference at all, so people can take it to Dixons etc. and wave it in the salesperson's face ;). A great service to the general public, I am sure most will agree.

Camera: Canon EOS 1D MK IIn

Well, that might be just about good enough :D. [Whisper] at least it is not a Nikon ;) [/whisper]. Nah, I am sure Nikon users are only human too :D.

Do you have any suggestions for sample movies?

Cheers
RB
 
Let me clarify some confusion in here.

An HDMI 1.4 spec cable has the same number of wires in it as a 1.2 spec cable.

There is NOTHING majorly different between the differently spec'd cables, except maybe the additional oxygen-free, gold-plated insulation crap you find advertised on expensive cable packaging.

Any 1 m HDMI cable is capable of 1.4 spec duties; so is any 5 m cable.

The reason they sell different spec cables is because longer cable runs require a better signal. This problem doesn't really exist at 7 metres or less, so it's pointless having that super-duper cable on short cable runs. Again, there is nothing extra in a 1.4 spec cable compared with a 1.2 spec one, just better insulation etc., which is not needed.

If you turn on Deep Colour or other features that require higher bandwidth, like lossless audio, on a cheapo 20 metre cable, you may see the sparklies effect and will INSTANTLY know there is a problem. That's when the cable cannot handle the higher HDMI spec bandwidth. You will not get slight skews in colours etc.; it just doesn't work like that, it's impossible.

To quote from HDMI.org here (emphasis applied by me).
EDID Implementation Issues

One key area of interoperability for HDMI-connected devices is the ability to effectively communicate EDID data via the DDC channel. If the sink device (the HDTV or projector) has its EDID ROM coded incorrectly, or if a source or repeater device fails to read it properly, the system will fail in its attempts to auto-negotiate the proper video and audio modes. Symptoms of this problem include incorrect color space and/or the wrong resolution.

There is also a fairly interesting pin out diagram here for anyone interested.
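For anyone curious what that EDID data actually looks like, here is a rough sketch of sanity-checking a dump on a PC. It assumes you can get at the raw 128-byte base block (on a Linux box it usually shows up somewhere under /sys/class/drm/; the connector path below is just an example), and it only decodes the simple "standard timings" slots. 1080p normally lives in the detailed timing descriptors or the CEA extension block, so this is only to show the kind of thing being negotiated over the DDC channel.

[code]
# Rough sketch: verify and partially decode an EDID base block (128 bytes).

def check_edid(blob: bytes):
    base = blob[:128]
    header = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])
    assert base[:8] == header, "not an EDID block"
    assert sum(base) % 256 == 0, "checksum failed -- corrupt or misread EDID"

    aspect = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}
    for i in range(38, 54, 2):              # eight 2-byte "standard timing" slots
        b1, b2 = base[i], base[i + 1]
        if (b1, b2) == (0x01, 0x01):        # slot unused
            continue
        x = (b1 + 31) * 8                   # horizontal pixels
        ax, ay = aspect[b2 >> 6]            # aspect ratio from the top two bits
        print(f"{x}x{x * ay // ax} @ {(b2 & 0x3F) + 60} Hz")

# Example path only -- connector names vary from machine to machine.
with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
    check_edid(f.read())
[/code]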

RB
 
Oh, and HDMI extenders are £26.23 inc VAT; that's both ends, all you supply is the CAT6 cable... hence why I suggest that if you're going over 5-10 m, it's a far cheaper option. (The extenders are on a competitor's site, but a well known computer e-tailer.)

Where from?

Edit: found them.
 
To quote from HDMI.org here (emphasis applied by me).


There is also a fairly interesting pin out diagram here for anyone interested.

RB

You also missed this bit:
Symptoms of this problem include incorrect color space and/or the wrong resolution. Some installers take a small, reliable 1080p set along on installation calls to troubleshoot for this – if it doesn’t look right, the problem is most likely in the source device; if it looks good, the problem is probably in the sink. Regardless of which component is to blame, the issue can probably be resolved with a firmware update from the manufacturer.
So at no point is cabling the problem, merely poorly written firmware. However, I would be interested in seeing your results, although if you could find a way to analyse the data coming from the cable empirically it would be much more interesting, perhaps by capturing the frame using a PC or something.
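If the frames can be captured on a PC (a capture card, so the two images are pixel-aligned; photographs of the screen won't line up exactly), something as simple as this rough sketch would give hard numbers. The filenames are just placeholders:

[code]
# Rough sketch: compare two captures of the same paused frame, one per cable.
# Assumes both captures were saved losslessly (e.g. PNG) at identical settings.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("frame_cheap_cable.png"), dtype=np.int32)
b = np.asarray(Image.open("frame_expensive_cable.png"), dtype=np.int32)

diff = np.abs(a - b)
print("mean abs difference per channel:", diff.mean(axis=(0, 1)))
print("max abs difference:", diff.max())
print("pixels differing at all: %.2f%%" % (100 * (diff.max(axis=-1) > 0).mean()))
[/code]

Identical data should give all zeros; sparklies would show up as a handful of pixels with large errors rather than a gentle overall shift in colour or contrast.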
 