Lightning Connector Hate

Apple are getting a huge amount of stick for making an adaptor which has a chip in it for outputting video: http://www.theregister.co.uk/2013/03/04/cpu_inside_apple_video_adapter/

Am I the only one who thinks there is nothing wrong with this and it is the best solution to the problem?

Most people are saying that Apple should have gone with Micro USB for the iPhone's connector, like everyone else has, but if you think about it even for a moment you can see why it is unsuitable.

For iDevices the requirements are:
  • A single connector to keep the device elegant and small
  • A dockable connector
  • High current for charging iPads in a reasonable amount of time
  • Future-proof

Apple already tried the route of a single connector with multiple outputs - the old dock connector carries FireWire, USB and video on dedicated pins, but as you know it has become obsolete as standards have moved on, leaving a bulky, fragile connector with lots of unused pins.
Therefore the best solution is a single cable carrying only a data interface, leaving whatever is at the end of the cable to do the decoding.
This is what they have done.

Why not use Micro USB then?
  • Micro USB doesn't support high enough currents to charge iPads in a decent amount of time (see the rough numbers after this list)
  • Micro USB is terrible for docking - hard to mate the dock and device, and too fragile to support the device's weight
  • Micro USB is not future-proof - when USB 3/Thunderbolt/whatever comes next becomes the norm, the connector will change and everyone's speaker docks will become obsolete.
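
As a very rough illustration of the charging point (ballpark figures, not measurements: roughly a 42.5 Wh iPad battery, ~7.5 W from a Micro USB port doing USB Battery Charging 1.2, ~12 W from Apple's iPad charger, and ignoring charging losses):

```python
# Rough charge-time comparison with ballpark numbers (illustrative only).
BATTERY_WH = 42.5            # roughly the capacity of a 3rd/4th-gen iPad battery
MICRO_USB_WATTS = 5.0 * 1.5  # Micro USB with Battery Charging 1.2: ~1.5 A at 5 V
IPAD_CHARGER_WATTS = 12.0    # Apple's 12 W iPad charger

for name, watts in [("Micro USB (BC 1.2)", MICRO_USB_WATTS),
                    ("12 W iPad charger", IPAD_CHARGER_WATTS)]:
    hours = BATTERY_WH / watts  # ignores conversion and charging losses
    print(f"{name}: roughly {hours:.1f} h for a full charge")
```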

So surely this leads to the solution that has been chosen - a compact, dockable, future-proof connector which can transfer any kind of data anywhere.
If 4K TV in 2014 requires a new HDMI connector, then chances are a Lightning adaptor will be released that works with any device with a Lightning connector, whereas an Android phone would need USB, HDMI and new-connector ports, which is silly.

anyone with me?
 
I don't hate it, but I don't have anything that uses that connector.

It's only the latest iPhone and iPad, isn't it?
 
...

If 4K TV in 2014 requires a new HDMI connector, then chances are a Lightning adaptor will be released that works with any device with a Lightning connector, whereas an Android phone would need USB, HDMI and new-connector ports, which is silly.

anyone with me?

I'm sure that is flawed. If a 4K TV in 2014 requires a new design of HDMI connector, then connecting via Lightning would require some sort of adapter too ... but what's to stop them releasing an adapter that converts the existing HDMI port on an old Android device to the new connector in the same way (it's not as if the device would support 4K anyway)?

As for new devices, if there is a new HDMI port then it's likely there would be micro/mini variants anyway for use on phone-type devices, and the chances are you would be able to get an adapter to convert for legacy HDMI devices.

So in all cases you are using an adapter anyway ... the only downside is having a Micro USB port as well, and given the trend for large-screen phones it's not as if there isn't space for one.

I really don't see the big deal here ...
 
To be honest the Lightning connector IS a PITA though, and it was the sole reason I switched to the Lumia after having iPhones since the original. (I had lots of docks and didn't like the adapter solution, so it gave me the chance to try other ecosystems, which I did, and I liked it!)

IMO, if you reread the case you just made above for Lightning vs USB as dock connector vs Lightning, it would seem the dock was the better connector (just too big).

Dock connector (w/out adapter)

  • HD Video
  • Analogue audio
  • Digital audio
  • Charging
  • Data transfer
 
Am I the only one who thinks there is nothing wrong with this and it is the best solution to the problem?

If it worked as it was supposed to, then yes, I'd agree. However it doesn't, and that is what people are annoyed about - not the fact that there is a SoC in there.

What Apple have done is take an HDMI output capable of 1920x1080 and replace it with a Lightning adaptor that outputs to HDMI at 1600x900 upscaled to 1920x1080. That is the issue, and the reason for the hate. You cannot replace a standard feature people are used to unless there is good reason and the replacement at least matches the capabilities of what it replaces. Remember when Apple pulled Google Maps? Same thing.
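
To put that in numbers (just the two resolutions being reported for the adaptor, not something I've measured myself):

```python
# Pixel budget of the two resolutions mentioned above.
full_hd = 1920 * 1080  # 2,073,600 pixels
adapter = 1600 * 900   # 1,440,000 pixels
print(f"1600x900 has {adapter / full_hd:.0%} of the pixels of 1920x1080")
# -> about 69%, so the upscaled picture starts from roughly a third fewer pixels
```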
 
I like the Lightning connector too.

I have, however, noticed that the failure rate seems quite high. I presume it's from people pulling the cable to remove it, but Apple should have seen that one coming and toughened it up a bit.

That aside, I think it's great. Double sided, tiny connector on the device, and the device connector is rock solid without any fiddly bits.
 
Most people seem to hate the prospect of having to replace ten-year-old accessories built for the old connector, which costs more money.

I am really looking forward to slowly getting more devices for the format. Once I own more than one I can charge properly from my Mac, as currently I need to use one of the main USB ports to provide enough power (the ones on the screen's hub aren't good enough for Lightning).

Anyway, it's a much nicer and more elegant standard for Apple devices, so I hope it picks up and gets more popular soon.
 
Glad some people agree with me.

I think this will be the last dock connector Apple ever produce. It can't really ever become obsolete as it's not even tied to any data transfer protocol like USB.
I can see Apple removing all connectors except the headphone port within a few years once Wireless charging becomes popular.

I have not once connected my current or previous iPhones to a computer, as Apple TV, iTunes Match and Dropbox do everything I need.
I've also put a dust cover in my headphone port, as I have no need for that either: for the car I have Bluetooth audio, at home I have AirPlay to the Apple TV, and on the move I have Bluetooth headphones (Plantronics BackBeat).
 
Companies have been doing this for years, each one trying to push its own connector and its own standards. Sony, along with Apple, are the worst offenders, trademarking the standard and stopping third parties from using it, or charging them to do so. It's just another way of making extra cash.

I can still remember SCART first coming to the UK, when it was the must-have thing for TVs and video recorders; I don't think my small TV even has a SCART socket now, just more USB and HDMI ports. Standards come and go; otherwise we'd still be using tech that's decades old. :)
 
I have Lightning on the current iPad and I do worry that it could easily snap if it's pulled out at the wrong angle or catches on something. Obviously there aren't yet as many accessories out there as for the original connector, but I just can't see docks being anywhere near as robust with such a flimsy connector holding the device.
 
The adaptor you're on about is an inventive bodge. It's effectively another computer: it receives display data over the Lightning bus, converts it in software and outputs it via HDMI.

It's not a pure AV signal, hence the compression artifacts similar to what you may see using AirPlay.
 
Interesting. I'll quote it properly:
Anonymous Coward said:
Airplay is not involved in the operation of this adapter.
It is true that the kernel the adapter SoC boots is based off of XNU, but that’s where the similarities between iOS and the adapter firmware end. The firmware environment doesn’t even run launchd. There’s no shell in the image, there’s no utilities (analogous to what we used to call the “BSD Subsystem” in Mac OS X). It boots straight into a daemon designed to accept incoming data from the host device, decode that data stream, and output it through the A/V connectors. There’s a set of kernel modules that handle the low level data transfer and HDMI output, but that’s about it. I wish I could offer more details then this but I’m posting as AC for a damned good reason.
The reason why this adapter exists is because Lightning is simply not capable of streaming a “raw” HDMI signal across the cable. Lightning is a serial bus. There is no clever wire multiplexing involved. Contrary to the opinions presented in this thread, we didn’t do this to screw the customer. We did this to specifically shift the complexity of the “adapter” bit into the adapter itself, leaving the host hardware free of any concerns in regards to what was hanging off the other end of the Lightning cable. If you wanted to produce a Lightning adapter that offered something like a GPIB port (don’t laugh, I know some guys doing exactly this) on the other end, then the only support you need to implement on the iDevice is in software- not hardware. The GPIB adapter contains all the relevant Lightning -> GPIB circuitry.
It’s vastly the same thing with the HDMI adapter. Lightning doesn’t have anything to do with HDMI at all. Again, it’s just a high speed serial interface. Airplay uses a bunch of hardware h264 encoding technology that we’ve already got access to, so what happens here is that we use the same hardware to encode an output stream on the fly and fire it down the Lightning cable straight into the ARM SoC the guys at Panic discovered. Airplay itself (the network protocol) is NOT involved in this process. The encoded data is transferred as packetized data across the Lightning bus, where it is decoded by the ARM SoC and pushed out over HDMI.
This system essentially allows us to output to any device on the planet, irregardless of the endpoint bus (HDMI, DisplayPort, and any future inventions) by simply producing the relevant adapter that plugs into the Lightning port. Since the iOS device doesn’t care about the hardware hanging off the other end, you don’t need a new iPad or iPhone when a new A/V connector hits the market.
Certain people are aware that the quality could be better and others are working on it. For the time being, the quality was deemed to be suitably acceptable. Given the dynamic nature of the system (and the fact that the firmware is stored in RAM rather then ROM), updates **will** be made available as a part of future iOS updates. When this will happen I can’t say for anonymous reasons, but these concerns haven’t gone unnoticed.

I do wonder whether, when playing a movie, it's decoding the video and then re-encoding it on the fly for the HDMI dongle, adding lots of extra compression artifacts.
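
Going by the quote, the path for a movie would look roughly like this - just a sketch, and every name below is made up for illustration rather than being a real Apple API:

```python
# Sketch of the pipeline the quote describes; all names are invented labels.

def hardware_h264_encode(frame: bytes) -> bytes:
    # Stand-in for the same hardware encoder AirPlay uses. For a movie that was
    # already compressed, this is a second lossy generation after decoding it.
    return b"h264:" + frame

def packetise(stream: bytes, size: int = 1024) -> list:
    # Lightning is just a fast serial bus, so the encoded stream is sent as
    # packetised data rather than a raw HDMI signal.
    return [stream[i:i + size] for i in range(0, len(stream), size)]

def adapter_decode_and_output(packets: list) -> bytes:
    # Inside the adaptor: the ARM SoC reassembles the packets, decodes the
    # stream and pushes it out over HDMI (reportedly upscaled from 1600x900).
    stream = b"".join(packets)
    return stream[len(b"h264:"):]  # pretend decode, then out of the HDMI port

screen_frame = b"\x00" * 4096  # dummy frame standing in for whatever is on screen
hdmi_output = adapter_decode_and_output(packetise(hardware_h264_encode(screen_frame)))
```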
 