Tim Cook's reply to US Court order for backdoor access to iPhone

Soldato | Joined: 18 Mar 2006 | Posts: 4,148 | Location: Liverpool
Not sure if any of you have seen, but a US judge has ordered Apple to create a back-door to allow access to the phone of the US gunman who committed the December attack in San Bernardino last year.

Story Here


Well, Tim Cook has written a response to that:
A Message to Our Customers

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
The Need for Encryption

Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.
The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
The Threat to Data Security

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.
A Dangerous Precedent

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.

The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim Cook

http://www.apple.com/customer-letter/

I'm glad to see Apple taking such a stance. As many faults as Apple has as a company, I'm glad to see the stance they take on encryption and data security, the same way they did with third-party fingerprint readers and the Error 53 fault.
It'll be interesting to see how this plays out.
 
Nice sales pitch and crowd-pleasing response; I imagine the share price has gone up a little.

You can't blame them for asking, but it's a bit idiotic, especially when they already have flexibility under the Patriot Act.

Getting access to everyone just in case? Bad

Accessing a proven criminal? Good

Apple marketing at its best!
 
Not sure if any of you have seen, but a US judge has ordered Apple to create a back-door to allow access to the phone of the US gunman who committed the December attack in San Bernardino last year.

Story Here


Well, Tim Cook has written a response to that:


http://www.apple.com/customer-letter/

I'm glad to see Apple taking such a stance. As many faults as Apple has as a company, I'm glad to see the stance they take on encryption and data security, the same way they did with third-party fingerprint readers and the Error 53 fault.
It'll be interesting to see how this plays out.

Why doesn't the FBI ask Apple to create a crime unit, then provide them with the phone (under observation) so any tools used to recover the data - and that version of the OS - stay inside Apple?

I know what they're saying: if someone creates such a version, it only takes one employee to sell the binaries and undo the security. The result is that it's the company's security being bypassed, not just the security on the phone (i.e. no single employee has full access to the entire source code to do that by themselves).

Perhaps the US will end up with a UK-style law making it illegal not to provide an encryption key. In the UK that puts the responsibility on the person, not the company.

Operators in the US are already subject to laws that demand communication and messaging (SMS, etc.) can be redirected, monitored, altered/substituted and blocked by government agencies. The move from SMS to iMessage probably bypassed that monitoring too, unless Apple is subject to the same kinds of laws.
 
Much respect to Apple.

Is Apple able to weaken the security of an encrypted iOS device?

The FBI has asked Apple to do two things.

Firstly, it wants the company to alter Farook's iPhone so that investigators can make unlimited attempts at the passcode without the risk of erasing the data.

Secondly, it wants Apple to help implement a way to rapidly try different passcode combinations, to save tapping in each one manually.

http://www.bbc.co.uk/news/technology-35594245
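
A rough back-of-the-envelope sketch of why those two requests matter so much: if guesses can be fed in electronically with no retry limit, even the slow key-derivation step doesn't save a short passcode. The ~80 ms per attempt figure below is an assumption based on Apple's published description of the key-derivation cost, not something stated in this thread.

Code:
# Worst-case brute-force time if passcode attempts can be made
# electronically with no retry limit and no escalating delays.
# ATTEMPT_COST_S is an assumed per-guess cost (hardware key derivation).
ATTEMPT_COST_S = 0.08  # seconds per passcode attempt (assumption)

def brute_force_time(keyspace: int) -> str:
    """Time to try every passcode at ATTEMPT_COST_S seconds per try."""
    seconds = keyspace * ATTEMPT_COST_S
    if seconds < 3600:
        return f"{seconds / 60:.1f} minutes"
    if seconds < 2 * 86400:
        return f"{seconds / 3600:.1f} hours"
    return f"{seconds / 86400:.1f} days"

for digits in (4, 6):
    keyspace = 10 ** digits
    print(f"{digits}-digit passcode: {keyspace:,} combinations, "
          f"worst case ~{brute_force_time(keyspace)}")

# 4-digit passcode: 10,000 combinations, worst case ~13.3 minutes
# 6-digit passcode: 1,000,000 combinations, worst case ~22.2 hours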
 
Massive respect to Apple for showing a strong response to the US government. Hope we see other tech companies such as Google lend their support too. It'll be interesting to see how the US Government continues from here...
 
Massive respect to Apple for showing a strong response to the US government. Hope we see other tech companies such as Google lend their support too. It'll be interesting to see how the US Government continues from here...

I can't see how Google, or anyone else in the tech industry, could fail to support this. It needs to be a united front: if one falls, they all fall.
 
I don't think it's really as simple as US government = bad and Apple = good on this subject.

While I disagree with a blanket backdoor and mass surveillance, I'm not sure I agree with those who believe devices should never be able to be unlocked when due process is followed. Each case needs to be considered on its own merits, the threshold needs to be high before it's considered, and it should be transparent.

In reality, what Apple have created at the moment is an anonymous, secure and user-friendly platform for people to use in their daily lives, which protects them from opportunist attacks by hackers and thieves - a good thing.

But that also creates an anonymous, secure and user-friendly platform for criminals and extremists to communicate with relatively low risk and no chance of the data being recovered by law enforcement in the event they are apprehended. I don't think anyone 'reasonable' would argue that this is good.

Consider this situation: following a terrorist-style attack where the assailant was killed and was likely radicalized by others still at large, and data on their phone may help that strand of the investigation, do you think Apple are right to take this stance?

Ask yourself the same question, but where your family member or friend was killed in the above - do you still think it's right?
 
Consider this situation: following a terrorist-style attack where the assailant was killed and was likely radicalized by others still at large, and data on their phone may help that strand of the investigation, do you think Apple are right to take this stance?

Ask yourself the same question, but where your family member or friend was killed in the above - do you still think it's right?

Of course - that isn't to say I'd like it, but the only people who suffer if encryption is weakened are the good people; there are still a million other ways people up to no good can hide what they are doing.

And who is to say that once they finally get into the device they can even read anything on it anyway? If the perpetrators were half prepared, they'd probably have used some kind of cipher of their own as well.
 
Apple's stance is correct in my opinion. Those who utilise encryption for subversive purposes have many places to turn if iPhone encryption is hacked. It's the liberty of those of us who value our privacy that will be damaged by such moves. The government is overstepping its bounds.
 
I don't think it's really as simple as US government = bad and Apple = good on this subject.

While I disagree with a blanket backdoor and mass surveillance, I'm not sure I agree with those who believe devices should never be able to be unlocked when due process is followed. Each case needs to be considered on its own merits, the threshold needs to be high before it's considered, and it should be transparent.

In reality, what Apple have created at the moment is an anonymous, secure and user-friendly platform for people to use in their daily lives, which protects them from opportunist attacks by hackers and thieves - a good thing.

But that also creates an anonymous, secure and user-friendly platform for criminals and extremists to communicate with relatively low risk and no chance of the data being recovered by law enforcement in the event they are apprehended. I don't think anyone 'reasonable' would argue that this is good.

Consider this situation: following a terrorist-style attack where the assailant was killed and was likely radicalized by others still at large, and data on their phone may help that strand of the investigation, do you think Apple are right to take this stance?

Ask yourself the same question, but where your family member or friend was killed in the above - do you still think it's right?

Do you not understand what they are saying? They can't unlock a phone for one individual without giving everyone else the ability to unlock it.

Which means stalkers, hackers and other nasty people will find this exploit, steal all your family members' data, and rob them or do even nastier things to them.

I mean, everyone in government would surely have to stop using the phones, because they would be so insecure and dangerous. I don't know whether companies use iPhones, but if they communicate sensitive data then they couldn't use an iPhone - or any phone at all.

It's insane; it only helps everyone else. Any other country interested in that data would have a field day - Russia, China and North Korea would find it hugely lucrative and easy to steal money and information.
 
I saw something about them being asked not to break the encryption itself, but to change the phone so they can guess the passcode as many times as they want (for this phone, in this case). Or did they ask for that and something broader?

They are asking for a tool that could submit passcode guesses electronically rather than having each code typed in manually. It would also remove the wipe after 10 incorrect attempts. This effectively breaks the encryption, as it allows the device to be brute-forced and fully unlocked.
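
To make that concrete, here's a minimal sketch of the protection being discussed - with the 10-attempt wipe in place even a simple 4-digit passcode survives a brute-force attack, and with it removed the whole keyspace can be walked through. This is an illustration only, not Apple's actual firmware; the passcode "7294" and the limit behaviour are assumptions for the example.

Code:
# Conceptual illustration of the 10-failed-attempts wipe described above.
import itertools

WIPE_AFTER = 10  # with "Erase Data" enabled, the 10th failure destroys the keys


def brute_force(correct: str, digits: int = 4, attempt_limit: int | None = WIPE_AFTER) -> str:
    """Try every numeric passcode in order; give up (data lost) if the limit is hit."""
    guesses = ("".join(p) for p in itertools.product("0123456789", repeat=digits))
    for tries, guess in enumerate(guesses, start=1):
        if attempt_limit is not None and tries > attempt_limit:
            return f"wiped after {attempt_limit} failures - data unrecoverable"
        if guess == correct:
            return f"unlocked after {tries:,} attempts"
    return "keyspace exhausted without a match"


print(brute_force("7294"))                      # wiped after 10 failures - data unrecoverable
print(brute_force("7294", attempt_limit=None))  # unlocked after 7,295 attempts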
 
Hopefully, if there is currently a way to update iOS on a locked device without it instantly binning the encryption keys and erasing everything, a protection against that will be added in future hardware/software revisions.
 
Google CEO Sundar Pichai has chimed in on the escalating battle between the FBI and Apple over iPhone encryption. Describing the letter published by Apple's Tim Cook as "important," Pichai says that a judge's order forcing Apple to assist the FBI in gaining access to the data on a terrorist's iPhone "could be a troubling precedent."

Source - TheVerge.com
 
Good article written for MacWorld here.
I was looking for a quote to paste here, but the article needs to be read in full to have the impact required.
 
Consider this situation: following a terrorist-style attack where the assailant was killed and was likely radicalized by others still at large, and data on their phone may help that strand of the investigation, do you think Apple are right to take this stance?

I literally change my opinion on this every five minutes. The government can't be trusted with the ability to unlock phones, but at the same time we don't want criminals and terrorists to have this amazing way to plot and plan with no interference. Really difficult.
 
I literally change my opinion on this every five minutes. The government can't be trusted with the ability to unlock phones, but at the same time we don't want criminals and terrorists to have this amazing way to plot and plan with no interference. Really difficult.

Yes, it is a difficult one. I wonder how Tim Cook's insular view of protecting American freedoms would change if they had as many terrorist threats and attacks as we have in Europe and the Middle East. I suspect he would sing a different tune if a few serious incidents happened over there and the cells were using iPhones.

Just waiting for someone to start an "iPhone: the Terrorist's Choice" campaign on social media... That would get quite a reaction, that's for sure.
 