End-to-end encryption under threat

Or how about I don't particularly want anyone who steals my phone to have access to all my personal data and financial information?

Anyone saying that Apple are in the wrong here needs to get a grip. Some people died because of a couple of nutcases. It's not a reason to suddenly decide that encryption is evil.
I'm not saying the encryption is evil, in fact I think it is great, and should exist and be implemented. The whole scenario is just the same as the feds asking to do a wiretap. All they want is access, and Apple can facilitate that without creating a backdoor that affects everyone. Heck, they aren't even asking for a backdoor, just an engineering special, a one-off, that can only be loaded onto a single particular phone and that disables features.
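For context on why that one-off matters: what's being asked for is, in effect, removing the retry limit, the auto-erase and the escalating delays so the passcode can be brute-forced. A rough back-of-the-envelope in Python, taking the ~80 ms per guess figure purely as an assumption for illustration:

```python
# Rough estimate of how long a passcode brute force takes once the retry
# limit, auto-erase and escalating delays are removed.
GUESS_TIME_S = 0.08  # assumed time per guess; illustrative only

for digits in (4, 6):
    keyspace = 10 ** digits                   # all numeric passcodes of this length
    worst_case_hours = keyspace * GUESS_TIME_S / 3600
    print(f"{digits}-digit passcode: {keyspace:,} codes, "
          f"worst case ~{worst_case_hours:.1f} hours")

# 4-digit: 10,000 codes    -> roughly 0.2 hours
# 6-digit: 1,000,000 codes -> roughly 22 hours
```

In other words, with those protections switched off, the investigators could simply try every code.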
 
So how do you think that changing the legal status of encryption will stop terrorists using it? Genuinely curious as to why you think people plotting to murder other human beings will suddenly go, 'Encryption is illegal now, better go back to sending our super secret messages using SMS instead of our ISIS-provided custom encryption software!'

Firstly, because you won't be able to sell phones with E2E encryption in the UK, getting hold of one will be more difficult. It'll be possible to identify suspects by their possession of a phone that supports E2E encryption, and I'd suggest that anyone who has an ISIS-provided app on their smartphone is also easier to identify as a suspect than everyone who owns an iPhone. Secondly, while using custom encryption might protect their messages, they are still people and prone to mistakes, such as using the software incorrectly or sending an SMS by mistake. At the moment it's too easy for them to buy a phone off the shelf and immediately start sending messages that the authorities can't read, even after they've died in a suicide attack.
 
Well the law disagrees, and says it is fine to go about creating a so-called back door to aid FBI investigators. I believe in the rule of law. If the risk is too much for you, don't own an iPhone 5C.

Slavery was once perfectly legal. Would you argue that it should be now?

Times change and the law needs to move with them. We should absolutely be debating this and challenging it, not just rolling over and saying OK because a low-level court said so.

If this is upheld all the way to the Supreme Court, it may be time to roll over and say OK. But not as it stands now.
 
Slavery was once perfectly legal. Would you argue that it should be now?

Times change and the law needs to move with them. We should absolutely be debating this and challenging it, not just rolling over and saying OK because a low-level court said so.

If this is upheld all the way to the Supreme Court, it may be time to roll over and say OK. But not as it stands now.

Seriously? You just compared slavery to breaking into a dead terrorist's phone :rolleyes:
 
Firstly, because you won't be able to sell phones with E2E encryption in the UK, getting hold of one will be more difficult. It'll be possible to identify suspects by their possession of a phone that supports E2E encryption, and I'd suggest that anyone who has an ISIS-provided app on their smartphone is also easier to identify as a suspect than everyone who owns an iPhone. Secondly, while using custom encryption might protect their messages, they are still people and prone to mistakes, such as using the software incorrectly or sending an SMS by mistake. At the moment it's too easy for them to buy a phone off the shelf and immediately start sending messages that the authorities can't read, even after they've died in a suicide attack.

Because smuggling a phone into the UK is incredibly difficult. Are you seriously suggesting making the personal information on 99.99999% of people's phones less secure just to foil a handful of terrorist attacks? It's an idiotic idea.
 
I'm not saying the encryption is evil, in fact I think it is great, and should exist and be implemented. The whole scenario is just the same as the feds asking to do a wiretap. All they want is access, and Apple can facilitate that without creating a backdoor that affects everyone. Heck, they aren't even asking for a backdoor, just an engineering special, a one-off, that can only be loaded onto a single particular phone and that disables features.

It's the precedent it sets though. If Apple roll over and acquiesce to this request then they'll have to do it for any future requests. It can and will get abused by government/law enforcement agencies. The FBI have wanted access to encrypted phones for a while; they're making a huge deal out of this one because they want to play on the whole ISIS/terrorism angle. If they'd really wanted to get access to the phone then they might have been better off not executing the couple on sight.
 
Because smuggling a phone into the UK is incredibly difficult. Are you seriously suggesting making the personal information on 99.99999% of people's phones less secure just to foil a handful of terrorist attacks? It's an idiotic idea.

It's more difficult than buying one in this country, and it also highlights you, makes you stand out from everyone else, which is to the terrorists' detriment. Yes, I am suggesting that potentially stopping terrorist acts is worth making 99.99% of messages (99.99% are not of interest at all) subject to authority intercept. The right to life comes before the right to privacy for me.

It's the precedent it sets though. If Apple roll over and acquiesce to this request then they'll have to do it for any future requests.

It's already established that corporations have to follow the law.
 
Seriously? You just compared slavery to breaking into a dead terrorist's phone :rolleyes:

No, I used another example of the law moving with the times.

If you'd like a more relevant example, let's work it the other way. Pretty sure unauthorized use of a computer wasn't a crime back in the 1800s either. It now is... because the law has to move with the times.

Then of course you have the cases where the courts simply got it wrong. It doesn't take long to find an example of that:

http://www.bbc.co.uk/news/uk-35598896

And this is even before we deal with the fact that this has so far only been through one court. Apple have right of appeal, which can go to the district court and then on to the Supreme Court.

What happens if the district court overturns this on appeal and sides with Apple? Who is right then?
 
I think the line is: when the courts say that encryption must be broken, then that encryption must be broken.

So what if I won't give up my encryption key?

I think I read that a 128-bit key would take 1.4 billion years to brute-force, and most people use 256-bit now.

Good luck with that.

Unless you're talking about back doors into all encryption methods, which would be plain madness.
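For a sense of scale, a quick sketch (the guess rate here is an assumption purely for illustration) suggests 1.4 billion years undersells it, if anything:

```python
# Back-of-the-envelope: time to exhaust a 128-bit keyspace by brute force.
GUESSES_PER_SECOND = 1e12            # assumed rate, purely illustrative
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2 ** 128                  # ~3.4e38 possible keys
worst_case_years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
print(f"~{worst_case_years:.2e} years")   # ~1.08e+19 years
```

A 256-bit key squares the keyspace again, so it only gets worse for the attacker.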
 
So what if I won't give up my encryption key?

I think I read that a 128-bit key would take 1.4 billion years to brute-force, and most people use 256-bit now.

Good luck with that.

Unless you're talking about back doors into all encryption methods, which would be plain madness.

I believe it's a criminal offence now to not give up your private key/password when requested. Can't remember what the penalty is, but I seem to recall it involves jail time.
 
I believe it's a criminal offence now to not give up your private key/password when requested. Can't remember what the penalty is, but I seem to recall it involves jail time.

Probably less jail time than giving up the password :D.

Edit: the longest sentence was 9 months, for refusing to give up an encryption key to anti-terror police:

http://www.theregister.co.uk/2009/11/24/ripa_jfl
 
What would happen if you had an encryption system where you needed to keep track of a key/login detail at specific times (a rough example being a time-dependent authentication key without the app to view it), otherwise the data is lost forever?

What then?
Auto-Arresto-time-o?
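For what it's worth, the time-dependent authentication key described above is essentially what a TOTP code is. A minimal sketch of the idea (the secret and parameters are made-up examples, not any real system's values): you can only generate the code while you hold the shared secret, so lose the app holding it and you're locked out.

```python
# Minimal TOTP-style time-dependent code (RFC 6238 flavour).
# The secret and parameters are made-up examples for illustration.
import hashlib
import hmac
import struct
import time

SECRET = b"example-shared-secret"   # hypothetical; normally provisioned by the app
STEP = 30                           # a new code every 30 seconds

def current_code(secret: bytes, step: int = STEP, digits: int = 6) -> str:
    counter = int(time.time()) // step                      # current time window
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                  # dynamic truncation
    value = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

print(current_code(SECRET))  # useless to anyone who doesn't hold the secret
```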
 
What would happen if you had an encryption system where you needed to keep track of a key/login detail at specific times (a rough example being a time-dependent authentication key without the app to view it), otherwise the data is lost forever?

What then?
Auto-Arresto-time-o?

There was a pretty famous story of a programmer making such a thing as a novelty, then being hospitalised, IIRC, when he got run over. By the time he came to, he'd lost around six months on a contract he was working on.

Annoyingly I can't find a link atm so either it's that old or I'm making stuff up and "remembering" it unintentionally :D.
 
What is it that's so bloody interesting that it requires all the data on your phone to be encrypted? Yes, banking etc. is a big deal, but encryption is not relevant here unless you're so forgetful you need to write down your passwords on your phone.
Seriously, the amount of 'civil liberty / it sets a precedent' whining nonsense going on here is utterly depressing.
The police want a way to access a phone that doesn't take ages. This may potentially save lives. They are not asking for a back door to all encrypted phones, just this one, which belonged to a terrorist who killed countless innocent people.
How are people not getting this?
 
I think the line is: when the courts say that encryption must be broken, then that encryption must be broken. End-to-end encryption is a terrorist's wet dream, and we know they're using it against us.

But they really aren't.

The Paris attackers just used normal SMS texts on prepaid phones.

No fancy encryption, nothing.
 
Unless you're talking about back doors into all encryption methods, which would be plain madness.

You've got to remember Scorza thought (and supported the idea) that all UK door locks had police master keys...


He doesn't consider it a risk that someone else would ever break the deliberately vulnerable backdoor/master key.
 
Firstly because you won't be able to sell phones with E2E encryption in the UK, getting hold of one will be more difficult. It'll be possible to identify suspects by their possession of a phone that supports E2E encryption. I suggest that anyone who has an ISIS provided app on their smartphone is also easier to identify as a suspect, rather than everyone who has an iPhone. Secondly, while using custom encryption might protect their messages, they are still people and prone to mistakes such as incorrectly using the software, or sending an SMS by mistake. It's too easy for them to get a phone off the shelf and automatically start sending messages that the authorities can't read after they've died in a suicide terror attack.

The actual attack preparation has to happen in the physical world, and transactions have to be made that leave a trace. The way that encrypted communication is being portrayed, people would think it's solely responsible for all successful terror attacks, which just isn't the case.

And unless you're going to force all phones sold in the UK to be incapable of running software, then there's no such thing as a phone that doesn't support end-to-end encryption. What's stopping someone wanting to carry out a terrorist attack from buying an iPhone and an Apple Developer account, loading an open-source encrypted messaging app onto a select few iPhones, and giving it the Facebook icon? Or should the government have the right to install device management profiles on your personal devices to ensure this isn't happening, and have a massive web filtering system in place to prevent anything that could potentially be used in a terrorist attack from being available in the UK?

I don't want my personal data and communications to be made less secure on purpose simply because people are afraid of the terrorist boogeyman. Tim Cook is right to take a stand on this, and if it forces others to publicly take the same position then that's even better.
 
Firstly because you won't be able to sell phones with E2E encryption in the UK, getting hold of one will be more difficult. It'll be possible to identify suspects by their possession of a phone that supports E2E encryption. I suggest that anyone who has an ISIS provided app on their smartphone is also easier to identify as a suspect, rather than everyone who has an iPhone. Secondly, while using custom encryption might protect their messages, they are still people and prone to mistakes such as incorrectly using the software, or sending an SMS by mistake. It's too easy for them to get a phone off the shelf and automatically start sending messages that the authorities can't read after they've died in a suicide terror attack.

Why would you need a phone?

GSM modules can be bought very cheaply, and even come already set up for all the common microcontrollers/mini computers like the Raspberry Pi or Arduino.
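For what it's worth, driving one of those modules is just a handful of AT commands over a serial link. A rough sketch with pyserial, assuming a SIM800-style module; the port name and phone number are placeholders:

```python
# Rough sketch: sending an SMS through a cheap GSM module (SIM800-style)
# wired to a Raspberry Pi. Port name and number are placeholders.
import time
import serial  # pyserial

ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

def at(cmd: str, wait: float = 0.5) -> bytes:
    """Send one AT command and return whatever the module replies."""
    ser.write((cmd + "\r").encode())
    time.sleep(wait)
    return ser.read(ser.in_waiting or 1)

print(at("AT"))                        # module alive?
print(at("AT+CMGF=1"))                 # switch to text mode
print(at('AT+CMGS="+440000000000"'))   # placeholder recipient
ser.write(b"hello" + bytes([26]))      # message body, Ctrl-Z terminates it
time.sleep(3)
print(ser.read(ser.in_waiting or 1))
```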
 
Because smuggling a phone into the UK is incredibly difficult. Are you seriously suggesting making the personal information on 99.99999% of peoples phones less secure just to foil a handful of terrorist attacks? It's an idiotic idea.

Alternatively, you could just download an app and communicate through that. No need for all the convolution Scorza has come up with.

An E2E encrypted version of Kik or WhatsApp would do fine and is easily available. If it were banned in certain countries, then the user would just need to change the location of their phone in account settings, which is what people do now to get certain apps.

Unfortunately, much like many lawmakers and politicians, Scorza doesn't seem to understand the Internet and the problem with national laws on an international stage.
 