Privacy vs Security: Should Apple have a Back Door?

By John Lister

Apple has told a court that it is impossible to access the data on most iPhones and iPads without the owner's password. The claim could lead to a legal standoff in the 'security versus privacy' debate.

The comments came in a case involving a recently seized iPhone. The United States Justice Department is unable to access the contents of the phone and has therefore asked the court to order Apple to help it gain access. In this specific case, however, Apple is technically able to access the device's data because the phone is running an older, susceptible operating system (iOS 7). Nonetheless, Apple has made a point of filing a document to put on record that it does not normally hack into phones to provide such data to third parties.

Decryption Impossible Without Password

Apple explained that for devices running iOS 8 or iOS 9, enhanced encryption means it is literally impossible for the company to access data on a device if the owner has set a password and refuses to provide it. That filing raises two separate legal questions for courts to decide.

The first is whether the Justice Department has the right to force Apple to help it carry out search warrants in cases where it does have the technical ability to access phone data. According to Apple, that should only happen where a court has expressly reviewed a case and issued an order. Apple's lawyers say that "Forcing Apple to extract data in this case, absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand." (Source: reuters.com)

Court Order Could Have Farcical Outcome

The second question is what happens in such cases when the phone is running iOS 8 or iOS 9. It's not yet clear what would happen if a court ordered Apple to decrypt phone data but Apple insisted it was technically unable to do so. In theory, a judge could hold Apple in contempt for failing to do the impossible, a situation that would no doubt escalate through the court system.

Apple boss Tim Cook has previously rejected the idea of including a "back door" that would give law enforcement officials the ability to access data in such cases. He says that would be a security risk "because you can't have a back door that's only for the good guys." (Source: siliconbeat.com)

What's Your Opinion?

Which should take priority in this situation: privacy or security? Was Apple smart to beef up security so that court orders to decrypt data become irrelevant in practical terms? Should the law force device manufacturers to allow scope for law enforcement to access suspects' devices?


Comments

Dennis Faas:

The quote that "you can't have a back door that's only for the good guys" says it all. If you put in a back door in order to circumvent security, it will surely be exploited by third parties (such as hackers), and you would never even know you were being hacked. It's quite simply a bad idea. Likewise, if I knew that Apple devices were easily hacked, I doubt very much I would buy an Apple device. As such, I believe they have every right to defend their argument.

Don Cook:

NO! All devices that are protected by a security system can only be accessed by the person or company employing the system. FULL STOP for yanks. The info kept is NOT for any other person's eyes or ears.

alan.cameron_4852:

>drh.cook
What did you mean by "FULL STOP for yanks"? It should apply everywhere, NOT JUST to yanks.

guitardogg:

If the cops or the DOJ or whoever can't make a case without cell phone info, they probably don't have much of a case anyway. Privacy has to trump security! I don't want to hamper law enforcement, but we've given up too much already in the name of security!

matt_2058:

Should Apple have a backdoor? That is only for them to decide, and they must deal with the consequences: brand image, security, court orders, etc. As mentioned, it starts a mess that snowballs out of control.

Apple was smart to beef up the security. It basically removed the company as the middleman in the fight between law enforcement goals and citizen privacy. Now it doesn't have to weigh the repercussions for either party.

Your last question is the sledgehammer. A law to force manufacturers to build a backdoor for law enforcement? I can understand laws driving design for safety, but not for law enforcement to access a device. Can you imagine the scene once the access is compromised? Or when a government employee or contractor abuses the access?

All too soon we forget how goofy people get when given access to private information:
An IRS agent looked up personal info on 197 celebrities, a neighbor, and 4 personal business acquaintances.

"The Internal Revenue Service fired 23 employees, disciplined 349 and counseled 472 after agency audits found that government computers were still being used to browse tax records of friends, relatives and celebrities."

IRS audits covering "fiscal 1994 and 1995 listed 1,515 cases where employees were accused of misusing computers."

"A former Seminole County deputy was stripped of his law enforcement certification after accessing information on more than 125 people, including celebrities, co-workers and ex-girlfriends."

Secret Service employees violated the Privacy Act: "Some thought that accessing such a record, even to satisfy personal curiosity, was appropriate..." A lame excuse, because ALL US Government employees sign a statement acknowledging the Privacy Act.

fly_5659:

Sure, Law&Order types want full transparency (except for their own information) to make their job easier, but Apple's new policy means the Law&Order types are just where they would be if the communicators whispered to each other out in a remote field, or used any other form of REALLY private communication. Nobody made WWII spies give the spooks their code books, and Germany wasn't required to register its Enigma machines, so why should spooks expect all private communications to include a side trip through them?