Why Apple’s Fight With The FBI Might Decide The Future Of Privacy

The San Bernardino shootings were a tragedy the FBI is still unraveling. As part of its investigation, the bureau has asked Apple to rewrite parts of iOS in order to unlock an iPhone used by one of the shooters. Apple has refused, and the standoff may prove a landmark moment in deciding whether privacy as we know it survives at all.

Despite what you're seeing in the headlines, the FBI hasn't asked Apple to break its encryption. Instead, the FBI wants a custom version of iOS that disables a key security feature: with the "Erase Data" setting enabled, an iPhone wipes its data completely after ten failed passcode attempts. The FBI has asked that this safeguard be removed so agents can keep guessing passcodes until the phone unlocks.
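To see why that one feature matters so much, here's a minimal Swift sketch of a wipe-after-ten-failures policy and the brute-force attack it defeats. This is an illustration under invented assumptions, not Apple's actual code: the PasscodeGuard type, the passcode, and the guess rate are all hypothetical.

```swift
import Foundation

// A minimal sketch of a wipe-after-ten-failures policy. Illustration only,
// not Apple's implementation: names and numbers here are invented.
struct PasscodeGuard {
    private let correctPasscode: String
    private let maxAttempts = 10
    private var failedAttempts = 0
    private(set) var dataWiped = false

    init(passcode: String) {
        correctPasscode = passcode
    }

    mutating func attemptUnlock(with guess: String) -> Bool {
        guard !dataWiped else { return false }  // a wiped device never unlocks
        if guess == correctPasscode {
            failedAttempts = 0
            return true
        }
        failedAttempts += 1
        if failedAttempts >= maxAttempts {
            dataWiped = true  // on a real iPhone, the encryption key is destroyed
        }
        return false
    }
}

// With the wipe in place, brute force dies after ten guesses.
// Without it, a 4-digit passcode has only 10,000 combinations;
// at even a modest 12.5 guesses per second, that's under 15 minutes.
var device = PasscodeGuard(passcode: "4821")
for candidate in 0..<10_000 {
    let guess = String(format: "%04d", candidate)
    if device.attemptUnlock(with: guess) {
        print("Unlocked with \(guess)")
        break
    }
    if device.dataWiped {
        print("Data wiped after \(candidate + 1) failed attempts")
        break
    }
}
```

Remove the wipe, and the loop above simply runs to completion: every four-digit passcode falls within minutes, which is exactly what the FBI is asking for.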

Apple has refused, and explained why in an open letter. Its objection is simple: once this software exists, Apple has no guarantee it won't be used routinely by law enforcement as a way around the on-device encryption protecting its users. And frankly, Apple has every right to be suspicious.

Law enforcement authorities around the world have spent years trying to force tech companies to build backdoors into their encryption. The FBI has insisted that commercial encryption helps terrorists. UK authorities have proposed backdoor designs for various communications networks. French lawmakers have attempted to mandate hardware backdoors for phone manufacturers. No matter where you go in the world, there's at least one government figure arguing that encryption must be stopped or defeated.

Tech companies have campaigned against these mandates because a backdoor makes their systems dangerously weak. Despite governments' insistence otherwise, any backdoor can be exploited, no matter how closely guarded a secret it is. Stuxnet, widely described as the first cyberweapon, abused a hardcoded password (effectively a backdoor) in Siemens industrial software to target and destroy Iranian centrifuges. U.S. authorities believe Chinese hackers have used backdoors to gain access to critical infrastructure. And that's before considering abuse by government employees: NSA staff have been accused of accessing and sharing nude photos pulled from innocent people's phones.

This request is particularly pernicious because it isn't, technically, a breach of Apple's encryption. Nonetheless, Apple argues, correctly, that once the software exists, it will face mounting pressure to hand it to law enforcement agencies around the world, with no way to control who uses it or for what purpose. Likewise, if Apple complies, pressure will grow on other tech companies to build the same software for law enforcement. It's a backdoor in all but name, and worse, it's one that any hacker who gets hold of it could exploit.

Perhaps the most glaring issue, however, is that the FBI likely doesn't need this tool in the first place. The Justice Department has wide-ranging authority to access private communications, often without a warrant. It's unlikely there's anything on that phone the FBI doesn't already have.

It's not clear whether Apple can win this fight. But the outcome may well decide how much privacy you have in the future, and whether storing any personal information on your phone remains a safe idea.

(via Apple)