Apple vs the FBI has reignited the seemingly endless debate over how we strike a balance between security and privacy, but the case is quickly proving it isn't possible to have both.
If the US government is successful in forcing Apple to downgrade an iPhone’s security, it could have crippling consequences for the company’s bottom line as customers go elsewhere to preserve their privacy, and may also put countless people at risk all over the world.
The FBI has an iPhone 5C, seized from one of the shooters in the San Bernardino massacre, that the agency believes contains information that will aid in its case.
It is unable to access the data, however, because it’s encrypted on the iPhone, protected by a PIN code.
Apple has done such a good job engineering the security systems in its mobile operating systems and devices that the FBI’s forensics guys are powerless.
The encryption is so strong that even with the most powerful supercomputers running a brute force attack, they can’t get in – so the only option is to use the PIN to unlock it.
However, Apple has also layered in another security mechanism that stops attackers running a brute force attack against the PIN code: if an incorrect PIN is entered ten times in a row, the iPhone automatically wipes all the user’s private data.
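The maths here is worth spelling out: a four-digit PIN has only 10,000 possible values, so without the ten-try limit an attacker could exhaust the whole space almost instantly. The sketch below is purely illustrative (it is not Apple's implementation, and the secret PIN and try limit are made-up values) but it shows why the wipe threshold, not the PIN itself, is the real defence:

```python
from itertools import product

# iOS wipes the device after ten incorrect guesses
MAX_TRIES = 10


def brute_force(secret_pin, max_tries=None):
    """Try every 4-digit PIN in order.

    Returns (found_pin, attempts), or (None, attempts) if the
    retry limit is hit first -- i.e. the device wipes itself.
    """
    attempts = 0
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        attempts += 1
        if guess == secret_pin:
            return guess, attempts
        if max_tries is not None and attempts >= max_tries:
            return None, attempts  # locked out: attack fails
    return None, attempts


# With the ten-try limit, almost any PIN survives the attack;
# without it, the entire space falls in at most 10,000 guesses.
print(brute_force("7294", max_tries=MAX_TRIES))  # (None, 10)
print(brute_force("7294"))                       # ('7294', 7295)
```

Removing the limit, as the FBI's custom iOS would, turns an effectively unbreakable lock into one that falls to a trivial loop.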
The FBI has served Apple with a court order requiring it to produce a custom iOS variant that removes the ten-incorrect-tries constraint, allowing the agency's technicians to brute force the PIN.
Apple CEO Tim Cook has said he will not comply with this request, arguing it sets the stage for a slew of cases where the government can demand backdoors be built into any products from any companies that may help with their investigations.
Cook published an open letter to Apple's customers claiming the FBI was setting a "dangerous precedent" by using a magistrates’ court order to "justify an expansion of its authority".
He said he was also concerned the backdoor code might fall into the wrong hands – and let’s face it, this is entirely possible, just ask Juniper – and has the potential to put innocent people’s lives at risk all around the world.
If the code ends up in the hands of oppressive government regimes or in the arsenal of malicious actors, our identities, bank details, health records, personal emails, text messages, family photographs and contacts will all be at risk.
What can be done?
The reality is this is not an argument that can be won. There is no balance that can be struck between privacy and security, since they are diametrically opposed concepts.
If we are to build systems that protect our data, we can’t build in backdoors to reduce the security of the system.
There may be a precedent in this case that some people believe warrants cooperation by Apple, but Cook has decided to take a stand against the bigger issue facing the rest of the world should such a precedent be set.
At the beginning of the month, a bipartisan bill was introduced in the US that sought to prohibit the government from requiring tech companies to build encryption weaknesses into their products, entirely in contradiction to this latest move from the FBI against Apple.
The Juniper story from January shows exactly what can happen if a backdoor or weakness is deliberately introduced into a security system for the purposes of the vendor or the government.
In this case, the backdoor was being exploited by an as yet unknown third party without the knowledge of Juniper for a couple of years before it was discovered in a code review.
Each case will be dealt with on its own merits. There are so many grey areas in each argument that no one can be completely right – some will argue Cook should help the FBI because law enforcement matters more than privacy, especially if it could stop terrorism.
As an industry, we need to be prepared for a battle, because if the government comes knocking on our corporate doors at some point in the future, asking us to do what the FBI has asked of Apple, will we be ready to take a stand as strong as Cook's?