Ed. Note: This post, by Benjamin Vanlalvena, is part of the NALSAR Tech Law Forum Editorial Test 2016.
A background of the issue
On December 2, 2015, 14 people were killed and 22 were seriously injured in a terrorist attack at the Inland Regional Center in San Bernardino, California, which consisted of a mass shooting and an attempted bombing. On February 9, 2016, the FBI announced that it was unable to unlock the iPhone used by one of the shooters, Farook. The FBI initially asked the NSA to break into the iPhone; when that failed to resolve the issue, it asked Apple to create a modified version of the phone’s operating system that would disable the security features on that particular device.
Apple, however, refused, which led the Department of Justice to apply to a United States magistrate judge, who issued a court order requiring Apple to create and provide the requested software; Apple was given until February 26, 2016 to respond to the order. Apple announced its intention to oppose the order, and the Department of Justice responded by filing a new application to compel Apple to comply. It was later revealed that investigators had discussed methods of accessing the data in January; however, a mistake by the investigating agencies ruled that method out. On March 28, 2016, the FBI announced that it had unlocked the phone and withdrew the suit.
The dilemma
Privacy is a recognised fundamental right under Article 17 of the International Covenant on Civil and Political Rights and Article 12 of the Universal Declaration of Human Rights.
Encryption is a process through which one encodes or secures a message or data so that its content is readable only by an authorized party or by someone who holds the decryption key. Apple claims that it does not perform data extractions, as the ‘files to be extracted are protected by an encryption key that is tied to the user’s passcode, which Apple does not possess.’ This, according to FBI Director James Comey, is a cause for concern, as it means that even with a court order the contents of a criminal’s device would not be accessible. A backdoor or ‘golden key’ is slightly different [though not totally] from mass surveillance: agencies would have the capability to access data stored on devices, rather than constantly monitoring data. The issue is no longer one of constant surveillance but of the potential for other, non-governmental persons to gain access through some illegitimate means. The major contention is the assumption either that those who have access to the key are ‘good people’ who have our interests in mind, or that the backdoor would be accessible only to the government. The Washington Post reported that the FBI had (after failing to get Apple to comply) paid professional hackers to assist them in cracking the San Bernardino terrorist’s phone. This is itself a cause for concern, as it is proof that vulnerabilities exist even in phones that seem secure.
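To make the passcode point above concrete, the short Python sketch below shows what an encryption key “tied to the user’s passcode” can look like in principle. It is a minimal illustration only, assuming the third-party ‘cryptography’ package; the key-derivation parameters, the sample passcode and the data are hypothetical and do not represent Apple’s actual design.

```python
# Minimal sketch of passcode-tied encryption (illustrative only; not Apple's
# actual scheme). Requires the third-party 'cryptography' package.
import base64
import hashlib
import os

from cryptography.fernet import Fernet


def derive_key(passcode: str, salt: bytes) -> bytes:
    """Derive a 32-byte encryption key from a user passcode via PBKDF2."""
    raw = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
    return base64.urlsafe_b64encode(raw)  # Fernet expects a base64 key


salt = os.urandom(16)                      # stored alongside the ciphertext
key = derive_key("1234", salt)             # hypothetical passcode
ciphertext = Fernet(key).encrypt(b"contacts, messages, photos ...")

# Only someone who knows the passcode (and salt) can reconstruct the key:
plaintext = Fernet(derive_key("1234", salt)).decrypt(ciphertext)

# A wrong passcode yields a different key, so decryption simply fails:
# Fernet(derive_key("0000", salt)).decrypt(ciphertext)  # raises InvalidToken
```

Because the key is derived from the passcode and never stored anywhere, neither the manufacturer nor anyone else can hand over the contents without the passcode itself.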
Data that is encrypted cannot be considered totally secure if some party has a means to bypass that encryption. The FBI’s request is therefore problematic, as it would give the agency a backdoor to the data, a vulnerability that affects all users. One should bear in mind that the trade in such ‘zero-day vulnerabilities’ is not unheard of, and the NSA or FBI holding tools that undermine the protections keeping our data secure is problematic, since such tools could end up in the hands of hackers or be leaked. One of the most hard-hitting points raised is the issue of national interest: that terrorists or paedophiles use encryption and that it is a “safe space” for them. However, according to the former NSA chief, Michael Hayden, creating a backdoor would be futile, as terrorists would simply build their own apps based on open-source software; the presence of a backdoor would only make innocent persons less secure and more vulnerable to those who would take advantage of it.
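The structural weakness of a ‘golden key’ can be sketched in a few lines, as below. This key-escrow design is hypothetical and is not any agency’s actual proposal: every per-device key is also wrapped under a single escrow key, so whoever obtains that one key can read every device.

```python
# Hypothetical "golden key" / key-escrow sketch (not any real proposal):
# each device key is also wrapped under a single escrow key, so whoever
# holds -- or steals -- the escrow key can unwrap any device key.
from cryptography.fernet import Fernet

escrow_key = Fernet.generate_key()         # the single "golden key"
escrow = Fernet(escrow_key)


def provision_device(secret_data: bytes):
    """Encrypt data under a per-device key, but escrow a copy of that key."""
    device_key = Fernet.generate_key()
    ciphertext = Fernet(device_key).encrypt(secret_data)
    wrapped_key = escrow.encrypt(device_key)   # the backdoor: key under key
    return ciphertext, wrapped_key


ciphertext, wrapped_key = provision_device(b"private user data")

# Anyone holding escrow_key -- an agency, an insider, or a thief who
# exfiltrated it -- can recover the device key and read the device:
stolen_device_key = escrow.decrypt(wrapped_key)
print(Fernet(stolen_device_key).decrypt(ciphertext))
```

The point of the sketch is that the escrow key becomes a single point of failure: its secrecy, not the user’s passcode, now determines the security of every device.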
While the intentions of the agencies might be good or in the public interest, one should keep in mind that once a backdoor is provided, not only does it set a dangerous precedent, but the danger of such a bypass leaking and affecting the lives of ordinary people is enormous.
For more information, visit:
https://tcf.org/content/commentary/weve-apple-encryption-debate-nothing-new/
https://www.aclu.org/feature/community-control-over-police-surveillance
https://www.ctc.usma.edu/posts/how-terrorists-use-encryption