Monday, February 22, 2016

Apple, the FBI, and the Cost of Bypassing Encryption

The fight over liberty versus security is in the news again. The Federal Bureau of Investigation (FBI) obtained a court order directing Apple to bypass security features on an iPhone 5c used by Syed Farook, one of the attackers in the San Bernardino shooting back in December. Tim Cook, Apple's CEO, released a poignant letter saying that Apple refuses to comply. In response, the Department of Justice filed a motion to force Apple to assist the FBI. It seems like a simple request: help the FBI unlock the phone of a terrorist who possibly had ties to ISIS. Why is Apple taking such a forceful stand?

At first glance, this seems like an open-and-shut Fourth Amendment case, especially since Farook is dead and the information could potentially help bring down certain terrorist organizations. While there is a general expectation of privacy in American jurisprudence, it can be overridden if the government has acquired a warrant, which is permissible under the Fourth Amendment. What is being used to compel Apple's compliance, however, is an unprecedentedly broad reading of the All Writs Act of 1789, a statute that has been invoked to conscript individuals or companies not party to a given case. While it's true that the Fourth Amendment allows for warrants, a warrant still needs to be executed in a reasonable manner. The Supreme Court ruled in United States v. New York Telephone Co. that the government cannot compel a company's assistance if doing so would place "unreasonable burdens" on it. A federal magistrate judge in New York has also implied, in preliminary comments on another case involving an Apple phone, that the All Writs Act doesn't apply, which could help Apple's case during litigation. I'm sure the ensuing legal battle will make for interesting commentary, but I wonder just how much of an unreasonable burden such compliance would actually impose.

Much like other versions of the iPhone, the iPhone 5c is encrypted, which is to say only someone with the key (derived from the user's passcode, e.g., a four-digit code) can access its data. It's a good thing that smartphones are encrypted because, as I brought up a couple of years ago, smartphones store information on banking, health, and location, as well as private text conversations. Encryption is also a great protection if your phone is stolen by criminals, or if you're visiting a totalitarian country and the police decide to seize your phone. The fact that the FBI cannot bypass the encryption shows just how secure it can be.
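The "key" here is not the four-digit passcode itself: the passcode is stretched into a full-length encryption key by a key-derivation function (real iPhones also entangle a key fused into the device hardware, so the key cannot be recomputed off the device). A minimal sketch of the idea in Python; the salt, iteration count, and passcode below are illustrative stand-ins, not Apple's actual parameters:

```python
import hashlib

# Illustrative parameters; not Apple's actual scheme.
PASSCODE = "1234"             # a four-digit passcode
SALT = b"device-unique-salt"  # in practice, bound to the individual device
ITERATIONS = 100_000          # work factor: slows down every single guess

# Stretch the short passcode into a 256-bit encryption key.
key = hashlib.pbkdf2_hmac("sha256", PASSCODE.encode(), SALT, ITERATIONS)
print(key.hex())              # the stored data is unreadable without this key
```

The high iteration count is a deliberate design choice: it makes each individual guess expensive, so even a short passcode buys meaningful protection when guessing is confined to the device.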

Apple has already provided the FBI with the data Farook stored in iCloud a month before the attack. Apple also provided the FBI with data directly in its possession, so it's not as if Apple is being stubborn or recalcitrant as a form of baseless defiance. This case is not your standard Fourth Amendment issue in which the government has a warrant to access a house. It is more analogous to telling the manufacturer of a house's locks that it must come on-site and unlock the house so the police can gain entry, which would explain why the All Writs Act is being invoked in hopes of compelling Apple. While the circumstances of this case are unusual (e.g., the owner of the property is dead), it would set a precedent that the government can conscript a third party to gain access to someone else's property.

The security measures causing so much trouble have to do with the passcode. Beyond an enforced delay between guesses, ten failed attempts at entering the passcode can lock the phone and erase its data, which is why "brute force" wouldn't work in this case. The flaw in the iPhone 5c, an older model, is that the software enforcing these limits is not itself protected from replacement. As such, the FBI is asking Apple to create a new version of iOS, Apple's operating system, with those protections stripped out. By installing this new version on Farook's phone, which can be done without the owner's consent, the FBI would be able to access the information it wants. Aside from the fact that such software doesn't currently exist, what's the issue with this request? The FBI makes it sound as if this were a one-time request that would remain an isolated incident. The issue, however, is that according to industry officials, the hacked software being requested would be general and would work on any phone of the same model. Cook's objection is that creating a version of iOS that bypasses security measures in this manner would create a backdoor. The idea of a backdoor sounds alarming enough, and an argument could even be made that such an order would violate the First and Fifth Amendments (believe it or not, Bernstein v. United States and Junger v. Daley held that software source code is protected by the First Amendment). What are the possible outcomes of legal action? I see four possible outcomes.
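Before turning to those outcomes, it helps to see why the erase-after-ten-failures measure is so effective. A four-digit passcode has only 10,000 combinations, so unlimited guessing recovers it almost instantly; a ten-attempt limit leaves an attacker roughly a 0.1 percent chance of success before the data is gone. A toy simulation (the key-derivation function here is an illustrative stand-in, not Apple's actual scheme):

```python
import hashlib

def derive(passcode: str) -> bytes:
    # Illustrative key derivation; not Apple's actual scheme.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), b"salt", 1000)

target = derive("7412")  # stands in for the unknown passcode's key

# Unlimited guessing: all 10,000 four-digit codes fall in seconds.
found = next(c for c in (f"{i:04d}" for i in range(10_000))
             if derive(c) == target)
print(found)  # prints 7412

# With a wipe after ten failed attempts, only ten codes can ever be
# tried: about a 0.1% chance of success before the data is erased.
MAX_ATTEMPTS = 10
wiped = False
for attempt, guess in enumerate((f"{i:04d}" for i in range(10_000)), start=1):
    if derive(guess) == target:
        break
    if attempt >= MAX_ATTEMPTS:
        wiped = True  # brute force defeated
        break
print("wiped" if wiped else "cracked")  # prints wiped
```

This is also why the FBI's request targets the software enforcing the limits rather than the encryption itself: remove the wipe and the delays, and the passcode space is small enough to search exhaustively.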

  1. Apple wins the case, and the government is kept at bay. This scenario preserves the status quo and sets a precedent that the United States government cannot conscript companies to produce hacking or surveillance tools meant to compromise their own devices. Privacy is not just essential for living in a free society; it is vital for self-development. Nor is privacy the only thing at stake: there is also the increased cost of having to create such sophisticated technology, along with the products and work-hours diverted to compromising that security. Imagine an entire department at Apple dedicated to building spyware for the government. Imagine the barrier to entry that would create for start-ups in the telecommunications industry. Also imagine how lowered security on such devices would expose so many people to cybercrime, thereby increasing social cost. PricewaterhouseCoopers found that in 2014, there were 42.8 million cyber attacks globally, and McAfee estimated that cyber crime cost $400 billion that same year. Ironically enough, a secret document from the U.S. National Intelligence Council admits that encryption is the best tool against cyber attacks. It would be interesting to see what a cost-benefit analysis on the issue would look like (the issue is too recent for such information to be readily available), but at first glance, having Apple lose doesn't look good. As someone who loves civil liberties and does not like government-induced price hikes, I would find the preservation of privacy a desirable outcome. 
  2. The FBI wins the case, and civil liberties aren't eroded because the government's scope really is limited to this particular case. Some argue that Cook's claim about creating a backdoor is tenuous, and that it is technically feasible to create an iOS update for one specific phone without it being applicable to other iPhones. If Apple can create the software without handing the FBI a key that could be used on other devices, this could end up preserving both civil liberties and national security. 
  3. The FBI wins this case, and it's the beginning of the end of civil liberties. Requiring backdoors could very well open a Pandora's box for cybercriminals, industrial spies, and intelligence agencies to conduct all sorts of surveillance and steal secrets. Even if the technology were only applicable to Farook's phone, why would it stop there? This is a precedent that could affect the future of smartphones, computers, and other digital devices, which would come with an implicit warning label of "Sorry, we might be forced to hack you." The Senate Intelligence Committee Chair is already drafting legislation to compel technology companies to weaken encryption in order to make it easier for governments to access devices. And this only considers what would happen on a national level. This could create precedent for other countries, which would be even worse if such software were used by authoritarian regimes that have little to no respect for civil liberties. There are countries where speaking out against the government can lead to imprisonment or even death. Encryption protects the correspondence of dissidents, and having Apple create a key could mean that, through legal channels or illicit ones, authoritarian governments get their hands on this technology and make the lives of millions even worse. 
  4. It doesn't matter what the outcome of the case is because it's already the beginning of the end. We could very well find ourselves in a scenario in which the government is already on the path toward using Internet-connected sensors, cameras, and other devices for surveillance purposes, as a 2015 Harvard University report shows. Technological development could inevitably render privacy a relic of the past, regardless of what the Supreme Court has to say. At the very least, the government can already use technology to partake in a level of fact-finding and investigation that was not available in years past. 
So which is it? Can the government actually find a balance between national security and privacy? Will this case lead to a slippery slope in government surveillance? Does it ultimately not matter because it is only a matter of time before the government has the technological capability to spy on us all, or will companies like Apple be able to harness the power of technology to protect the privacy of the people? I don't have the clairvoyance to tell you with 100 percent certainty what the future holds. I can make the educated guess, based on the government's history of using and abusing surveillance, its general trend toward becoming ever larger, and its demand for such technology (e.g., the District Attorney of Manhattan said this past summer that over a six-month period, 74 iPhones were inaccessible to his office), that the government would not stop at Farook's phone. If the government knows it can coerce one of the largest companies in the world to create software that undermines the security of its own products, it will become more reliant on that method in the future. It would also create a conflict of interest for telecommunications companies, which would have to work hard on securing their devices while maintaining a separate division to undermine that security.

Ultimately, I think this will most probably end up being one of the biggest cases in American legal history involving civil liberties. Either we live in a society in which companies can provide secure products to consumers, or we live in one in which the government can force companies to violate the security of those products, granting it that eerie, Orwellian capability to use technology to spy on its people "for the greater good." This could very easily undermine trust in many devices on a global scale, which matters all the more given our increased reliance on digitized data. Whatever the outcome of the case ends up being, I hope that civil liberties prevail and that trust in digital devices can be preserved.
