Friday, 1 April 2016
FBI v. Apple is really Smarts v. Democracy
What to make of the latest developments in FBI v. Apple, usually described as an epic battle between your security and your privacy? As I pointed out a few weeks back, this frame -- security v. privacy -- begs the question. (Oh, sorry, forgot, this is a blog, not an Oxonian debate.) Security v. Privacy is a false choice, set up to make you think it has to be one or the other, that you can have only one or the other.
To refresh your memory: the FBI convinced a lower court that the only way it could access secrets -- possibly vital to national security -- stored on the San Bernardino terrorist's iPhone was for Apple to write a software patch around its own encryption. So the court ordered Apple to create that patch. But this week, the FBI announced it had got the job done without Apple, so the court order was "vacated". This must have been an unhappy moment for Apple. As you may also recall, Apple's CEO, Tim Cook, had positioned himself as the Privacy Super Hero, fighting for his customers' privacy rights, safely preserved by Apple's spectacular encryption, to the last $1500-per-hour legal bill. Cook had insisted that Apple must resist the court's order because it presented a constitutional issue as well as a privacy threat. He argued that no court can force Apple to write new software, because software is a form of speech, and the First Amendment protects Americans -- and American corporations (thanks to the now dead Antonin Scalia) -- from State-forced speech. Now no court of appeals will get to weigh in on either issue.
The presentation of Apple as the defender of privacy was odd, since Apple has not only failed to protect customer privacy in the past but has cooperated secretly with the State, giving the NSA access to its servers, as revealed by Edward Snowden's purloined documents three years ago. The FBI also seemed to be using this case as a kind of unstated excuse for its failure to protect Americans from domestic terrorists -- such as the San Bernardino husband-and-wife team. Strangely, no one I read questioned the FBI's or Apple's assertions that current iPhone encryption is a thing of unbreakable beauty -- not even when the FBI announced that an unnamed 'third party' had taught them how to hack it.
Third party. Hmm.
A rumor immediately circulated (started by whom?) that the third party is the Israelis, though which Israelis -- spooks, a business, random hackers -- was not specified. Another rumor (started by whom?) said the third party was someone who'd formerly worked at Apple. Again oddly, no one I read suggested that the most likely identity of the third party is another government agency, such as NSA or one of its Five Eyes partners.
The Electronic Frontier Foundation, a privacy activist group with Libertarian connections, was quick to decry the fact that the FBI did not immediately hand over to Apple the method used to break into the phone. There is apparently an inter-agency process on the books in the US -- the Vulnerabilities Equities Process -- under which government agencies that find vulnerabilities in commercial systems are supposed to disclose them to the companies involved. The Electronic Frontier Foundation complained that most operations under this process remain classified, so it's hard to know what has been patched and what hasn't.
Which brings me to the word patch. To patch is to temporarily fix something that is broken, or torn, by overlaying some other material. To patch is not to rebuild, it is to stick one's finger in the dyke until a better day. A software patch is the code equivalent of the mechanical workarounds developed by engineers to deal with complex systems as their parts break down with age and use. Workarounds can be more dangerous than the flaws they evade.
I learned this several years ago when interviewing the former head of the Nuclear Division of the former Ontario Hydro. More than 70% of Ontario's electricity is generated by nuclear plants, all then owned by Ontario Hydro. Two major nuclear facilities are in the Greater Toronto Area, which is home to about 5 million people. We sat and talked in his comfortable family room in one of Toronto's western suburbs, at the opposite end of the City from the most troubled nuclear facility serving it, the Pickering plant. The Pickering facility is legendary for its breakdowns, labor issues, and cost overruns. Serious radioactive leaks at Pickering have spawned various investigations and public hearings. Everything about Pickering is cost plus. Everything about Pickering involves workarounds.
This man explained that when the nuclear facility at Chernobyl, in what was then the Soviet Union, blew in 1986 and made a whole region uninhabitable for humans, he went over to have a look. He was appalled by what he saw and couldn't get it out of his mind. He came home determined to make sure Ontario's systems were safe. He soon realized that the Pickering plant's myriad workarounds had not all been recorded on blueprints. When workarounds are not recorded and things go wrong, the engineers trying to deal with the problem will either lose a lot of time hunting for an important valve or widget, or make serious mistakes, because the blueprints basically tell them lies. The man told me that until he got those workarounds recorded, he had difficulty sleeping at night.
Now it is obvious that smart phones are not as dangerous to your personal security as nuclear plants. No one says that nuclear power can ever be made 100% risk free; every nuclear engineer will tell you that without hesitation. But smart phones, along with the plethora of other smart devices on offer, are extremely dangerous to your privacy. So why do companies like Apple insist that they can protect our secrets by encryption when a third party can waltz in and make that encryption useless? Why do so many corporations selling smart devices insist that they can and do protect our secrets? If you have a smart furnace, thermostat, fridge, stove, car, television, or computer -- any machine that can be directed by means of internet-carried signals -- it can and may be hacked. We have to learn to think of these devices as strangers in our midst, recording everything we say and do. We have to understand that privacy cannot be assured by anyone.
Which is why I say that, thanks to smart systems, the age of privacy, which arose coincidentally with democratic institutions and individual rights, is over. Once upon a time, the State could not read your mail or listen to your phone conversations without a warrant issued by a court. But nowadays, the State you live in, or a State that is an ally of the State you live in, can capture, store, and read whatever you have texted, shared on Facebook, tweeted, said on your phone, or said to the love of your life while eating popcorn in front of your smart television. Never mind States: anyone with the right equipment can watch, listen, and learn. If these devices continue their quick march into the centers of our lives, we will eventually find ourselves living the way Kings and Queens did for centuries -- everything in the open, nothing hidden, not the bedchamber, not the bath.
So here's my question: if privacy is going, going, gone, what does that mean for democracy?