Edward Snowden: Freedom and Democracy Are Threatened Not Only by Foreign Rivals but Also by Our Own Government
September 19, 2016
Alan Rusbridger / Financial Times
Edward Snowden reflects on the consequences of his whistleblowing: his new life as a fugitive, the "reforms" that were promised following his disclosures of government spying, and his views on Oliver Stone's new film. "What's powerful about it," Snowden says, "is that it compresses years of reporting into two hours so it reaches a whole new audience who aren't wonks. The film can be a vehicle for democratizing information, and that's a beautiful irony for a film about an act that was intended to do the same thing."
(September 17, 2016) -- In September 2016, Alan Rusbridger interviewed whistleblower Edward Snowden over Lunch with the FT in a hotel room in Moscow. This transcript of their discussion has been edited and condensed.
Alan Rusbridger: Have you seen the Oliver Stone film?
Edward Snowden: I've seen a copy of the film that was close to final, if not final. What's powerful about it is that it compresses years of reporting into two hours so it reaches a whole new audience who aren't wonks or, you know, people who aggressively follow policy. And what this means is that the film can be a vehicle for democratizing information, and that's a beautiful irony for a film about an act that was intended to do the same thing.
AR: Tell me about your involvement in it. You met Stone how many times?
ES: He came to Russia a few times. We'd meet in an office and he would tell me what he was thinking and ask me some questions about my life. And I talked with the screenwriter generally to say, 'No, these things are completely impossible', and try to keep the film a little bit closer to reality. It's a drama, not a documentary, but I think people will appreciate that there are no car chases.
AR: If you had to give it a reality score out of 10, what would you give it?
ES: On the policy questions, which I think are the most important thing for the public understanding, it's as close to real as you can get in a film . . . But there are little things about my personal life, for example, throughout the whole film I'm wearing glasses, whereas in real life I would normally only wear glasses for interviews. Differences like that are helpful because they mean that I get to keep a little piece of myself for me.
AR: So that's a tiny little detail that is not right?
ES: Yeah. I mean, when you look at the film, it's a film about reality. Any differences are primarily stylistic, for the purpose of narrative. For example, if you're not making a documentary and you have a limited running time because it is a film, you can't include every person who ever influenced me in the course of my career.
And of course I haven't written my own story so these details aren't on public record: no-one has them. But you can create a composite of people that mattered to me, sort of the broad strokes of my history and my background and that's what I think he strove to do.
AR: Was it weird seeing yourself on screen played by an actor?
ES: I don't think anybody gets excited about the idea of a film being made about themselves unless they have a personality disorder. It is one of the most terrifying things that I can imagine . . . Because the thing is, no matter how much a film strives to represent a person, it will never quite be that person. It's a simulacrum, right? And, speaking as a privacy advocate, privacy isn't really about protecting secrets; privacy is about protecting people, it's about protecting identities, it's about protecting that sense of self, that piece of individuality.
There would be something sinister if everything about you could be taken and perfectly replicated in two hours in a film. I don't think it's possible, I think people are more than that, I think we're more than a collection of details in the fact that we're always changing. In short, when I was told that there was going to be a film made about me, whether I was in on it or not, it was a scary thing. Now, looking back on it, I hope it helps and I'm cautiously optimistic that it will.
AR: How do you think it will help? Because people have a better sense of you and where you came from and your journey that led to what you did?
ES: I'm not really important as a person in this story. I was only the mechanism of this revelation of information, returning information to the public that always should have been public anyway. And, yes, there's something interesting in that from the human element, but the issues that matter here are not about me but about us all, and that's where I think the film can help.
If they want to use my person as a vehicle to make a film about the issues that people care about, I'm OK with that, as long as people think about the substance, as long as people are confronting the challenge of how our world is changing and the fact that, right now, a lot of these authorities and policies have become deeply entrenched, and politicians are very much afraid to, shall we say, resist or challenge or check or even question the claims of our spies in today's political environment.
We the public are at one of the last points that we will have to make a difference in how normalised the culture of mass surveillance becomes. And so anything that gets people talking about it, any little thing that gets people thinking about it that otherwise might not, I think is a net positive.
AR: Joseph Gordon-Levitt plays you in the film. What do you think about him?
ES: He's an amazing guy.
AR: Did he come to Moscow to see you?
ES: He did. We met. We had lunch together and talked for several hours, just about everything, you know, sort of our personal lives, what we think about, what we care about, basic things like that. At the time, I thought it was just a social visit but, after the fact, he told me that he was actually sort of scoping me out, trying to get my mannerisms and things like that.
But I'm impressed with his performance. His characterisation of me, you know, makes me uncomfortable, with the super deep gravelly voice, but that's because you never hear your own voice the way other people do, right? You hear that sort of modulated by your skull. My family, people close to me, who have seen the film, say that he nailed my voice, so we'll see.
AR: Were you an admirer of Stone before this film? Are you a movie buff?
ES: I'm actually not that much of a movie buff but some of his early films I really enjoyed, particularly Platoon, which I think was based on his personal experience.
AR: Did you find Snowden the film moving?
ES: I think as someone who is the subject of a film, watching that film, you can't help but look at it from a more technical perspective and as a, gosh, this one's hard to talk about . . . There's always going to be something emotional about seeing something that you did retold as a story by other people. It shows a reflection of how your choices matter to them and what it meant to them. And I couldn't be more grateful.
And to have been so wrong about what I said in that first interview in the Guardian, which was my greatest fear, that no one would care and nothing would change, and then coming out three years later and seeing that, you know, what we thought was going to be a five-day story is still being reported on, and that I wasn't crazy.
This is a real controversy, this is something that people do care about and should have been told about. And, I don't know, seeing that as a film reinforces that sense of relief, I guess, because there's nothing scarier than taking a decision like that and not knowing what will happen next.
AR: And what do you want to say about the New York Times story and the involvement of your Russian lawyer?
ES: I think it's kind of amazing to see the difference in perspective between people who work in journalism, who are reporting that story, and then people who work in other segments, like Hollywood, or the law, or anything like that. I guess I never realized that what people never see about films is all the knife-fighting that happens in the alley that eventually leads to the final product.
AR: You didn't earn a penny out of the film?
ES: It's public record that I haven't been paid for it.
AR: Let's move on to what you were just touching on. It's now, what, three years since the revelations?
ES: It's been more than three years. June 2013.
AR: Tell me first how the world has changed since then. What's changed as a result of what you did, from your perspective? Not from your personal life, but the story you revealed.
ES: The main thing is that our culture has changed, right? There are many different ways of approaching this. One is we look at the structural changes, we look at the policy changes, we look at the fact that the day the Guardian published the story, for example, the entire establishment leaped out of their chairs and basically said 'This is untrue, it's not right, there's nothing to see here'.
You know, 'Nobody's listening to your phone calls', as the president said very early on. Do you remember? I think he sort of spoke with the voice of the establishment in all of these different countries here, saying, 'I think we've drawn the right balance'.
Then you move on to later in the same year, when the very first court verdicts began to come forward and they found that these programs were 'unlawful, likely unconstitutional' -- that's a direct quote -- and 'Orwellian in their scope' -- again a quote. And this trend continued in many different courts. The government realized that these programs could not be legally sustained and would have to be amended if it was to keep any of these powers at all.
And to avoid a precedent that they would consider damaging -- which is that the Supreme Court basically locks the power of mass surveillance away from them forever -- they made a pretty substantial pivot, whereby in January of 2014 the president of the US said that, well, of course he could never condone what I did, but he believes that this has made us stronger as a nation and that he was going to be recommending changes to the law to Congress. And then later -- again, this is Congress, they don't do anything quickly -- they actually did amend the law.
Now, they would not likely have made these changes to law on their own without the involvement of the Courts. But these are all three branches of government in the US completely changing their position. In March of 2013, the Supreme Court flushed the case, right, saying that this is a state secret, we can't talk about it and you can't prove that you were spied on.
Then, suddenly, when everyone can prove that they had been spied on, we see that the law changed. So that's sort of the policy side of looking at that. And people can look at the substance there and say, 'This is significant'.
Even though it didn't solve the problem, it's a start and, more importantly, it empowers people, it empowers the public; it shows that, for the first time in four years, we can actually start to impose more oversight on intelligence agencies, on spies, rather than giving them a free pass to do whatever, simply because we're scared, which is understandable but clearly not ethical.
Then there's the other way of looking at it, which is in terms of public awareness. It seems that from time to time in history we become a little too comfortable with the claims of officials and with the claims from people in positions of power. Because we all want to believe that the most privileged in society are also the most ethical, and that we can trust them to represent our interests and that we can trust them to do the right thing.
But, unfortunately, every government is comprised of people, and people are imperfect. The only way to get the best, to get the most ethical behavior from people, from government, is to hold them to the highest standard of accountability. Privacy is for the powerless, not for the powerful, and that was what we did not know had been lost.
The rise of the state secrets privilege in governments around the world had created an imbalance of public power that continues today, such that we are reliant on journalists and journalism in newspapers to actually fight this battle for us.
They have to be in contest with institutions, be they government or corporate or, you know, anything else, to find the truth, find the facts that matter, make an independent judgment about what the public interest is, and to convey them to us because, without that, if we just, as the public, allow things to happen, if journalists simply report the claims of both sides without testing them, without challenging them, we very rapidly reach this paradigm, in which we were living, where the government knows everything about us.
They know where we went, they know when we woke up, they know how we travelled, they know why we travelled, they know what we were doing there, they know how long we were there, they know everyone we love, for how long we've loved them, they know our hopes, they know our dreams, and they can know more if they want. They know more about us and ourselves than we do.
At the same time, we're not permitted to know even the barest details of their programs and policies, of their prerogatives and interests. And this, in a very fundamental way, is a corruption of democracy, because the bedrock principle of democracy is that the government operates based on the consent of the governed, but consent is only meaningful if it is informed. And that's what we lost. We have more knowledge now than we did then. We have curtailed some programs, some powers that were abused. But there are more. There's so much that we don't know.
And I think the lesson of 2013 is not about surveillance but democracy. It's that if we, the public, are going to actually have a government that serves us rather than a government that we are subject to rather than partner to, it's a process of constant gardening and we have to be active participants. We have to be adversarial . . . This is not to say that the government is the enemy.
Government's not the enemy, we don't have to burn them down, we don't need to be anarchists to be able to enjoy peace and prosperity and success. But we do need to recognize that there are some principles, some values that must be defended, not just against adversaries and foreign rivals but against our own governments, because the threat to rights is not enemies but power.
AR: Remind me: what are the programs that have been curtailed?
ES: We'd need a couple of hours but, in broad strokes, we learned about the existence of a global system of mass surveillance carried out most centrally by a group called the Five Eyes Network. This would be the US, the UK, Australia, New Zealand and Canada. These governments act in concert to create a giant bucket in which the world's communications are poured after they've been intercepted.
This bucket can then be drunk from, or used for many different purposes, without the involvement of any court -- or at least, to make this more defensible: they do have secret courts and things like that, although they don't use them except in certain cases; you don't really have to use the bucket, you can use something else.
But the whole idea is that once the communications have been collected, and they are being collected without any suspicion of individual wrongdoing, they are being collected simply because they happened and they could potentially be interesting, eventually these communications stop being a bucket and they start being an ocean. And this ocean can be trawled through for anything, in the dark, without the public's knowledge, without any adversary or legal process and without, in many cases, the full protection of our laws and our rights that we would expect in a traditional criminal investigation.
Now this ocean of data can be analogized to the program called XKeyscore, and this ocean consists of all of the communications that are sloshing around the Internet at any given time, crossing Internet service providers such as British Telecom, AT&T, Verizon. And there are other lakes and rivers beyond that that are entirely corporate -- these are private stores of communications and data held by companies like Google, Facebook, Microsoft, Apple, Yahoo.
And these companies had secretly gone beyond what was required by the law to cooperate in a program, called by the NSA 'Prism' -- a name which the companies themselves were not aware of -- that allowed groups such as the FBI in the US to access, without any individual warrant, the communications of anyone who was not American and who they claimed was a target of an investigation under a certain set of criteria. And that set of criteria expands over time.
That set of criteria is not set forth in the law but rather in secret by the Attorney General. It has since been made public as a result of these disclosures but it could be changed at any time without our knowledge.
AR: Let's take a country like Britain, where somebody who's followed this story reasonably closely might say, 'Okay, so Snowden did all this stuff in 2013 and the end result that we've ended up with is worse . . . Not only has nothing changed, it's actually got worse.'
ES: I think the way you have to look at it is in terms of what I set out to do which was not to tell the world how to structure their laws. It's not up to any individual to say, 'This is right and that is wrong'. It was never my intention to set policy but, if you believe in democracy, you believe the people should at least have a voice in the process, and we do.
Laws have gotten worse in some countries, some jurisdictions -- France has gone very far, Russia has gone very far -- in ways that are completely unnecessary, costly and corrosive to individual and collective rights.
We, of course, see the same campaign occurring quite aggressively in Britain, primarily led by Theresa May, who appears to have no regard whatsoever for the rights of individuals. There is, without a doubt, an authoritarian trend in the direction of many nations' national security policies.
And it's important to understand that this is not unique to any one country, you know, we could talk about the atmosphere that's leading to these things in the UK, we could talk about the way that national security officials exploited terrorist attacks to gain authorities, as they always do in, for example, France.
But again, we see this happening in countries that have long had sort of the leash to carry out these policies, but they did them in the dark before; now they're doing them in public. Of course, I'm thinking of countries like China, like Russia.
Now what this leads me to believe, or I think what this sort of outlines, is that we're in a global moment where people are less willing to defend individual rights because they are afraid. We live in a world where, despite the fact that by any measure you choose our lives are more privileged and comfortable and safe than they have ever been before, we are more afraid than ever.
Whenever any bad thing happens, whenever there's an attack, whenever there's an atrocity on the other side of the world, it is in the living room of every home in every country by the end of the day. This is the result of that.
AR: So in terms of timing, almost the moment you did this the problem of Isis terrorism increased. I guess a lot of people who might temperamentally have been on your side might have said, 'I was sympathetic to what he was doing but now actually my mind's changed because it's obvious we need these powers'.
You'll see Independent Reviewer of Terrorism Legislation David Anderson in the UK has come out and said there's no alternative to the bulk collection of data.
ES: Right. I haven't actually read that report. Were there specific cases that he lays out?
AR: He said there were many cases he was told about where this had stopped terrorist atrocities.
ES: Let's start with the beginning and I'll use the US context because that's the one I'm familiar with. The White House did a full review of these programs. They formed two independent commissions, the President's Review Group on Information and Communications Technology or Intelligence and Communications Technology, I think, and the Privacy and Civil Liberties Oversight Board. And both of them reviewed these programs and found that mass surveillance, at least in the telephonic context, had never stopped a single terrorist attack in the US.
Moreover, they had found that it had never made a direct or a concrete difference in a single terrorism investigation in the US, and that's a very low bar. Now the real question here in terms of this British context is how they're defining mass surveillance, which they don't. They don't use the term mass surveillance, they use the term bulk collection, which is in many cases a euphemism.
But as soon as governments begin sort of redefining language and inventing different terms that should be a cause for concern because we have to wonder what, specifically, they are describing. Now, bulk collection is a form of mass surveillance but the argument here isn't that mass surveillance can never have value.
Of course it can, particularly if you're looking to create examples to justify its use and you've got three years in which to write a report doing so. It's the sort of issue where, if all you have is a hammer, everything looks like a nail.
If you realise your mass surveillance powers are under threat and you've had since 2013 -- three years -- to go around and basically create justifications for mass surveillance, I wouldn't be surprised if they could shoehorn cases in there and say, 'This was valuable'. I also would not be completely surprised if some of them were legitimate cases. Let's say they have 30 cases that they're listing; three of them are probably truly convincing.
Some would argue that, well, one case is all that matters, right? If you can save one life, if you can stop one attack, doesn't that mean mass surveillance is worth it? But that's not the way that we evaluate any decision about rights . . . Is torture justifiable if it's effective? Donald Trump would argue that yes, it is, but every court in the world argues that it's not. With whom should we agree?
Similarly, mass surveillance is a clear violation of [our human] rights. Courts from the United States to the European Court of Human Rights to the European Court of Justice have all found mass surveillance has violated rights, which is why the safe harbor framework was shut down. In that context, the question is not, 'Can mass surveillance ever be effective'? It is, rather, 'Is mass surveillance something that we as a society want to engage in'?
If it is, is it worth the costs practically?
And finally: is there no alternative means? Now, everyone in the world who has studied this issue has found that it's not necessary, aside from perhaps David Anderson. And this is because we have targeted means of surveillance. Just a few days ago, there was an exploit for the iPhone that was publicly revealed that had been used specifically to break into the phones of targeted journalists and human rights activists. Did you hear about this case?
AR: Yes. I've updated my software since reading about it!
ES: It's terrifying. But this is one random US-owned company based in Israel that is able to achieve this. If a random company in Israel can break into any iPhone in the world, what makes you think GCHQ or the NSA can't? Of course they can. In many cases this is what we pay them for.
The choice that we have before us is not to decide whether mass surveillance can ever be valuable but whether we need to discard the privacy rights of every innocent in a country because there are a few rare edge cases where criminals exist amongst them.
We don't do this in any other context. We don't allow police to enter and search any home. We don't typically reorder the operation of a free society for the convenience of the police because that is the definition of a police state.
And yet, some spies and officials are trying to persuade us that we should. I would argue there's no real question that police in a police state would be more effective than those in a free and liberal society where the police operate under tighter constraints. But which one would you rather live in?
Now, there are criminals out there, there are terrorists who we should track, who we should monitor, who we should surveil using the full scope of capabilities that are available to us -- but mass surveillance is not necessary to do that. Again, if we simply target an individual terrorist, we can get around any form of encryption by targeting the devices they use to communicate rather than intercepting the communications of everyone in the country. . .
AR: What about the argument that somebody does something bad in France -- goes and shoots a lot of people, lets off a bomb -- and at that point you have a limited period in which you want to discover what led up to it, and so you want to go back in time, and the only way you can go back in time is by having collected the data.
So you want to find out where they've been, who they were talking to, who their associates are and, regrettably, that means you have to have that data store available to the government.
ES: I think it helps to contextualise what this argument actually is about. This argument proposes that someone at some point could become a criminal, therefore we should watch and monitor everyone, store the records and activities of everyone, so that when a crime occurs, we can go into our surveillance time machine and scrutinise every point of their life. Perhaps we want to make that decision but we should be conscious of what it is that we are agreeing to.
It would be a world in which we are completely incapable of resisting government, should it choose to interfere with our resistance -- because, of course, how do you coordinate, how do you organise, how do you protest, when the government has every detail about the life of everyone who could be, or is, or perhaps might one day be in the opposition?
It's a fundamental disempowerment of the citizenry relative to the state but some people may agree and say, you know, 'Well, why not? Maybe we should do that, maybe that's the kind of world where things are going to'.
If that is the case, surely we should have the strongest safeguards that we are capable of designing in order to control those processes. And, in this case, the barest threshold would be, alright, name an individual suspect who you think is in this bucket, the people that you suspect are responsible for this attack and ask a judge to authorize that. Yet, even in the UK, where these reports are coming out, the government is resisting this, saying they don't want judges to evaluate the judicial merits of every individual claim.
Instead they would rather have them, you know, do it in these different mechanisms which you are more familiar with than I, I suspect. But it's also important to understand that these kind of emergency powers already exist, they are already in use in advance of legislation, in advance of public debate, in advance of legal authorization.
Is that the way it should be? Again, I'm not here to decide where policy should come down. My sole belief here is that we should at least have a hand in determining where that comes out. More generally, in a lot of those cases, and correct me if I'm wrong, do you know an individual case where that actually worked?
AR: The claim has been it's been invaluable in a case where, say, three people are involved, two have blown themselves up, one goes on the run and you can go back into the time machine and you can say, 'Well look he's got these 10 friends, we can now go back and find all those 10 friends'. I'm sure they would say, 'We catch the third one on the run because we are able to make use of information that was there to be filtered through'.
ES: It's an interesting point; it's an interesting question. In all the recent terrorist attacks, the suspects were already known to the authorities, so they already had the ability to specifically target these people and collect their communications selectively rather than everyone's.
I think the argument here is that, of course, if we go looking for a hypothetical situation in which mass surveillance could be effective, we'll find them; if we look around in reality long enough, we will find them. But, again, we need to balance the threat to the way our societies operate, the rights of individuals that have existed for generations, with the danger that's actually being faced.
Terrorism is truly frightening and it is a real threat but it's also a . . . small threat. We spend far more combating terrorism than we do combating heart disease or automobile accidents or any of these other things that claim many more lives, even suicide. Despite that, despite the fact that terrorism is a greater concern now than it has ever been in our sort of modern or, you know, last hundred-year history, it claims fewer lives in western Europe now than it did during the 1960s and 1970s and 1980s.
You know, the terrorist attacks during the period of the IRA were far more impactful and dangerous than jihadist terrorism is today. So why is it that measures that never could have passed back then are suggested so casually today?
For me, institutions and the press share some blame here, because they are amplifying fairly rare threats -- true threats, but rare threats -- in a manner that is causing more terror than the acts themselves.
AR: Last question on this. Do you ever have moments of doubt when you think, 'I may have made the security services' job against Isis harder?'
ES: No, because they already knew. Again, Osama bin Laden stopped using a cell phone in 1998, and it wasn't because of a story that came out in the newspaper, it was because Bill Clinton authorised a [strike] against the training camp from which he had made a phone call the day prior.
There is an aggressive form of Darwinism that takes place in terrorist circles, and long before we, the public, come to know about any of these surveillance measures, they have already known for years -- because, if they had not, they would already be dead. Moreover, we see in the statistics, in the capabilities, and in the methods that terrorists are using to communicate today that they have not learned the most significant lessons of how these surveillance programs operate.
If we were talking in the academic sense of breaking out the newspapers and trying to figure out how to counter these systems, they have done a very poor job in doing so because they are using the wrong kind of encrypted applications, they are using ones that are vulnerable to lawful surveillance, they're using methods . . . and what not that we know are not completely effective in avoiding surveillance.
But they do it because they know there's some danger out there but they operate based on that Darwinist method of trade craft of what they've learned from their friends who are still alive. But what keeps you alive in Syria is not the same thing that will keep you off the radar in France or Britain.
But let's presume the opposite, right. Let's presume that there were terrorists teaching literal classes from the front pages of the Guardian. Would that mean it was wrong to publish the story? Well the question is does this have more value to the public or does it have more value to terrorists, and who are there more of, more terrorists or more public? Whose interests should we be representing? Who should we be working for? And who gets to make the decision of what should be public and what should not?
I never published a single story about surveillance; I never revealed a single document to the public. Instead, I gave them to the press. Now, the press, with access to as many experts as it needs on a major story and with the full weight and resources of their institutions, are the ones in our society who are placed to make those decisions.
Moreover, they're not just placed and prepared to make those decisions about the public interest, to weigh the risk to governmental capabilities against the harm that secrecy's corrosion of democratic processes causes to the public; further, they're actually charged with doing so, at least in the US model of democracy, where we have the First Amendment.
But let's argue the alternative. Let's say that the newspapers had decided this should not be public. Let's say the intelligence services had been able to continue using these programs in secret, would it have stopped any of the terrorist attacks that have occurred in the last three years? There's no public evidence that that's the case.
In fact, there's no classified evidence that that's the case, or else we'd be reading it in the newspapers. There has not been a single example of an attempt to [prevent] terrorism that failed because of public knowledge of these programs. And that's very likely to continue, because things are more complicated than that.
I think in general terms here the question is how much should we be governed by fear rather than by principles? The cost of democracy is uncertainty. If we are going to be a free people, if we are going to be a self-governing people, we have to be comfortable with risk.
We have to recognize that, yes, there are costs to having a free press, and there are costs to having the kind of society in which people who are not senior officials of government can make decisions. Although a top spy might say, 'What you're doing is dangerous', a newspaper can say, 'What you are doing is also dangerous, and it's not your place to decide for us where that balance should fall; that's for the public.'
AR: Can we move on to the current situation, where there's a suggestion that there has been another whistleblower within the NSA?
ES: Oh, you're talking about the Shadow Brokers? So, yeah, what I tweeted out on this is that it's not an internal NSA leaker, there's no evidence to support that, it doesn't look as though it were that, and that's a very, in my perspective, unlikely explanation because of the framing of the disclosure. It was not announced in a way that sounded like a real ideologue who believed in something and believed the public should know about this.
This person did not say they were disclosing these tools so that the vulnerabilities they exploit could be patched and people could be protected. They said 'something, something, we don't like the elites, they rule the world, here's an auction that we're creating to sell these tools', or whatever.
To me, that sounds an awful lot like a pretext. Consider the current dynamics of cyber attribution: any time anybody gets hacked, the conventional wisdom is that it's the Russians (it used to be the Chinese). A lot of US intelligence sources say, off the record, that they suspect the Russians were behind the DNC hack, and there's been a lot of noise coming out of DC saying that they're going to have to take some action, or are likely to take some sort of retaliatory action for it.
And then, suddenly, out of nowhere, a leak appears with a quality of information that we've never seen before. These kinds of tools were things that even I didn't have access to at the NSA, and I had access to an ungodly amount of information.
That, to me, doesn't strike me as a whistleblower; that strikes me as a warning. It's political messaging being carried out through information disclosure. And my suspicion, which I've said before, is that somebody hacked this NSA command-and-control box, which is like a staging server. You can think of it as a kind of relay, they call them redirectors, which means that when you're attacking from here, you don't want to attack this point directly, because it's really easy to trace it back home.
So, instead, you go through a little tunnel, a bunch of different countries, and you pop out the other side. So when you trace it you end up in Africa, and if you trace that one you end up in Ecuador, and if you trace that one you end up in France, Germany, so on and so forth. It's a pain in the ass. But these are out on the open Internet, right, these are Internet connected systems, they are not air gapped at the NSA where nobody can get to the goodies.
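The redirector chain Snowden describes can be sketched in a toy model: each hop only knows its immediate neighbor, which is why a trace ends at the last relay rather than at the true origin. All hostnames below are invented for illustration:

```python
# Toy model of attack traffic routed through redirectors ("relays").
# Each hop only records its immediate predecessor, so a defender
# tracing backwards recovers one hop at a time -- and each hop may
# sit in a different, uncooperative jurisdiction.

chain = ["attacker.example", "hop-ec.example", "hop-fr.example",
         "hop-de.example", "staging.example"]

# Each node knows only who connected to it (its predecessor).
knows_predecessor = {chain[i]: chain[i - 1] for i in range(1, len(chain))}

def trace_back(observed_host, max_hops):
    """Walk the chain backwards from the host the victim observed."""
    path = [observed_host]
    for _ in range(max_hops):
        prev = knows_predecessor.get(path[-1])
        if prev is None:
            break
        path.append(prev)
    return path

# The victim only ever sees the staging server; a one-hop trace
# ends at the last redirector, not at the attacker.
print(trace_back("staging.example", max_hops=1))   # ['staging.example', 'hop-de.example']
print(trace_back("staging.example", max_hops=10))  # full path back to the attacker
```

The point of the sketch is that each additional hop multiplies the number of jurisdictions a real investigator would have to work through.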
So when you're carrying out these aggressive hacking campaigns at the NSA, you've got to move your tools from your tooling server out on to the Internet somewhere so you can use them.
What it looks like is that one of the servers that they moved these tools to had either already been hacked, or was observed after the fact engaging in a hacking campaign. Somebody was a little bit lazy, a little bit forgetful; they forgot to wipe the fingerprints clean after they were done, and these tools were captured.
This is not the first time this has happened to the NSA. They've been hacked before. And we do this to our adversaries all the time: it's routine business. This is actually the kind of thing that I did personally against the Chinese in my last position at the NSA, doing counter computer network exploitation. This is how we attribute attacks.
So, again, the whole question of all of that is, you know, why does it come out? And what I was saying, and again this is pure speculation, I don't know if it's right or wrong, is that this seems to be basically somebody saying, 'Yes, you can show who we hacked but if you do that we will also show who you hacked'. It's kind of an implicit threat. But again there's no real evidence supporting, there's only indicators. I mean, it's just my speculation. But when I look at this it's basically a question of why did someone do this?
I lived through that whistleblower's perspective. I was tortured and struggling with, you know, do I come forward, do I not, what do I say, how do I justify this, how do I explain this, because this is what I believe in?
Compare the statements that I made during that period when I came forward, even while I was still secret, to the journalists; some of those letters have been published since, for example, by Laura Poitras. Or the Panama Papers leaker, who remains anonymous but wrote their own sort of manifesto. Compare those to the statement by these individuals, whoever they may be, and I think the difference is clear.
AR: What do you make of the DNC hack?
ES: I don't really study it but we don't know that it came from one hack. Allegedly they were hacked more than once by a bunch of different actors but the conventional wisdom on it is clear, right, which is that the Russians were interested in whatever they could get off these servers so they hacked them.
This is obviously something that we don't want going on but this is part of the problem of this surveillance free-for-all that we're allowing to occur by refusing to moderate our own behavior. We have set a kind of global precedent that anything is possible and nothing is prohibited.
For example, as far as we know, the only offensive cyber operation that's had physical effects on a nuclear plant was by us, in the Stuxnet case against Iran, jointly with the Israelis. If we're hacking people's nuclear plants, it's very difficult for us to condemn in meaningful concrete terms, right, to the application of sanctions and so on and so forth, people who are doing traditional political espionage.
The fact that the DNC got hacked is neither surprising nor interesting. We're hacking political parties around the world and so is every country. What makes it interesting is that some of the take, some of the things that were taken from this server, were published afterwards. That's quite novel.
AR: It means you think what?
ES: It's for political effect but the question is: what can we do about it?
AR: It's political effect meaning it's somebody who wants to influence the result of the next election?
ES: That's what it seems to be, but we can't say for sure. I would certainly think that would be the case, but without evidence we don't even have a basically confirmed attribution here. Now, from my position as someone who formerly had access to XKeyscore, I think it would be very easy to attribute this hack to whoever it is.
But this creates a problem because, let's say, the NSA has the smoking gun that says the Russians hacked the DNC, and they tell us the Russians hacked the DNC, how can we be sure? It presumes a level of trust that no longer exists and it's for this reason that we need to establish processes for declassifying this kind of information, these kinds of indicators of compromise that were observed by these mass surveillance systems that we build.
They can actually serve to hold our adversaries, rivals, and criminal groups to account for a violation of the law. This is what's currently missing. We have all these extraordinary capabilities that we're investing a lot of money in, right, but we're not actually using them to the full public benefit. We're using them for an intelligence capability which, in theory, is supposed to inform policy makers so they can make decisions. Increasingly, though, policy makers operate in the public domain.
The principle that I'm trying to illustrate here is that it's not enough for policy makers to know something; they must be able to demonstrate that it is true. When we see our adversaries, or presumptive adversaries, engaging in novel forms of offensive action, we need to be agile enough to use our capabilities in novel defensive ways. And I think one of the best things we can do is to publish what we have for those kinds of indicators.
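Publishing indicators of compromise (IOCs) so that third parties can independently verify an attribution amounts to a simple matching exercise. A minimal sketch, in which every IP, hash and domain is an invented example rather than a real indicator:

```python
# Sketch: checking locally observed artifacts against a published set of
# indicators of compromise (IOCs). All values below are invented
# examples (documentation-range IPs, a truncated fake hash).

published_iocs = {
    "ips": {"203.0.113.7", "198.51.100.42"},
    "file_hashes": {"9f86d081884c7d65"},
    "domains": {"update-check.invalid"},
}

observed = [
    {"type": "ips", "value": "203.0.113.7"},
    {"type": "domains", "value": "update-check.invalid"},
    {"type": "ips", "value": "192.0.2.1"},  # benign, no match
]

def match_iocs(events, iocs):
    """Return the events whose value appears in the published IOC set."""
    return [e for e in events if e["value"] in iocs.get(e["type"], set())]

hits = match_iocs(observed, published_iocs)
print(len(hits))  # 2 -- independent parties can reproduce the same matches
```

Because the matching is mechanical, anyone holding the same logs can reproduce the result, which is exactly the trust-building property Snowden is arguing for.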
AR: What are the downsides of doing that?
ES: The theoretical downside, it's sources and methods, right? This is why the intelligence agencies are always against doing anything about what they know. They go, 'We don't work for the public, screw the public, we work for the policy-makers, the policy-makers are our customers'. That's why intelligence agencies exist. And they're right, that's what their sort of mandate is for. But intelligence agencies don't have a finite number of sources and methods that are exhaustible like fossil fuels from underground.
In human intelligence this is true, right, because you've got a fixed number of agents out there; you can always get more, but it takes time, it's difficult and, of course, there are human lives involved. But when you're talking about technical surveillance, signals intelligence, we are constantly inventing new methods; we're constantly gaining new accesses by hacking things, and so on and so forth.
The downside, they would say, is if we publish evidence that we caught the Russians and here's how we know it was the Russians, or whoever was responsible, they would be more careful next time. And that's true.
But here's the fact of mass surveillance, and this is actually the one case in which mass surveillance is effective, the cyber context, though I don't talk about it a lot and I'd rather you didn't play it up here; I'm talking about it just so you will understand it. When you are watching a network of communications, no matter how careful they are, a signal has to get from point A to point B, and that signal has to travel over miles of lines.
This is the . . . problem. It doesn't matter how carefully they encrypted it, if they see at this time of day this line went hot, it doesn't just go from the DNC server to the first level router and then disappear, it's got to get from there all the way back home.
Every signal has a start point and an end point, and when you can see enough of the network, no matter how careful they are, if they have to cross it, it's like somebody trying to sneak down your hallway: if you've got a camera on the hallway, you are going to see them. They can wear a mask, but if only one person walked down the hallway wearing a mask, you know who it is.
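The "line went hot" observation can be illustrated with flow metadata alone: even when the payload is fully encrypted, the timing and volume of traffic on each side of a relay can link an incoming flow to an outgoing one. All hosts, timestamps and byte counts below are invented:

```python
# Sketch: encrypted traffic still leaks flow metadata -- who talked to
# whom, when, and how much. Correlating timing and volume on both sides
# of a relay can pair an ingress flow with its egress flow.

ingress = [("attacker.example", 100.00, 50_000),   # (src, start_time, bytes)
           ("benign.example",   103.00,    800)]
egress  = [("victim.example",   100.25, 50_000),   # (dst, start_time, bytes)
           ("victim.example",   250.00,  1_200)]

def correlate(ingress_flows, egress_flows, max_delay=1.0, size_tolerance=0.05):
    """Pair ingress/egress flows that match closely in time and volume."""
    pairs = []
    for src, t_in, b_in in ingress_flows:
        for dst, t_out, b_out in egress_flows:
            close_in_time = 0 <= t_out - t_in <= max_delay
            similar_size = abs(b_out - b_in) <= size_tolerance * b_in
            if close_in_time and similar_size:
                pairs.append((src, dst))
    return pairs

# Only the 50 KB burst lines up on both sides of the relay.
print(correlate(ingress, egress))  # [('attacker.example', 'victim.example')]
```

Real traffic-correlation systems are far more sophisticated, but the principle is the same: encryption hides the contents of the signal, not the fact of it.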
The whole point is that, in this context, you can make them a little bit more careful by disclosing how you caught them but you can't make them safe. This is the same way that we process evidence in criminal trials. Traditionally we say we intercepted this person's phone call, or whatever.
You know, the police might fight against this, they may not want to give up the fact they had a wiretap but eventually people find out that wire taps exist, eventually people find out the masked raider exists, eventually people find out that you know MI5 is doing hacking and everybody's doing hacking and the FBI is doing hacking, it's all over the place. That doesn't stop it from being effective, it doesn't stop it from being valuable.
In courtrooms in every country around the world today, somewhere there's some criminal facing prosecution who hears themselves on tape when the play button is pressed, despite the fact that we've known about wiretaps for, you know, 100 years, and that will continue to be the case. Not because they don't care, not because they're not careful, but because they have no alternative. If they want to achieve their goal, they must assume certain risks.
That is very much the case in this kind of hacking. And this is the balance of equity that we can start to reshape if we think carefully about do we actually have to keep everything secret or are there things we can disclose? It will make our adversaries more careful but it won't make them safe.
And the threat, here's the other thing too, there's a strategic uncertainty that arises from this, kind of like these, you know, everybody loves these nuclear analogies, the idea that people don't know whether there'll be first use or, you know, whether it will only be second use.
If adversaries who are proposing a campaign to hack this group or that group had to think, 'Well, we could actually be facing not just the DNC's capabilities here, not just the hacking . . . but the company that they hired to investigate the hack; if we get caught, we're going to be facing the NSA, whose capabilities are not known but we do know are incredible, and they could break from prior precedent to expose us specifically, individually, as an example and a deterrent', that changes the risk calculation, and it may make certain operations that would previously have been on offer no longer be authorized. I mean, I can't say for sure, but that's the way that I think about it, having sat in that chair, right?
For example, the Chinese don't really care about getting caught, because there are no consequences for them. They break into everything everywhere; they're not cat burglars. They open the window, climb in, take everything they can and leave.
Sure, they may have more careful teams that handle political operations, but I was covering one of their good teams and this was very much their mode of operation. They just weren't really that concerned. We have to be able to change that, and if we're going to do that, we have to change the status quo of our responses.
AR: The Panama Papers. What do you make of those?
ES: I think that was an extraordinarily important disclosure and I applaud the whistleblower involved in it, I also applaud his skill in remaining unidentified, thus far at least. Increasingly around the world, whether it's in the areas of state power or corporate power or anywhere where great institutions, organisations of scale, establish a foothold, we see an inevitably arising corrupting influence sort of emanating from them.
Now this is not to sort of put malicious intent upon organizations or corporations, that's not the point. I don't think these are mustached villains sitting around a table thinking about how to destroy everything just because there's scale. It's the inequality of influence from which this arises.
Institutions, quite naturally, use everything within their capability to achieve their goals, to protect their interests and pursue their own prerogatives.
Now, when this happens in a static system over time, individuals have very little power relative to institutions, and individual power has difficulty accreting because we have difficulty organizing. Even though we have a lot of power collectively, we're kept quite separate, distracted and distant from one another, not necessarily by intent, although perhaps there's a grand scheme somewhere, but more because of the circumstances of our times and our lives. Institutions begin to become the only thing that really matters within society.
They shape our laws, they shape our future, they shape our economy. They create booms, they also create busts but they don't suffer the consequences of the busts because they are able to shelter themselves using their influence that we, the real victims, certainly lack.
And this is the kind of thing that Panama Papers gets to the heart of, the fact that people tend to think about surveillance and they think all government is the enemy, and government has been a truly bad actor in many cases on the subject of surveillance. But corporations have played a significant role in enabling government surveillance. Beyond the issue of surveillance much of the inequality and unfairness that we deal with today comes from the self-interested behavior of corporations.
Now, you can't blame them for pursuing their own self-interest but you can blame the system that we have in place for failing to create mechanisms by which we can hold them to account. And of course this gets into the basic issue of tax evasion, tax avoidance.
I come from a libertarian background and tax avoidance is something generally to be prized, you're supposed to minimise your tax burden so long as you are not evading it unlawfully. And many of these groups would argue that what they are engaged in is tax-avoidance, they're using the system to its utmost.
But when they have mechanisms available to them that ordinary people do not, suddenly it stops becoming clever self-interest and it starts becoming a kind of pseudo-criminal exploitation of a system which has real victims.
These are groups which are benefitting from infrastructure that has been collectively paid for. Even for those libertarian types who are opposed to confiscatory taxation, this is money that has been taken from individuals not on a voluntary basis but via mechanisms of [taxation], and it has been used to benefit this group without that group bearing the same costs or playing by the same rules.
It is creating a kind of exceptionalisation that's not available and, many would argue, not permitted by the rules at all. But because of the institutional influence, because of the institutional power and, most critically, because it was secret enough that we did not know about it, the victimisation can occur on a grand scale rather than on something more traditional. . .
So the question becomes: how do we ameliorate these rules, or the structure of our system? And the first step is that we have to know that they exist. You know, a doctor has to be able to diagnose what's wrong, but until he can get an X-ray, that's not an option.
What the Panama Papers provided for us was an X-ray of a hidden section of the legal world that we see is taken advantage of by the most powerful members of society, the ones that we should expect the best behavior from. Now unfortunately we're finding that it's often the worst.
We saw, of course, this happening in Russia with widespread corruption, which is perhaps not so much of a surprise although it should always be a great disappointment, but we saw this happening as well in western liberal societies that we believed to be, by and large, free from similar levels of corruption.
Once you pull back the blanket, you begin to see that we have problems in our house, too. The unfortunate thing is when we see that politicians and policy-makers have been implicated in these things, as has happened in the UK and Iceland, rather than having the rest of the policy-makers push for reform, we see unfortunately the first instinct is often to protect the party, protect the politician, and I think that makes our problems all the more stark.
AR: We've talked about how the most secret agency in the world appears to have been hacked from the outside and here's a company whose whole rationale is secrecy and has had forty years of documents filched. Is no CEO safe? Is there no company that can be complacent about their information?
ES: We are living through a crisis in computer security the likes of which we've never seen. We have more systems that are more connected with more vulnerabilities than have existed in the past. This does not mean that it's easy, that it's trivial to hack into an organization that has good practices.
There are things you can do to reduce the risk of becoming the victim of a compromise and exploitation. But until we solve the fundamental problem of this paradigm, which is that our policy incentivizes offense to a greater degree than it incentivizes defense, hacks will continue unpredictably and they will have increasingly larger effects and impacts, particularly given the sprawl of sensors that is occurring in the wake of this push toward the Internet of things, where now, when you buy a car, your car is networked.
This is not an idea that is going to be appetizing to a lot of software companies, but for the vast majority of enterprises in the world we would all benefit from a policy that imposed liability for true negligence in the architecture of the software.
Now what does it mean? Software has bugs, right? And no programmer is going to be able to create completely clean code, perhaps they could if they worked at it, if the incentives were correct. But right now, this story with the hack of Shadow Brokers for example showed that the NSA knew of vulnerabilities in US hardware products.
Now Cisco, they're the world's largest routing company, right? They form the backbone of the Internet. If you run a business or a company of any scale, you probably have their devices sitting in your data centers, the firewalls that the NSA had hacked were likely to be the same kind that you yourself were using.
And yet, despite the fact that NSA knew about these vulnerabilities, and despite the fact that the NSA likely knew that they had been hacked in 2013, they didn't tell the manufacturer about it so they could close the hole that the NSA had been exploiting because the NSA wanted to continue to use this same hole themselves to break into other networks.
This is a critical problem because the richer and more organized your society, the more networked you are, the more reliant upon technology you are.
One of the classic examples is North Korea. We can hack North Korea all day long and it's not going to make a significant difference because they're still using technology from the 1960s, whereas, when they hack us, as they allegedly did in the case of Sony Pictures, they can cause extraordinary economic damages with very few risks.
Now this doesn't have to be state on state; this can be criminal groups, as we've seen quite recently with groups repeatedly hacking the SWIFT banking network from different countries, where they simply break directly into a bank and create a false series of transactions to send themselves the bank's money.
These transactions often can't be reversed, and they don't require any state-level intelligence capabilities, because software is fragile, software is weak; because there's no benefit to vendors right now in building strong software; and because it's too difficult, even for experts, to assess the difference between strong software and software that was created poorly. This is closed-source, proprietary software we're dealing with here, and there are basically two kinds of software: closed source and open source.
"Closed source" means you write a program, you package the program and you give it to someone, but they can't see the actual code, so they can't assess whether it's vulnerable. "Open source" software means you may sell them a license to the software or give it away, but you also publish the code so anybody can inspect it. The reason open source software is not used all over the world, by everyone in every business, is that, in theory, someone could take your code, use it themselves and sell it.
This is illegal, because intellectual property laws and whatnot prohibit it; they would be violating your licensing agreements and everything like that. But because it's technically possible, and a lot of these companies don't want competitors or whatever, they just go, 'Screw it, we're not going to open-source it'. That makes it a little harder to argue against. But from a security perspective, it is disastrous . . . .
But right now, if any one of these proprietary software vendors has a completely negligent fatal flaw in their code, for example a back door which one of the programmers left in because it would be convenient for troubleshooting problems with a customer, they don't have to ask for a password, they can just log themselves in, then anybody else can find that weakness, and they do.
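The "convenience" backdoor described here, a hard-coded maintenance login left in the source, is the kind of negligent flaw that even simple static checks can catch. A toy sketch, in which the vulnerable snippet and the patterns are both invented for illustration:

```python
# Toy static check for one class of negligent flaw: hard-coded
# credentials / maintenance backdoors left in source code.
import re

SUSPICIOUS_PATTERNS = [
    r'password\s*==\s*["\'].+["\']',        # literal password comparison
    r'if\s+user\s*==\s*["\']support["\']',  # hard-coded maintenance account
]

# Invented vulnerable snippet of the kind a careless vendor might ship.
source = '''
def login(user, password):
    if user == "support" and password == "letmein":  # debug backdoor
        return True
    return check_database(user, password)
'''

def find_backdoors(code):
    """Return (line_number, line) pairs matching any suspicious pattern."""
    hits = []
    for lineno, line in enumerate(code.splitlines(), start=1):
        if any(re.search(p, line) for p in SUSPICIOUS_PATTERNS):
            hits.append((lineno, line.strip()))
    return hits

for lineno, line in find_backdoors(source):
    print(lineno, line)  # flags the hard-coded "support"/"letmein" check
```

Real tools (linters, secret scanners, code review) use far richer analyses, but the underlying point stands: this class of flaw is cheap to detect when anyone is incentivized to look.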
It doesn't have to be that blatant; there are subtle weaknesses that can be just as bad, which could be prevented by following best practices or by using what are called memory-safe programming languages. But very few people follow these practices, and very few people use memory-safe programming languages, because there's no commercial incentive for doing so.
But in many other industries, when there aren't commercial incentives for doing something, for example, food safety, we use regulatory powers to accomplish these same goals.
Now again people from my tribe will be extraordinarily mad at me for suggesting regulation in the terms of negligence for software security because once you open that door, people are worried that regulation will only increase, it will become more intrusive and eventually they'll say you can't write certain types of programmes at all. And that's a real threat, it's a real danger and it shouldn't be discounted out of hand.
But the central issue of computer security, the central issue of our hacking crisis today is that until we start requiring people to create more secure software everything you buy that has a power button on it is going to be vulnerable and that's not a world that we can afford to risk.
AR: What can you tell me about your life?
ES: It's getting better. I'm more active, I'm more productive, I'm more open now than I've been since 2013. I very recently presented research at MIT.
AR: On your new magic phone? [Snowden has been reported to be working on developing a case for a phone that will show whether its radios are transmitting.] . . .
ES: I've got a lot I can talk about but my work at the Freedom of the Press Foundation is increasingly about beginning new technology projects for which there is no commercial market. These are efforts that are needed that are valuable but are not commercially viable, they can't really be marketed.
For example, anything that's created to protect journalists, which is the issue that's near and dear to my heart, basically cannot function as a commercial market, because [newspapers] are poor and they are not looking to spend a lot of money nowadays.
Moreover, they can't see the threat clearly enough yet, in many cases outside of the most extreme, to justify those kind of expenditures that would be needed to actually support a market there.
So instead we use charity, non-governmental organizations and volunteer efforts to try to bring together the world's best experts to think about how we can solve some of these problems. Can we create tools that will be useful to journalists, that will protect them, that will keep sources safe and will keep stories that need to be public, public . . . .
So the majority of my time is spent working on efforts for the Freedom of the Press Foundation creating technology that's not commercially viable to protect journalists. I mean that's really the whole of it that we're doing right now. . . .
ES: The other thing is I'm doing more and more speaking at universities around the world.
AR: And how are you sustaining yourself in money terms?
ES: I'm paid for the speaking. I always lived a really humble life so for me that's actually quite comfortable.
. . . All my work's in English. Everybody I talk to, I speak to in English. People think it's a pat answer, but as I've said before, I sleep in Russia but I live all around the world, and it's true. I don't have a lot of ties to Russia, and that's by design because, as crazy as it sounds, I still plan to leave.
AR: And you spend a lot of your time online?
ES: It's my life, but it always has been, so.
AR: I've read, Ben Wizner [Snowden's lawyer] was going to try and get a petition to Obama. What can you say about what your hopes are for going back?
ES: I'm not actually in charge of the campaign. Of course I'm happy to see it and I hope they're successful but, for me, this has never really been about what happens to me. No matter how the outcome shakes out, it's something that I can live with.
AR: You talked about being critical of the Russians. Why do you do that? Some people might say 'Well in his shoes isn't he best to just shut up?'
ES: A lot of people who care about me tell me to shut up. They say 'What are you doing? What are you thinking? This is a mistake.' And they're right but sometimes the only moral decision is to make a mistake and to do it knowingly.
If I were married to my own self-interest, I never would have left Hawaii, and I don't think that should change just because of my circumstances. I'm not responsible for Russia. I can't fix the human rights situation in Russia, and realistically my priority is to fix my own country first, because that's the one to which I owe the greatest loyalty.
But at the same time when you see something, when you know something, when you care about something, and you say nothing, you're doing an injustice. I would never obligate another person to risk their security against their own sort of self-interest but I can make that choice for myself. And even though chances are it will make no difference at all, no-one will notice, no-one will care, maybe I'll be wrong and maybe it'll help. . . .
AR: So you have zero contact with them?
ES: I have zero contact. That's the whole point of having a Russian lawyer is that I don't want to have any contact, right?
AR: Do you go around freely in Moscow?
ES: I walked here, yeah; I don't even have a back pack.
AR: And are you past being noticed now? Do you get glances in the street?
ES: Occasionally I still get recognized without my glasses but it's got to be people who either just watched Citizenfour yesterday or something like that or they are just one of those people who have a talent for recognizing faces. So it's not bad.
AR: What do you miss most about America? I mean, are there sorts of tastes or smells or music . . .?
ES: I mean it's more abstract than that, I think. I mean everybody misses a sense of home, being close to their family, but technology overcomes most of that divide. For me, I'm a little bit of an outlier to begin with because, remember, I signed up to go work overseas for the CIA and overseas for the NSA. I've spent, I mean, I haven't done the math, but at this point I've probably spent the majority of my adult life, at least like real career life, working overseas. So it's really not that much different from the postings that I had for the US. The only difference is that I'm still posted overseas and I work for the US but they don't realize it.
AR: One final matter of detail about the film: the Rubik's Cube. Was that real or not real?
ES: Oliver confirmed in an interview that he gave recently that that's a touch of the dramatic license, but that's only because I wouldn't confirm or deny how it really happened. What I will say is that I gave Rubik's Cubes to everyone in my office, it's true. I really did that.