HIRO MIYAZAKI: Good afternoon. I'm Hiro Miyazaki, Director of the Mario Einaudi Center for International Studies. It is my pleasure to welcome you all to this year's Bartels World Affairs Fellowship Lecture. The Henry E. and Nancy Horton Bartels World Affairs Fellowship is part of the Einaudi Center's flagship programs. It is also one of the most prestigious awards Cornell has to offer.
This fellowship program was founded by Hank and Nancy Bartels in 1984 to foster a deeper and broader understanding of international issues among students and others in our Cornell community. Through this program, we have brought to campus a long list of distinguished figures, such as Archbishop Desmond Tutu and His Holiness the 14th Dalai Lama.
This evening, we have a very special guest who has tackled one of the most challenging global issues we face today: cybersecurity. Christopher Painter, the third Cornellian to be honored as a Bartels fellow-- actually, technically the second, because the third one didn't really graduate from Cornell-- served as the Coordinator for Cyber Issues at the US State Department from 2011 until this summer. Mr. Painter's lecture is also part of an energetic effort at the Einaudi Center to address pressing global issues from multidisciplinary perspectives.
Global issues we face today are all complex due to the interconnectedness of the world, and complex problems, as you know, demand complex solutions outside conventional thinking. And these problems also demand collaboration across disciplines, professions, ideologies, languages, and other barriers of difference.
Cybersecurity is an excellent example of this class of issues. It is a highly technical issue that has deeply social, economic, political, and legal dimensions, and has far-reaching consequences for citizens' lives across the globe. Mr. Painter has devoted himself over the last decade to building multilateral and public-private collaborative frameworks for searching for nuanced solutions to this complex global issue. We are very lucky to have this opportunity to be in dialogue with him.
I'd like to take this opportunity to thank my colleague, [INAUDIBLE] [? Mitchelson, ?] for bringing this complex event together so seamlessly. And I also thank other staff members of the Einaudi Center for coordinating this event. Special big thank you to Professor Fred Schneider and Professor Rebecca Slayton-- [INAUDIBLE]-- for the leadership they have provided for the Einaudi Center's special working group on cybersecurity over the last year or so.
The cybersecurity working group is a joint venture of the Einaudi Center, the law school, and CIS, along with several other colleges. The idea was born almost exactly two years ago when I met with Dean Greg Morrisett, Dean of the Faculty of Computing and Information Science. Thank you so much for coming this evening, and I now invite Dean Morrisett to introduce the 2017 Bartels Fellow, Christopher Painter.
GREG MORRISETT: Good afternoon, and thank you very much, Hiro. It's an honor to introduce our guest today, not only the world's foremost expert on cybersecurity, but also a Cornell graduate from 1980. Yay. Go Big Red. So over the past 25 years, Chris Painter has led the United States' diplomatic efforts to advance cybersecurity, fight cyber crime, encourage multistakeholder internet governance, and promote responsible state behavior and cyber stability.
He was appointed cyber coordinator by then Secretary of State Hillary Clinton in 2011, and continued in this role until his recent resignation, when, if you read the articles that came out, he was branded "the weary warrior" of cyber warfare in the United States. He led many delegations to international cybersecurity meetings and chaired the G8 High-Tech Crime Subgroup from 2002 to 2012.
Prior to working in the State Department, Chris served in the White House as Senior Director for Cyber Policy, and as the Acting Cyber Coordinator at the National Security Council. And previous to that, under President Obama, he was a senior member of the team that conducted the Cyberspace Policy Review and oversaw the International Strategy for Cyberspace, which was intended to synchronize US foreign policy positions on crosscutting cyber issues. Christopher is the recipient of the RSA Conference Award for Excellence in the Field of Public Policy, the Intelligence Community Legal Award, and has been named to the Federal 100 list.
And on a personal note, when I started my career here at Cornell in the mid '90s, I was working on programming languages, and there was a very high-profile case at the time. Kevin Mitnick was prosecuted for various computer crimes. And the prosecutor was Chris Painter. So this is a case that caused me to re-examine, where is cybersecurity going? And to shift my own research focus in that direction. So it's a particular honor and privilege to introduce to you Christopher Painter. Please join me in welcoming him.
CHRISTOPHER PAINTER: Well, it's a great pleasure to be back at Cornell. And I want to thank Cornell and the Center for International Studies for hosting me and honoring me in this way, and the Bartels Foundation. So thank you very much for all those things.
When I was here in the late '70s-- God, it seems like so long ago now. It is so long ago. It doesn't seem that long ago-- and 1980, certainly there was diplomacy. Diplomacy certainly existed, and it was a well-oiled and well-honed sort of discipline.
But computers were not that advanced, and certainly the whole idea of cyber diplomacy didn't exist at all. To give you an example, at Cornell I tried to combine hard science courses and humanities courses. I majored in government and history and minored in biology. I took courses that had answers, like advanced differential equations, but at the same time I took courses that had no answers, some of the humanities courses, to try to balance that out.
But I also took a computer programming class. And back then-- remember this is 1980, 1979-- we programmed things on cards, and we would punch the cards and put them in these batch things. Very different from today, certainly.
Also, my work study job back then-- this is a great work study job, if you can get it-- was in the psych department, where I was playing against a mainframe computer, a PDP-11 made by DEC at the time. I had to play hundreds of games of backgammon against the computer to prove something about game theory. But I think the way I acted was irrational, and they had to throw the data out. I'm not really sure. But it was a good work study job.
Now I'm a recovering diplomat. And as he said, I'm also a recovering lawyer. And the thing about being a lawyer is that, especially in the evening when people have had a long day, you can usually break the ice by telling a lawyer joke. But over my long career, I've determined that the problem with lawyer jokes is that lawyers don't think they're funny, and non-lawyers don't think they're jokes.
So I will dispose of it. I don't know of any diplomat jokes. If you have any, let me know after my talk. I am, or I just recently was, a cyber diplomat. And I'll tell you what that means and what the implications of that are. And it's a relatively new field.
But I also had, as is pointed out, over 27 years of experience dealing with various aspects of this, so I think I have a good perspective on both the threats and also how we've countered them. And not just domestically, but particularly internationally.
And I want to talk tonight about the growing threats that we're seeing in cyberspace and the role of diplomacy in combating those threats. And not just combating the threats, but seizing the many opportunities in cyberspace. People talk about and you see every day the threats on the front page of papers, but you know, combating the threats is only a vehicle to making sure that cyberspace is something that is a platform for economic and social growth.
So I definitely want to talk about that. There's a couple other things I want to cover. So by the end of my lecture, you'll know what this-- zombies-- have to do with this-- diplomats--
--and why I'm so obsessed with this. So let me start there. So my old office at the State Department, you know, offices in the State Department are usually festooned with various diplomatic things, right? Pictures of different places, gifts that you get in diplomatic exchanges, agreements or treaties.
You know, mine was a new office. It was an office dedicated to cyber diplomacy. And I'll describe what that means in a few minutes. So I wanted to make it unique for the State Department. I wanted to have a unique brand. So there's not a lot that really resonated. I could have some pictures of the internet, and I had that. I could have various network maps, and things like that.
But I decided that I was going to decorate the office, the office suite with me and my colleagues, with movie posters where hackers or computers were the main characters. And you know, there were a lot of them. I had over 30 of them, and there are new ones every day.
And these are not all good movies, I have to say. Some of these are good movies, some of these are not-so-good movies. But I had WarGames, of course, and The Matrix and The Net. The Net, not a great movie. Sneakers, The Computer Wore Tennis Shoes, a Disney movie from back in the '60s.
And most of those were dystopian. It's hard to find a really uplifting computer movie. They're almost always dystopian in some way. Somehow, the world has gone to hell because of computers.
And this was actually my favorite, because this is the first movie, Colossus: The Forbin Project, which incidentally, I actually saw this movie-- it came out in 1970-- when I was a kid. I went to that theater in Cherry Hill, New Jersey, and sat through two full showings of this movie, which will tell you something probably not great about my personality. But it certainly foretold where I was going to go with my career.
And the thing about Colossus is it was the first movie where computers took over the world. So the US builds Colossus to control its nuclear arsenal and take the human out of the middle, to have perfect deterrence, because you have this automatic computer determining when you're going to retaliate.
And the Soviets-- it was the Soviets back then-- stole the information and built their own computer, which they called Guardian, and they put it in the [INAUDIBLE]. And the two computers discovered each other, asked to talk to each other, and became self-aware. They exceeded the bounds of their programming. And then they decided that to protect humankind from itself, they'd take away all civil liberties and take over the world.
So it was the first movie where computers took over the world, long before WarGames, Terminator, any of those movies. But it also illustrates a tension in this area, which is the tension between security and how you do security and human rights online, and making sure you can have both. So I'll come back to that in a few minutes.
I wanted to talk a little bit about my background too. And my first computer was this, which was-- this is about the same time the Apple I, I think, Apple II came out. This is an Atari 800 computer. It was very powerful for its day. It had a cassette tape drive. So you actually used, as your storage, a cassette tape drive, and it was incredibly slow. But it was great. It had its first multitasking operating system. It was a great computer. And I was totally hooked on it.
And now of course, today, you have many thousands of times the power of that computer in your pocket. And we carry powerful computers with us everywhere. And compared to back then, when computers were really a novelty, computers are now pervasive in every part of our lives.
Computers and computer networks-- the internet didn't exist back then, or it existed in very nascent form. The World Wide Web certainly didn't exist back then. But now we are all dependent on them for financial transactions, for social transactions, to communicate, really for everything.
And generally, it's been a tremendous force for good. It has been a tremendous force for commerce and economics throughout the world. It's been a tremendous force for social interaction and connecting people around the world.
But it also has been the target of criminals, malicious state actors, terrorists, and others, and that is an issue. And while the internet has also allowed people to organize and to resist repressive regimes-- a good example is the organization during the Arab Spring, for instance-- it also has allowed those same regimes to monitor dissidents and to undermine democratic processes, or seek to, as we've seen more recently.
So that dichotomy where you have this tremendous good but also all these threats means you have to counter the threats and try to preserve the good. And when the internet was built, it wasn't built for security. It wasn't built for easy attribution of people doing bad things online. It was built for global connectivity, and it was built for resilience. It was built to survive attacks, so even if you knocked a couple of the nodes off, you could still have a survivable system where people can communicate.
Vint Cerf, one of the architects who created the internet, talks about this and has said security really wasn't in their job stack as they were trying to figure out how to create this thing. And that makes a lot of sense. Even as a former prosecutor, I don't want an internet that is fully attributable; I think that has profound human rights implications and can lead to violations. But the fact is that if we're not careful, the many threats that we see out there, both policy and technical, could really hinder or undermine this truly transformative technology that's brought us where we are today, and that is increasingly being used in and enabling the developing world.
Not surprisingly, because of the global nature of computer networks, these issues almost always have an international dimension. And so it's been necessary for us to work together not just domestically to deal with these issues, but also to work globally, both to preserve and expand the benefits of computer networks and combat the technical and policy threats. And I've been very fortunate to be on the front lines, both observing but also participating in and helping drive a lot of the policies that we've dealt with, and trying to combat those threats, and also to enable more good things to happen.
My career was hit on just now a little bit, and I'll go into a little more detail, but not a lot. I started-- I didn't start life as a prosecutor, but I started as a prosecutor doing computer-- you know, when you start as a prosecutor in a big office like Los Angeles, you started doing kind of baby cases. Bank robbery cases, counterfeiting cases.
Los Angeles is the bank robbery capital of the world. And these were not always smart criminals. Stupid criminals were always a boon to prosecutors. So I had one case, not a computer case, where a robber walked into a bank with a mask on and robbed it, not surprisingly. But he was not the smartest guy. He worked as a maintenance person in the building next door, and he left his name tag on. So--
--that's the kind of criminal you want. Internet criminals are a little more clever generally. Not always, but a little more clever. And I quickly turned, just in the second year I was there, to start investigating computer crime cases. And not a lot of people did that back then.
So I investigated hacking cases, cases where computers were used for fraud and other schemes. I did two of the first stock manipulation cases on the internet, some of what we call denial of service cases, which I'll talk about in a little bit as well. And just a range of really interesting, first-of-their-kind cases.
And then after a few years doing that, I moved back to the mother ship at Justice in DC and helped run the computer crime section there, which handled both the case efforts and the policy efforts, including some of the international efforts-- as mentioned, the G8, when it was the G8. But a lot of other international efforts too, because these computer crime cases were almost always international as well.
From there, after a while, I went to the FBI as a Deputy Assistant Director of the FBI Cyber Division, which is a whole different angle on this. And then to the National Security Council at the White House when President Obama came in, where I worked on some of the formative documents around cybersecurity and policy, including getting all the interagency players together to work on this, and including the Cyberspace Policy Review, which looked at all the things we'd done and made sure that our policies were really geared to what we needed [INAUDIBLE] and what the holes and the problems were.
And from there, one of the things we recognized is that there was a gap, that we needed to do more to up our international game. So I went to the State Department and created this new office in the State Department that did cyber diplomacy.
What I want to cover tonight is the evolution of the threat from my perspective, both the technical threat and the policy threat. And I'm going to punctuate that both with some stories about things I was involved in, but also particularly some of the cases I was involved in.
And then talk about how we have worked to combat these threats. You know, what some of the tools the government has, but most particularly the international tools, and really, the beginning of a new field of diplomacy in cyberspace that didn't exist before, and why that really is critically important to combating threats. But more importantly, to preserving an open, interoperable, secure and reliable information infrastructure and being able to have all those attributes at once. Not trading security for openness-- there's always some tension between them-- but having all those things together.
So let me start with the threat. You see this on the front pages almost every day. There's a new case, there's a new big story, often sensationalized, about a new intrusion, a new theft, a new potential attack. There's all this talk about cyber warfare all the time. Often that's a little hyped up, but there's always a lot of talk about that. People are worried, and I think people understand this is becoming a bigger issue.
Recently, we had the big Equifax intrusion, for instance, where lots and lots of personal data was taken from probably most of the people in this room, me included. And that was certainly of great concern.
We've had the ransomware cases. One example was the recent WannaCry case, where malicious software-- malware-- was being used to encrypt people's systems, and the attackers wouldn't decrypt them unless you paid a certain amount of ransom. That had a big effect particularly on the national health system in the UK, and that certainly is a problem. It didn't have as much of an effect in the US, because we had some procedures in place and some of the machines were not as vulnerable, but it was certainly a big issue.
It was mentioned, and I mentioned it briefly, the Kevin Mitnick case. That was back in the late '90s-- like, 1995 to '97. Kevin Mitnick was a hacker who was more what we call in this area a social engineer than a particularly skilled technical expert. He had some technical skills, but he was essentially a good con man. He was able to talk people out of things.
So he was able to call and pose as the employer-- call up an employee and say, I'm the systems administrator. Can you give me your password? And they would. Or he would pose as the employee and call the employer and say, I'm on the road. I need access to the system. Can you give it to me? And they would.
And then he'd use that access to expand his access and get greater control, and then he'd use that to take information. And basically did that to many, many companies, dozens of companies, not just domestically but around the world. And so it was an international case. It wasn't just domestic.
In fact, one of the victims was Nokia in Finland. And because he couldn't get the computer data from them, he actually called up, posing as a Nokia employee. And to their credit, they taped him doing this. And when we arrested him and were prosecuting him-- and he ended up pleading guilty-- it was very powerful evidence to have his voice on tape, you know, pretending to be someone. That's helpful. That's good evidence. And we had one of his colleagues on tape doing that as well.
And so we certainly had the computer data, but it was an interesting moment, because back then, people looked at computer hackers as kind of Robin Hoods, you know? He caused a lot of damage. He pled guilty to causing about $5 million of damage.
But to the ordinary citizen, it didn't really mean a lot to them. Their personal information wasn't being stolen. They didn't really understand what was happening. In fact, there was a campaign where a small plane flew over the courthouse that said, Free Kevin, you know? So it was sort of an interesting way of looking at this threat at the time. And it's certainly advanced since then.
Then another case I had was an internet stock manipulation case involving a company called PairGain, which was in Southern California. Someone created a fake Bloomberg web page. And if you hit the links on the side, it would take you to the real Bloomberg web page, so it was a very convincing fake.
And it reported that this company was being bought by an Israeli company. And so the stock went up 30% on NASDAQ within, like, 30 minutes, until it was revealed that this was a fraud. And lots of people lost money, including James Cramer, who is still around, because information is power, and profits are based on timing. And so a lot of people reacted to this, and it turned out to be completely false.
And we investigated it, and we were able to follow the digital footprints. We followed the physical footprints too, but we didn't see any kind of unusual trading. We were able to follow the digital footprints and find the person who was responsible, and it turned out to be someone who worked for the company, whose excuse, when we caught him, was, well, you know, I only say bad things about the company all the time. I just wanted to say something positive for a change. And make money. He got cold feet. He didn't actually trade. But it was the first internet stock manipulation scheme.
And then another case that I was involved in was back in 2000, which was more of a computer attack case. There was a widespread, what they call, distributed denial of service attack. And what this is is compromised computers all over the world-- these slave computers that the bad guys compromise and then use as an army to try to take sites offline. So CNN, buy.com, E-Trade, a number of other companies and big websites around the world were either slowed down or taken offline.
And we investigated that, and it turned out to be a 13-year-old. And this was also international. It turned out to be in Canada, and the effects went beyond that. And he went by the moniker Mafiaboy.
And it turned out that we then worked with the Canadian authorities to try to trace this down, and the Canadians, the RCMP in Canada, did a wiretap, because we didn't know who it was. We just knew what house it was coming from, the computer it was coming from. And we put a wiretap on, and there was a recording that we got of the father threatening to whack one of his colleagues. So it was mafia dad and Mafiaboy, a great family.
But it was an interesting case, because when it first happened, people said, this is a very sophisticated attack; obviously, it had to be a very sophisticated person. And it turned out to be a kid using tools that he borrowed from someone else, and it had a really disproportionate effect. It demonstrates the asymmetric nature of the technical threat.
The ransomware cases, closer to now, as I mentioned, certainly captured a lot of people's attention. The hack of Sony by North Korea was really another watershed case, when I was at the State Department and working with the White House. I think you all remember this.
Because another movie-- which I can't say is the best movie in the world-- was making fun of the North Korean leader, North Korea hacked into Sony and released information. They also made physical threats.
And really created, I think, a real interesting issue, because we seldom went out and identified, when we thought a nation state was involved, that they were involved. We seldom made that public attribution. But in this case, President Obama made the attribution. He came out and said, we know this is North Korea. We know North Korea is doing this. We're going to take action. We're going to take a series of actions against this, some seen and some unseen.
But again, it was sort of a watershed event, because it got a lot of public attention. Also, you know, executives at the company ended up losing their jobs, which is a powerful incentive for them to take security more seriously.
But it was very interesting to see a nation state doing this kind of activity, that not only was hacking into a system, but it was meant to try to curtail freedom of expression rights. I mean, that was the point of this, is trying to get Sony to pull back on their distribution of this picture. It had the exact opposite effect, because the picture became popular probably well beyond what it would have been. And then that was another really interesting event.
And then, of course, we had the big intrusion of the Office of Personnel Management and the theft of all of the clearance and other information. Again, you know, very serious, but a different kind of thing.
So you have thefts, you have use of information, you have attacks. And they're two different kinds of things. The attacks are attacks on computer networks, even minor ones like denial of service attacks, and the thefts can be used for any kind of purpose. It could be for intelligence purposes, it could be for commercial purposes, for fraud purposes. You often don't know.
And you know, we're faced in this area with a range of threat actors, everything from the lone gunman hackers like Kevin Mitnick-- although he had some compatriots-- to, these days, transnational organized criminal groups who are very sophisticated and whose number one goal, obviously, is to make money, to the potential of terrorists. Although to be frank, we've seen terrorists use the internet to communicate, to proselytize, to try to raise money, to plan, but not to attack infrastructure on the internet. We're worried about that, but we really haven't seen it.
And then to nation states. And this is a threat that has grown in recent years, where malicious nation states are emerging as threat actors. And the Director of National Intelligence, for many years now-- for four years in a row, I think-- has talked on the public record about these various threats. And among nation states in particular, the key threats are Russia and China, the most capable cyber adversaries and actors, and then Iran and North Korea: less capable, but gaining capability and increasingly posing a threat.
Because again, this is somewhat of an asymmetric threat, and you don't need a huge amount of resources to launch these attacks. You need some resources for a sustained effect, but you don't need to have a full infrastructure. You don't need to have an army. You don't need to have 100 tanks to have the effect that you can have in cyberspace. Particularly because the US and other countries are so connected to the internet and so dependent on it, and that itself creates vulnerabilities.
So some of the other things that we saw that were big deals for us is the wholesale theft of intellectual property and trade secrets. That was attributed to China. I think former General Alexander termed that the largest transfer of wealth in human history. I don't have any real metric to decide if that's true, but it certainly was something that not only was a national security issue, but was an economic issue.
We had the fear of-- we still have the fear, and we haven't seen much of it-- a debilitating attack against our infrastructure. So one of the things that I think we've been most focused on is the fear that hackers, whatever their stripe, whether it be nation states who have more sophistication or others, will hack into the kind of control systems that control the water system, the power system, the financial system, and essentially take that down.
And although that is a low-probability event, it's a huge-impact event that could have really long-term, terrible consequences. And it can even result in the loss of life. So it's not just a cyber event. It's a cyber and physical event.
We've seen some aspects of that. There was a case involving Saudi Aramco, the Saudi oil company, where an attack was leveled against them that wiped out the data on all their computers, and that caused a lot of problems and pain for them. It didn't cause physical death, but it actually wiped out the data on their computers.
In Estonia in 2007-- or "E-stonia," as they often call themselves, because they're the electronic nation now-- there was one of these distributed denial of service attacks that was very high profile and took various services, including financial services, offline for a short time. Now, their former President Toomas Ilves will say that it wasn't as serious as people believe, but it did create a watershed moment where there was focus on this issue around the world, and they became a poster child for cybersecurity and electronic commerce more generally. And they really have been doing a lot in this area.
There were instances where you had a cyber attack that was part of a physical campaign. In Georgia, for instance, Russia used cyber tools as part of a physical campaign. And as I said before, people will talk about cyber war. I don't think we'll see a cyber war as just a confined cyber war. I think you'll see cyber tools used in traditional conflict, and that's a good example of that.
And I think we see new risks on the horizon, which we haven't seen a lot of, but which I think are really-- oh, and another thing I should mention is that just recently, a couple of years ago, there was the attack on the Ukraine power grid that took power offline. So we imagine this parade of horribles, and we actually saw something that took the power grid offline for a short time in Ukraine. Now, the good thing in the Ukraine instance is that they hadn't gone to a fully digital system, so they actually had manual controls and could turn it back on. But that's not going to be true everywhere.
And then we have new threats that we're looking at. The internet of things, or the internet of everything. You know, now everything is connected to the internet. This is an accelerating trend. And these are commodities, so there's no incentive to bake in security from the beginning.
And so you had big cases. There was the Dyn attack, where lots of connected cameras were used to launch a denial of service attack against various sites. There was-- and this is an actual story-- also a group of connected refrigerators that were used to launch one of these denial of service attacks against financial institution websites, which I thought was a clever new use of the term freezing your assets.
Sorry. That joke needs research and development. And then there's one of the threats that I really think we haven't seen, but I worry about, and that's threats to the integrity of information. You know, I care if I can't get to my banking website for an hour. It annoys me. I care a lot more if someone has gotten into my banking site and corrupted the data so that trades can't settle at the end of the day, or my account is not reliable, and therefore it causes me large-scale damage.
Or I care even more-- again, quoting my friend, the former president of Estonia-- if someone breaks into my medical records and changes my blood type, and then I get a transfusion and I die. I care a lot about that. So we haven't seen that, but that's another level of vulnerability.
And then there were things that frankly, I think, we weren't prepared for. So you know, we've focused on a lot of these infrastructure attacks. We were focused on the possibility of these low-probability things happening, but really worried about them. We were focused on thefts of intellectual property and trade secrets.
But we didn't really see-- I don't think the cyber community really saw the Russian interference with our democratic process and attempted interference in the election system to undermine our democratic process, either here or around the world. I think that wasn't really on our radar in terms of a priority threat. And I think there may be lots of reasons for that.
Part of it is that it's a hybrid threat. It's not just a cyber threat. Influence operations have been going on forever. And so the cyber people would think about the cyber part of it, but cyber is only one part of that. Using that information, weaponizing that information, is something I don't think we saw coming. And we have to be cognizant of these new threats going forward.
So there's a range of these technical threats, and a range of both technical and policy responses to them. But there are also policy threats in cyberspace. And what I mean by that is that different states have vastly different visions of the future of the technology and the internet.
We view information as a good thing and try to be as open as possible about it, and that's true for many in the democratic and Western world. But there are countries who view information itself as a threat to their stability. They use the term information security-- not in the technical sense, but in a policy sense-- rather than cybersecurity, because they really are trying to control the information.
And they believe-- and I think Russia and China are among them-- that they want absolute sovereignty in cyberspace. Absolute control of cyberspace. They want to draw a line around their country-- sometimes a firewall, but even more than that-- that actually controls content coming in and out. And that obviously has huge implications for things like human rights.
To give you an example, I often had dialogues with the Chinese. I went to Beijing. I got to my hotel room, and I saw this, which I thought was great, which is a listing of all-- and there are stylized, trademarked [INAUDIBLE] The New York Times, [INAUDIBLE].
And as you read right up at the top, it says-- you think, oh, this is great. I have access to all these things. And it says, they were not available here, including but not limited to-- so there was very much control and worry about this kind of information. And that was hanging on the door in the Hyatt there.
So you have this threat in terms of the future and how it's controlled, this effort to control information. And as I mentioned earlier, you also have countries who are using the internet to monitor and control citizens and curtail human rights, often using cybersecurity as a proxy or an excuse for curtailing speech.
And we've seen this more often, and that's a problem certainly for more repressive regimes, but it's also worrisome when countries who are in the middle are starting to think about this or are worried about security. But they start adopting policies in the name of cybersecurity, but it's really policies to control information. That is a real problem.
And one of the things I've seen is there's a real battle for the developing world. There are countries on one side of the fence, and there are a lot of countries on the other side of the fence.
And you know, if the internet becomes-- when I was at the State Department, I guess I couldn't use the term Balkanized, so we said fragmented. If the internet becomes fragmented or Balkanized, then it doesn't serve its purposes. It's not as vibrant, it doesn't allow the communication that it's allowed in the past, and that is a bad thing. You have digital bubbles, if you will, that are not as robust.
But a lot of countries in the developing world, you know, they look at this, and they have some sympathy for this idea that they want stability. They want regime stability. They understand that.
Now on the other hand, they also want economic growth. And so you can't really often have both. Some countries are big enough that they can get around that, but a lot of these developing countries understand the importance of economic growth. And so I think the importance in terms of the outreach we've done is to make that case. And you know, and that's a real challenge, because more and more countries are on the fence.
And then you have multilateral bodies around the world who want to put states in charge of the internet. Now if you know how the internet is governed, the internet is this odd thing in many ways that is not run by states alone. It's run by governments to some extent, but also the private sector, civil society, academia, some of the internet wise guys, the people who helped create it in the first place.
This is what we call multistakeholder-- the multistakeholder governance model-- which really doesn't have many analogs anywhere else. And if you think about the importance of the internet and how transformative it is, the idea that it's governed by this model is very different from almost anything else.
And some states are not happy about that. They want to control the internet-- some because they want to control information, some because they're worried, for good reason, about religious and other violence that happens because of content on the internet, and they're used to having the state in control. And they don't necessarily give credit to these other stakeholders.
Why should this company, why should this civil society group, why should this academic have the same right as me as a country? Don't I represent all my citizens? And so it's been a real battle, and it continues to be a battle in terms of what the future will bring.
And then, in the sense of cyber conflict and cyber warfare, nearly every country that can-- I think over 80 countries-- is developing offensive cyber capabilities. And that's not surprising. Militaries depend on cyber for logistics and for operations, and it's become a new domain of war. It's a commercial system, but it's also a new domain of war, and that creates instability in and of itself if you don't have some expectations and rules around it.
Indeed, issues regarding the future of the internet and how it's governed and run and how it can be used are being debated in virtually every international and regional body around the globe. And that creates a real risk to the kind of internet and the structure we've had in the past.
And I have often said when giving speeches that cyber is now the new black. Everyone cares about cyber. They often don't really know what to do about it, but they now care about it. And there are good and bad parts to that. It's good when it raises awareness. It's bad when everyone wants their turn at trying to change it and regulate it.
So then I want to turn to how we're combating these threats, particularly internationally. First, a key observation from the last number of years: it used to be that you would go talk to a cabinet secretary, or a minister internationally, or even a CEO at a company, and you'd talk about cybersecurity, and their eyes would roll back in their heads, and they would say, it's a technical issue. Have my technical people deal with it.
More recently, I think this has transitioned now to a key issue of national security, and it's being seen that way in the US and other countries around the world, of economic policy, of human rights policy, and ultimately, of foreign policy, which is what my old gig was.
And there's been, I think, a fair amount of progress. In our own government, there used to be silos of excellence. People would concentrate on their little things. Different agencies did their different missions, but they didn't really collaborate. That's changed, and I think that's been a good thing.
Back when I came to the White House, we did this Cyberspace Policy Review, which tried to bring departments together. But we also did the first International Strategy for Cyberspace, which was, what are our goals? What is our roadmap? What do we want to achieve in cyberspace, and how do all these things from warfare to economics to human rights fit within that rubric? So we're all speaking with one voice.
And I remember, I hosted the meeting where we had 15 different agencies show up. And it was sort of creative cacophony for the whole day. They weren't speaking the same language. Economic agencies were talking about internet policy. Security agencies were talking about cybersecurity. But the process over a year and a half brought those agencies together, and an understanding of what the common goals were.
So both of those things led to the creation of my office at the State Department. And we were the first office in a foreign ministry anywhere in the world to deal with these issues at a policy level. And since then, about 26 other countries, including both friends and frenemies, have emulated that. Australia recently created an ambassador for this, China and Russia have them, India, Brazil, Germany, France, the UK, Norway, Canada. Many, many countries around the world have done it.
And that's important. That's important because it raises a level beyond the technical level. It really deals with these really weighty policy issues. And frankly, you don't have to be a technical expert. You don't have to understand how to code a computer to understand the deeper policy issues, just like you don't need to be a nuclear engineer to understand some of the implications there.
Now you want to have people who understand the technology there so you understand what the trade space is and you don't do something really stupid. But you shouldn't be scared of this. And senior decision makers who traditionally shied away from this I think now are grappling with it, which is good.
The other interesting thing about my office was the scope. It wasn't just cybersecurity. The scope included cybersecurity, combating cyber crime, international security-- including norms of behavior and preventing cyber conflict-- internet freedom, and internet governance. It also responded to threats using diplomatic tools. So it was the full sweep of issues in cyberspace, and that was important because those issues are often overlapping and mutually reinforcing.
To give you an example, when someone is using security as a proxy for restricting human rights, you have to be cognizant of both of those different interests. Or when you talk about internet governance, where states want to control information, that has both human rights dimensions and security dimensions. So it's important to bring them all together.
And there are a couple different things that we did structurally to build this out. One is we built a cadre of cyber diplomats at our embassies around the world. So almost all of our embassies have someone. They might have two or three hats, but they have someone who understands these issues, who engages with their host government.
And we brought them back to Washington to train them-- in the technology, in how the internet works. It actually helps to understand that a little bit if you're going to make policy. And we also trained them in a lot of the different policy areas. We'd have outside speakers from academia and the private sector, but also deep dives from the different agencies, to prepare them for their jobs, and that was important.
We did what we call whole-of-government dialogues with, again, both friends and frenemies. The reason we did these whole-of-government dialogues is that you bring everyone from your defense department to your commerce department, and you have the human rights people there too. And it forces the other government to bring everyone as well. You don't talk about things in silos. You have a more wide-ranging discussion that's much more helpful in this area, because cyber is really now permeating every aspect of what we're doing.
So we've had these with Japan recently, and the Nordic-Baltic countries, and India and Brazil and Korea and Germany and France and China and the EU-- and Russia, until Ukraine, when we cut that off, though that doesn't mean we don't still have some contact with them. And those are very useful to advance cooperative measures.
And then there are various loose groups. There's something called the Freedom Online Coalition, where a number of countries-- 26 or maybe 34 countries now-- who have a similar interest in freedom online come together to try to maximize their authority. So that's one aspect.
The second aspect of diplomacy in cyberspace is operational, operational international cooperation. So one of the things that my old group did was we had connectivity between our law enforcement people, our technical people, but we tried to enhance that. We tried to build the bridges between, for instance, law enforcement.
We pushed very hard on the first global treaty for cyber crime, which is about 15 years old now, called the Budapest Convention, which we think is a really good grounding and gets similar laws in place. Some countries don't like it because they weren't part of it, but it's a really useful, really important document.
And to give you a reason why that's so important: years ago, there was something called the ILOVEYOU virus that was traced back to a hacker in the Philippines. But the Philippines didn't have a law that punished that, so there was nothing they could do. So you need to have these substantive laws and bring laws into the 21st and 22nd century.
We also created something called the 24/7 Network where different-- now 64 countries can cooperate and trade evidence back and forth, and that was good operationally. And then on the cybersecurity side, there's something called computer emergency response teams, the technical guys, the sort of ambulances that react to internet issues and make sure they get vulnerability information out. We try to get countries around the world to have those, and we promoted having national strategies to promote this issue around the world, and that was a big part of what we did too.
So those were some of the operational issues. But then we'd also use diplomatic tools in a novel way to combat operational threats. I mentioned the North Korea cyber attack. One of the things I did, when we knew President Obama was going to go out and say, North Korea did it, was to call up many of my counterparts around the world and say, President Obama is about to go out and say, North Korea did it.
We would appreciate your support. We're all sharing information with you, so we can have a united front so it's not just the US saying, this is impermissible conduct, that we have a broader international condemnation. And that, I think, was very helpful.
And you know, I think a really interesting example is, you remember I said, what do zombies have to do with diplomats? So this is how they hang together. So there was a large-scale, again, one of these botnets, one of these denial of service attacks, which are often called zombie networks. You know, the zombie computers are wandering and they're attacking these things.
So this zombie network of tens of thousands of computers was being used-- it turned out by Iran, and there was a later law enforcement action about this-- to attack financial institution websites and take them offline or make them inaccessible. And again, not the end of the world, but sort of an advanced nuisance for these companies. And it was sustained, and because it was a nation state, or partly a nation state, it was really problematic.
So we looked at the technical things we could do. We looked at the law enforcement things we could do. And those were OK, but they really weren't the silver bullet for this. But what I did is I called up 23 of my counterparts around the world where these bots were concentrated.
Germany, for example, had many thousands of them, and some other places had concentrations of them. And I said, look. You don't want these compromised computers. These are victim computers in your country. Use whatever means you have-- which may be very different from what we have, in terms of your laws and your relationships with ISPs-- to mitigate this. And of course, if you come to us, we're going to be able to work with you on that too. So it created this kind of collective response to these shared threats.
And we used something called a diplomatic demarche. Before I went to the State Department, demarche just sounded like a bad thing, like I'm demarching you. But a demarche can be a good thing, and so [INAUDIBLE]. And they did, and that really was very effective.
Then the other tool of diplomacy is negotiation. A traditional tool of diplomacy, to be sure. And I think there are a couple good examples of that. For years, there was a concern about Chinese theft of intellectual property and trade secrets. That was a big, big issue. And we made a sustained diplomatic effort. And President Obama, over a period of two years, said this was not just unacceptable from a cyber standpoint, but also was unacceptable to the whole relationship, and we're willing to risk real disruption in the relationship because of this.
And I remember, I was one of the first people in a meeting with a PLA general saying, we know you did this, this, and this. They didn't agree. But he used a parable. Our translator said, oh, it's too complicated. I'll tell you later on.
And the parable was the woodsman who goes into the woods and loses his ax, and he goes home, and he just blames the neighbor boy, because the neighbor boy seems like the kind of guy who would take his ax, you know? So obviously, the implication is, you're wrong. You're misblaming us.
And if I had known that, I would have been able to come back, if I was clever, and say, no, it's like the woodsman who goes into the woods and loses his ax. He comes home, goes back the next day, his ax is still gone, but now everyone else has similar axes. That would have been a better response.
But it was really a long term where we said, look. You know, every country does intelligence gathering, but intelligence gathering to benefit your commercial sector is something we think is off limits and shouldn't be done. And after a lot of pressure and the potential threat of sanctions and other things, we were able to reach an agreement with the Chinese that countries shouldn't do this, that there was a distinction between intelligence gathering and benefiting a commercial sector by this kind of theft.
It was then agreed in the G20, and it was agreed by the UK and by Australia with China, and it became a global norm. So that was a good example. We've done comprehensive cooperation agreements with countries like Estonia and India, and we've worked with regional organizations like the Organization of American States and the ASEAN countries. And so that's been important.
Then the next tool is capacity building. And so this goes to these countries that are on the fence. It's really doing two things. Helping them have better capabilities, but also convincing them that we have a common interest together so that they really will embrace the kind of open internet that we want. And there's been a lot of work that we've done in sub-Saharan Africa, in South America, in ASEAN to try to get countries up to speed, because a lot of countries are just beginning to grapple with this issue.
Then the last tool, and I think the one I want to emphasize: those are all reactive, but we're also trying to shape the international cyber environment in the long term, through a framework for international cyber stability.
The first and foundational part of this is that international law applies in cyberspace, just like it does in the physical world. Now that may seem like a no-brainer, but it wasn't. There were a lot of countries who originally said, no, you need a whole new legal structure for cyberspace-- which itself would be a problem. It would be destabilizing if cyberspace didn't have the predictability of law grounded in the physical world. Cyberspace is different, but it's not so different that it's a whole different area.
And we got agreement, including from the Chinese and the Russians, that international law applies. Now how it applies is still being worked out, and that's one of the things that we need to do ahead.
So international law at a very high level deals with cyber attacks and proportionality and distinction, and things like that. But it doesn't deal with a lot of things we see every day, the kind of efforts that we do. So what are the rules of the road? What are the norms-- [INAUDIBLE].
What are the norms for cyberspace? What are the voluntary rules of the road that we can come up with and get people to agree to that would make cyberspace safer for everyone? And it's not this norm or Norm Schwarzkopf or Norm Mailer or any of those norms, but it is these norms.
And we've been promoting this idea that there are a set of norms-- and these are not exclusive-- that countries should agree to. And we've gotten pretty broad agreement from a number of countries in a process in the UN called the Group of Governmental Experts, which includes all the Security Council countries and a number of others.
Things like, you shouldn't attack the critical infrastructure of another country that's providing services to the public. In wartime, there are rules for that. But in peacetime, we shouldn't do that. You shouldn't attack the computer emergency response teams. You shouldn't go after the ambulances of the internet. There's an expectation of cooperation. If you see malicious code, you ask for help. And this intellectual property norm I talked about before.
So that's been very important to get that level of understanding and agreement on some of these basic rules. It's still voluntary, because we're still in the beginning of this, and we're looking at others. And that's been a governmental effort, but there's also a new Global Commission on the Stability of Cyberspace that I'm a commissioner on now, which is looking at how civil society, academia, industry, and others can add to that.
The third part of this framework is confidence building measures. This is the one real analog to the nuclear world. Nuclear and cyber are very different. You don't see launch plumes in cyber. You can't easily do attribution, although you can do attribution. So concepts from the nuclear world don't necessarily apply.
But the one thing in the nuclear world that was very applicable was this idea of building confidence and transparency measures with potential adversaries. And so it prevents inadvertent escalation. And especially, this is such a new area, and there's so much uncertainty that you want to avoid that escalation or inadvertent mistakes that will raise the stakes.
And so these are things that are not rocket science. They're things like having points of contact, exchanging doctrine, having resolution mechanisms, cooperating against shared threats. They've all been very important, and we've done a lot in various institutions around the world.
We did this bilaterally with the Russians a number of years ago, and actually used some of those hotlines. One of the hotlines was the Nuclear Risk Reduction Center that was also used for cyber communication. We did it multilaterally in the Organization for Security and Co-operation in Europe, which really dealt with confidence building measures, and that was 57 different countries. So I think that was really important, and we've done quite a bit there.
You know, that's great, but we have to do a lot more to actually get more countries to embrace this. Getting 25 countries to say it's good-- fine. But you have to get countries around the world to embrace this to really make these norms stick.
And norms are great. Rules of the road are great. But they're kind of worthless if there's no enforcement, no consequences for bad actors. And that brings me to the last issue I wanted to talk about, which is deterrence. I don't think we do deterrence very well in cyberspace.
You know, classical deterrence means that you have credible actions you can take to deter your adversary, and you do that in a timely manner. And I don't think we've done either of those particularly well.
Now there's deterrence by denial, which is creating good defensive systems. And we are, in fact, doing better on that, but not perfect. And if you don't have consequences for bad actors, you create a norm of inaction. So if you don't do something, then it's acceptable conduct. It's like, well, you know, they didn't do anything. That's OK.
So we need to do a much better job than that. And I think part of the problem has been attribution-- this mystique that attribution is so difficult in cyberspace. There's a little cartoon with a dog at a computer, and the caption says, "On the internet, no one knows you're a dog."
But you know, attributing conduct to a nation state or an individual is, I think-- it's not simple, but it's not impossible, because you don't just look at the digital footprints. You look at other intelligence you have. You follow the money. You look at motive.
And at the end of the day, attribution is a political issue as much as a technical issue. It's partly a technical issue, but you're never going to have 100% attribution, and you have to make a decision. So once you make a decision-- for instance, with the North Korea Sony attack, everyone said it was North Korea until President Obama said, it's North Korea. And then everyone said, well, it's not North Korea.
But you know, we knew it was, and we released as much evidence as we could. But states are never going to introduce or get all that evidence out there, so that creates a problem.
So we can do better, and we can reach attribution quicker. We also have a limited tool set. We have diplomatic options, trade options, law enforcement options, and cyber options-- people think there's this big cyber button you can push and cyber tools will be used, which is really not true.
Cyber options, I think, are very overrated, because there has to be a lot of preparation. One of my White House colleagues was fond of saying, you can knock someone down but not hold them down using cyber tools. And then finally, there's kinetic force, but you're unlikely to use a missile in response to a cyber event unless it actually causes death and injury.
So what we need is really a collective action to think about new consequences. And this is an area where I think both the governments and the private sector have to come together.
And the other thing we have to do is not just react on our own. It's better if we act as a group, when we have like-minded groups-- not a huge treaty organization, although NATO is playing much more of a role on this recently and has made a lot of advances in treating cyber as a domain it deals with. But loose groups of countries that can impose these consequences on bad actors and actually change their behavior. That's important.
So that all comes down to the role of diplomacy. In all of this, the role of building alliances and shaping the environment, ensuring international cooperation is really paramount. And it's not just between governments, but with the private sector and civil society and academia.
And it also means really building bridges between the policy community and the technical community. And I think that's something that can be done in institutions like Cornell. Bringing those two communities together, often very different communities so that they have a dialogue and they can think about some of these solutions.
And I'd say that folks in this room, certainly students are part of this solution. This is a new area. You can have a lot of influence here. Leadership is being looked for. This is, I think, going to be a huge area going forward.
And remember when I talked about Colossus, the movie, at the beginning? Colossus took over the world. Well, this is the actual Colossus, which is at Bletchley Park in the UK, and this is the place where Colossus was used. It was the first programmable digital electronic computer. It filled a room, as you can see.
It was used to break what was called the Lorenz code, the high-level code that Hitler used. So this is more advanced than Enigma, which is the one that got all the attention in the movies and other places. This was the most advanced code they had, and they had to build this computer to break it. So instead of taking over the world, you could say, in this case, a computer really saved the world.
And so if we want to use this technology and harness it for good and really fight the threats, we really have to work together. And if we want to avoid this kind of future, instead, by working together, we can have-- now this is not a computer movie-- this kind of future.
So with that, let me stop and I'll go for questions and answers.
GREG MORRISETT: Thank you. So I'm just here to moderate. And if you don't have questions, then I took a whole bunch of notes, so I have lots of questions to raise with Chris. But maybe we'll start with the audience. Yes, sir?
AUDIENCE: Thank you. Sidney Tarrow, Government Department and Cornell Law School. It was a very impressive presentation, not only because you're easy to listen to, but because of the many positive things you were involved in over the past eight years.
But I asked myself this question. So many of the norms and practices you described seemed to rest on a fundamental norm, and that's multilateralism. And for the past year, and possibly the next seven years, we are going to be governed by a regime that has publicly lambasted multilateralism.
Now we shouldn't believe everything we hear, especially in the case of the gentleman who's just coming back from Asia. But words have consequences, and one of the consequences is that people stop having the confidence that multilateral norms are going to operate. So I wonder how much of the structure and the norms that you've described are seriously threatened by the change in regime that we appear to be undergoing today.
CHRISTOPHER PAINTER: That's a good question. I'm heartened by this fact. The folks in the White House, the cyber czar, if you will, this guy Rob Joyce who came from the NSA and Tom Bossert who was the assistant to the President for counterterrorism, Homeland Security, and cyber, both have a keen interest in this.
And both, in the time I was there, endorsed this idea of norms, rather than throwing it out the window simply because it's a new administration and saying this is something we don't want to do. And this has largely been a bipartisan or a nonpartisan issue, with some exceptions that I am concerned about. But they've endorsed this.
Now what they've said is-- what Tom Bossert has said at one point is we want to do more bilateral discussions. But bilateral doesn't mean you don't do multilateral as well. Bilateral can be used to build a multilateral environment.
We made a lot of progress in the UN in this group of governmental experts. The last time, though, it's a consensus group, and they weren't able to reach agreement. And that's because there were real divisions in that group. And so maybe we've reached the practical limits of what we can do in that group for now.
But you can still do bilateral and small group discussions with other countries that build that larger multilateral community, or even use regional organizations. So I don't think there's an-- in cyber, at least-- I don't think there's an abandonment of multilateral. I think it's using both multilateral and bilateral together. And frankly, that's what we did in the last administration as well. And I think that's a good thing.
GREG MORRISETT: Great question.
AUDIENCE: Good evening. I definitely believe in diplomacy, but I feel from your talk that you did not address-- I mean, I really think that we're in a lot of danger. I think having a lot of faith in the internet and in this world that we've created with computers is really disturbing to me. I don't have faith in the internet or using a computer or using a cell phone. Our privacy is always in jeopardy. There's a lot of issues you did not talk about.
And I also think that it's more dangerous-- you gave a very hopeful talk, which is good. It's good to be hopeful. But I think we're in a lot of danger. And we're also dealing with a presidency that could basically put the trigger on the nuclear bomb, and probably annihilate the planet.
But that said, you talked about probably Google and Amazon and all these other companies that are in charge of the internet that don't allow the police or a parent to look into the computer or cell phone of somebody who murders all these children in Sandy Hook. You know, I mean, who's controlling the internet?
And all these rogue people that go online and create all these attacks, I mean, we already know that having all these driverless cars, driverless this, hackers could go into that and stop a whole community. So basically, I will end with this. I don't feel that your presentation is relaying the serious dangers that we have.
CHRISTOPHER PAINTER: So I disagree. Look, I think we face serious dangers certainly on the internet. We also face serious dangers in the physical world. And you know, it's not unique to the internet, some of the dangers we face.
And indeed, you mentioned, just to unpack some of the things you talked about, there is a concern about privacy on the internet and how that interacts. And there has been a lot of discussion of making sure. And one of the things in our international strategy was the importance of privacy on the internet and how that works. The EU has a very different view of how privacy works than the US does, but they need to be interoperable for us to really succeed. So it is taken seriously.
The other issue you raised is the issue of encryption, which is a very thorny issue, and not an easy issue. And I think that really is a very difficult technical and policy issue. That on the one hand, strong encryption protects privacy, and it actually protects network security too.
On the other hand, if you can't get data-- and law enforcement uses the term going dark-- if they can't get data to fight terrorism and really violent crime, that's a problem too. So how can you thread that needle?
Right now it's been more of a discussion between the government and the industry to see if there are ways to do this. And there are differences between device encryption, like the Apple case, and encryption in transit. Neither is easy, but the latter is harder than the former.
But these are the kinds of tensions you see in physical world things too. And with all the dangers out there, this is exactly why we need to work to try to make sure we mitigate the dangers to the extent we can.
It's a dangerous world. That's true across the board. But trying to mitigate those dangers as best we can and promote the good things. So I am, even as a prosecutor, a former prosecutor, I'm more of a glass half full-type person, and I think that's the right approach to take.
GREG MORRISETT: Good. Good, good.
AUDIENCE: So you mentioned that we don't do a particularly good job of deterring cyber attacks. I was wondering if you could talk a little bit about what we should be doing. How do we deter them? And I was hoping you could specifically address the Russian interference in our election, which I'm not sure you would narrowly describe as a cyber attack, but certainly, these were cyber activities that were very troubling.
And the question is, how do we deter them from doing that or something similar in the future? And do we use something like a retorsion the way Obama did immediately after? He punished them, expelled some of their diplomatic staff. Or do we take a heavier hand and do something that would be described as a reprisal, something that's much more aggressive?
CHRISTOPHER PAINTER: So I think you have to look at the threat actor and how well we do against each. With criminal actors, I think we've gotten better. We have done more transnational takedowns and other things on the criminal side than we had before. But it's still seen by a lot of criminals as a costless enterprise, and we need to get much better at that.
On nation states and other actors, look, I think there were a lot of good actions taken at the end of the Obama administration. Expelling diplomats, using the sanctions order for the first time.
But you know, I go back to what I said about if you have good deterrence, it has to be timely. And that was many months after the events happened. And so if you're trying to send a signal to the adversary, this is unacceptable, you have to send a strong and consistent signal.
The one thing I mentioned I worry about is I completely believe and completely am on board with the intelligence community in both administrations that have said, the Russians tried to do this. That to me is completely clear. It's worrisome when a commander in chief calls that into question, because the first step in actually making sure it doesn't happen again is to admit it happened the first time and to take actions to make sure it doesn't happen again.
And it will happen again. We saw similar activity tried in France and in the Netherlands and in Germany, and a myriad of other places. The UK just came out with a big warning through their National Cyber Security Centre today.
It's going to happen again. It is a repackaging of an old information operation thing that's been around for a long time. But now that we know that we may be vulnerable to this, we are idiots if we're not taking really every opportunity we can to be serious about making sure it doesn't happen again. Not just protecting voting machines, because that, I think, is just one small part of it. But really figuring out how to keep information from being weaponized.
Now the problem, though, is states should not be-- democratic states should not be the arbiter of what's true and what's not true. That's not a good position to be in. And that really leads to a lot of problems.
But I think a lot of the providers are beginning to look at what the source of the information is to try to identify that. That's an interesting way of looking at it. I heard an internet executive tell me, I'm not going to tell you if something's true or not, but I'll tell you if it's a real person or a bot, and that helps. So really raising awareness so people can make the choice.
And there are some worries about people sucking all their information through the straw of one particular provider, and that's another concern. So it's awareness, but it's also looking at policies and making sure that we're doing everything we can to prevent it, and calling it out.
And when I think about deterrence more generally, we need more tools. I talked about that limited tool set. So what tools can we use to bring pain to the adversary that's temporary and reversible? So you say, we're doing this to you, and when you stop your bad action, we'll stop doing it to you. Those tools, you know, we have some, but they're not that great right now.
And so one of the things in the future, where I think this is going, is we need to band together with other countries and have these dialogues, bilateral or multilateral, to build like-minded groups, like we did with the Proliferation Security Initiative or money laundering back in the day. And then we have to try to develop what new things we can do.
And I've had people suggest various things that all have second- and third-order consequences, so you have to worry about those. But that's a creative exercise that goes beyond just governments. It requires academia, it requires the private sector. What are the things we can do that will impose costs on an adversary who's doing bad things, make them change their ways, and expand our tool set? So that's one of the ways I see this going in the future.
AUDIENCE: Thank you. I wanted to get your opinion of [INAUDIBLE] and cryptocurrency.
CHRISTOPHER PAINTER: So I think cryptocurrency is interesting, because there have been a lot of online currencies over the years, some used for good and some not. And there have been investigations and prosecutions where cryptocurrency was used as an underground currency for things.
There's nothing inherently evil about a cryptocurrency in any way. It's how it's used and how it's regulated. That's a little beyond the stuff I've dealt with every day, but it's something that I think there's been a lot of attention on to make sure that you have proper controls in place on how this is being used, because it can be used as an untraceable currency, and you don't want that to happen.
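As an aside for readers: the traceability question turns on how a public ledger is built. The sketch below is a purely illustrative Python toy (the transaction strings are made up; this is not anything from the talk and not how any real cryptocurrency is implemented) showing the basic hash-chain idea, where every recorded transaction is linked to the one before it, so tampering anywhere is detectable:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash a canonical JSON encoding so the digest is deterministic.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, tx: str) -> None:
    # Each block records a transaction plus the hash of the previous block.
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"tx": tx, "prev": prev}
    block["hash"] = block_hash({"tx": tx, "prev": prev})
    chain.append(block)

def verify(chain: list) -> bool:
    # Walk the chain, recomputing every hash and checking every link.
    prev = "0" * 64
    for b in chain:
        if b["prev"] != prev or b["hash"] != block_hash({"tx": b["tx"], "prev": b["prev"]}):
            return False
        prev = b["hash"]
    return True

chain = []
add_block(chain, "alice->bob:5")
add_block(chain, "bob->carol:2")
assert verify(chain)
chain[0]["tx"] = "alice->bob:500"   # tampering with history breaks the chain
assert not verify(chain)
```

The linkage is exactly why such ledgers are pseudonymous rather than truly untraceable, which is where the regulatory discussion comes in.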
The same time, now you have financial institutions embracing it, so what does that mean, and how is that going to scale going forward? So I don't know if that really answers your question. But I'm not against cryptocurrencies, but I see both promise and danger in them.
AUDIENCE: In a response to an earlier question, you mentioned that the EU and America have a different view of the relation between privacy and security. Could you explain to me the points at which they differ?
CHRISTOPHER PAINTER: Yeah. So I think the EU treats privacy as a fundamental human right. And when you think of something as a fundamental human right, then there's nothing you can balance against it, right? It's absolute. We actually often do better enforcement on privacy than the EU traditionally has. Our Federal Trade Commission, when a company says, we're going to do something, and then doesn't do it, goes after them.
But it also depends on what part of the EU you're talking about. The law enforcement parts of the EU have one opinion, the privacy people have another opinion. And I don't know that dialogue has been as strong as it can be.
And to give you an example of a tension I see, there's a new General Data Protection Regulation coming out of the EU, which creates a lot of different obligations. And one of the interesting things about the history of data privacy in the EU is that first they had a data privacy directive. It said you had to destroy or anonymize data if it wasn't used for a billing purpose within, I think, 60 days. Then they had the Madrid bombing, and then they had a data retention regime. And now they're kind of going back and forth, and so there's tension there.
But I also worry about the tension between that and the new EU Network and Information Security Directive. To protect networks effectively, you have to share IP addresses and other information. And if you think of an IP address as personal information, how do those things square? So I think that creates a lot of issues.
Some people in the EU don't think the US cares about privacy. We do. We had a Privacy Bill of Rights that we put out. We've done a lot of things along that line. So you're never going to have completely similar systems, but you have to understand how each other's system works, and make sure they're interoperable. Otherwise, you hurt businesses, consumers, and just people. So that's been an ongoing effort.
AUDIENCE: Hey, Chris. So I'm a government student, and I'm currently writing a paper on this issue. So you previously talked about multilateral cooperation. So I'm just wondering, from an international law standpoint, how do you see this moving forward? Do you think-- like, what would be the best means to achieving sort of global compliance and creating these sort of consequences to deter these actions?
CHRISTOPHER PAINTER: So I think what will not work, in the short term at least, is trying to do some global cyberspace treaty, because that's been suggested by some. And it's interesting, a cyber arms control treaty was actually a suggestion Russia had, I think, like 10 years ago.
I don't know what a cyber arm is. These are often dual-use technologies. I'm worried about the effects. And I think the way to go is to these norms. What effects are we trying to proscribe? And how do we get countries to agree to that?
So I think maybe you have a treaty long down the line. We're not even close to that yet. We need to get other countries to kind of accept these norms, and think about other norms that might apply, rules of the road that apply.
The other thing you do is you can both do this in various multilateral organizations, but you can build like-minded-- large tent like-minded groups that will help impose consequences on bad actors who can band together to help have a collective response to some of these shared cyber threats. That may change over time.
As I said, the examples I often give are the Proliferation Security Initiative and the way money laundering has changed over time. You know, with the Proliferation Security Initiative, [INAUDIBLE] material is not good if it gets transferred around. And first a small group of countries said, we're going to prevent this from happening. We're going to take whatever actions we can. But it's a voluntary group, not a treaty group. And you can imagine the same kind of thing for cyber.
AUDIENCE: Hi. I really enjoyed the talk. I think you did a really great job of talking about the threats and the threat actors. Just wanted to see if you had an opinion on the sharing on the weapons side, right? There's the offensive and the defensive.
And there's been a lot of talk between the government sharing with private in this country, but then there's also the multilateral nature to that. You know, what are your thoughts on sharing of our offensive capabilities? Because it's different than kinetic, right? You can replicate an exploit. You can't replicate a bomb the same way.
CHRISTOPHER PAINTER: Yeah, I think countries should be more transparent about the fact that they are developing this and what their doctrines are. I mean, the doctrine of the US has been a doctrine of restraint. You know, there is a presidential directive that talks about the factors that should be taken into account. But it's not an open-the-floodgates sort of doctrine.
The other thing is, this is not so much about the tools you have, but there's been a big debate around vulnerability disclosure. These are what they call zero-day vulnerabilities. They could be used by criminals and nation states to cause harm, but they could also be used to enable intelligence gathering or to find the bad guys and bring consequences to them. So it's not a binary sort of thing.
In the Obama administration, we came up with a vulnerability equities process. I think we were the first country in the world that did this. It had a group of people sit down from different standpoints, everything from the diplomatic standpoint to the commercial standpoint to the security and intelligence standpoints, and look at these and say, the default is disclosure. Maybe sometimes we can't do that, but the default is disclosure. And then really go through the equities.
And just today, the White House released a charter for vulnerability equities process that tries to make that even more transparent. Tells you who is involved in that. It's not going to tell you about the individual cases. But I think that's not a bad idea for other countries in the world to emulate. I think that could be an interesting norm for other countries to have that kind of process. I know the UK and Canada have been looking at it.
Now there's some countries that will never have that process, just like there are some countries that will never have oversight of their intelligence operations and will never admit they even do them. But I think the more countries that do, the safer we are.
And that doesn't mean you release all the vulnerabilities you know about, because that would be unilateral disarmament, as I think someone in the White House said recently. And there are good reasons to maintain some of those. But it means really leaning toward the network security side, and making sure that that ecosystem is secure.
GREG MORRISETT: Just a couple more. Yep?
AUDIENCE: Thank you. I was wondering if you could elaborate a little bit on where you come down on the issue of end-to-end encryption and balancing between the interests of law enforcement and privacy. Because this is obviously a very contentious issue right now with the US v. Apple case, which was not resolved because the FBI dropped the case. And law enforcement agencies want to build back doors into this, and companies potentially have a lot of power to either give or deny access. Where do you think--
CHRISTOPHER PAINTER: Well, I mean, this is a technical and policy question. I think I tried to say this before, that if I knew the answer to this, I'd probably be a rich man, but I don't. And I think there are two really compelling interests.
It is true that encryption is good for privacy. It is true that encryption is good for network security. It is true that if you build certain kinds of back doors in, they could be exploited by the bad guys too.
On the other hand, if you have evidence-- let's say you had an attack, and a phone or email message or whatever it is gave you evidence of other people who are planning future attacks. You want law enforcement to have that information. So how can you achieve both things? And there really hasn't been a good solution to that.
As I said earlier, I think there are two problem sets. Device encryption, the iPhone kind, is different: you can imagine all sorts of complicated regimes for key escrow and other mechanisms, with various people holding keys, that would allow access. It's far harder when you're talking about the kind of internet apps where things are encrypted in transit.
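For readers curious what escrow with various people holding keys could mean mechanically, here is a minimal, purely illustrative Python sketch (a toy, not any real escrow proposal or anything described in the talk) of an all-or-nothing XOR key split, where every share-holder must cooperate to reconstruct a device key:

```python
import secrets

def split_key(key: bytes, n: int) -> list[bytes]:
    """Split key into n XOR shares; all n are required to rebuild it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:  # fold every random share into the final share
        last = bytes(a ^ b for a, b in zip(last, s))
    return shares + [last]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

key = secrets.token_bytes(32)   # a 256-bit device key
shares = split_key(key, 3)      # e.g. three separate parties each hold one
assert combine(shares) == key   # only all three together recover the key
```

Any subset smaller than all the shares reveals nothing about the key, which is one reason the real escrow debate centers less on the math and more on who holds shares and under what legal process.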
But you know, I don't think it's necessarily building a back door, but I think it's working with the companies to see if there's ways that they can get access, because it's unsustainable at some point. If there's a big terrorist attack-- and you've seen this in Europe. You know, the scales will tip toward trying to get companies to do this.
Now there are logistical problems with that. Not all the companies who do encryption are located here or in any other single jurisdiction. And so there will be third-party companies. We had the crypto wars, as they called them, back in the '80s, when the US tried to mandate that encryption software could not be exported. We really shot ourselves in the foot, because other countries were like, fine, we'll do our own encryption software, and it didn't actually solve the problem. And so that was abandoned.
This is a similar problem. But we do have to find a way to get access, because it is-- I have a lot of sympathy for both sides of the argument. I have sympathy for the argument that if we don't get access to this data, there could be real consequences. And you don't want policy being driven by bad sensational events. You want it to be more reasoned.
So that discussion has to continue. And I think the right people to have the discussion are the government and the technical experts, to just see what the possibilities are.
AUDIENCE: Thank you.
CHRISTOPHER PAINTER: Yeah.
GREG MORRISETT: Last question.
AUDIENCE: Hello. I had a question on a New York Times article that came out a couple of days ago, detailing how the leak of the NSA's offensive cybersecurity weaponry really shook it to its core. And I was wondering exactly how our cyber offensive capabilities have been impacted by that leak, and whether we may have focused too much on our offensive capabilities and not shored up our defensive capabilities enough to prevent an attack.
CHRISTOPHER PAINTER: So I can't comment on that particular thing, but I can say this: I think it is fair that, to the extent governments have these capabilities, they need to guard them carefully. You don't want inadvertent proliferation of those tools, because that is a problem and destabilizing in itself. And so I think we need to make sure that we're putting all the controls in place so that kind of thing doesn't happen.
I can't really comment on what happened there. In fact, I've been gone for several months, so I'm not really sure what's going on. And I probably couldn't comment anyway. But I do think there is this danger. If you think of less capable states who don't have any controls in place, and they're developing cyber tools that get out in the wild, that could cause real problems.
AUDIENCE: Thank you.
GREG MORRISETT: Well, I'm going to close with one more question, if you don't mind. So you mentioned that some nations do not have the same approach toward free and open information, and use security mechanisms as an excuse to lock that down.
But I'm a little concerned about the opposite effect. If we look at the Russian information campaign, a good question is, would an external campaign like that be successful in a country like China, which may have more of a lockdown on the social networks and the other ways that the influence would spread? And would that not be an excuse for people to push to restrict freedoms and openness of information?
CHRISTOPHER PAINTER: Yeah. I mean, look, perhaps. But I would not make that trade-off, and I don't think democratic countries can make that trade-off. I have seen some countries that are developing their national cybersecurity strategies start saying, well, we need to control this, quote, "terrorist group." And terrorism means different things to different people. It could mean dissidents in some countries, and so that's problematic.
You know, I think the Chinese certainly would be very concerned about any activity, whether by Russia or anyone else, to try to destabilize their political system, because their number one concern is stability. But I think adopting a system with that strict kind of monitoring of the internet, where you're controlling any political content on it, is far too great a price to pay. And I think we have to make the argument to the countries, again, that are on the fence, about why that will hurt them in the long term.
Now China has its own giant market, and so it's been successful. But other countries who are smaller and don't have that built-in market, I don't think can afford to give up some of the economic and other benefits.
GREG MORRISETT: That's good. Yeah. All right. Well, let us close by first thanking Christopher--
--yet again. And then I have the privilege and honor of presenting you with this plaque, this certificate that commemorates the lecture and your visit here, and also a small token of our appreciation.
CHRISTOPHER PAINTER: Thank you very much. Thanks. Appreciate it. Thanks.
Christopher Painter delivered the 2017-18 Henry E. and Nancy Horton Bartels World Affairs Fellowship Lecture, “Cyber Diplomacy: New tools in the fight against hackers, attackers, and other threats,” on Wednesday, November 15, 2017 in Call Auditorium, Kennedy Hall.
In his talk, Painter described how cybersecurity has metamorphosed from a niche issue of interest to technical experts to a core concern for national security, economic security, human rights and foreign policy.
He also shared an insider’s view of several international cybersecurity incidents, including the North Korean hacking of Sony Pictures, attacks on critical infrastructure, and Russian interference in democratic processes in the United States and Europe.
Painter has been on the vanguard of cyber issues for more than 25 years. As the State Department's first cyber coordinator, he led the country's diplomatic efforts to advance an open, interoperable, secure, and reliable internet. Painter received his bachelor's degree in political science from Cornell in 1980 and went on to graduate from Stanford Law School.