Episode 104 of EFF’s How to Fix the Internet

How do we make the Internet more secure? Part of the solution is incentives, according to Tarah Wheeler, this week’s guest on EFF’s How to Fix the Internet. As a security researcher with deep experience in the hacker community, Tarah talks about how many companies are shooting themselves in the foot when it comes to responding to security disclosures. Along with EFF co-hosts Cindy Cohn and Danny O’Brien, Tarah also talks about how existing computer crime law can serve to terrify security researchers, rather than uplift them.

Listen to the episode on Apple Podcasts or Spotify, or subscribe via RSS.

You can also listen to this episode on the Internet Archive and on YouTube.

Computers are in everything we do — and that means computer security matters to every part of our lives. Whether it’s medical devices or car navigation, better security makes us safer. 

Note: We'll be hosting a special live event with Tarah Wheeler to continue this conversation on Thursday, December 9th. RSVP or learn more.

On this episode, you’ll learn:

  • About the human impact of security vulnerabilities—and how unpatched flaws can change or even end lives;
  • How to reconsider the popular conception of hackers, and understand their role in helping build a more secure digital world;
  • How the Computer Fraud and Abuse Act (CFAA), a law that is supposed to punish computer intrusion, has been written so broadly that it now stifles security researchers;
  • What we can learn from the culture around airplane safety regulation—including transparency and blameless post-mortems;
  • How we can align incentives, including financial incentives, to improve vulnerability reporting and response;
  • How the Supreme Court case Van Buren helped security researchers by ensuring that the CFAA couldn’t be used to prosecute someone for merely violating the terms of service of a website or application;
  • How a better future would involve more collaboration and transparency among both companies and security researchers.

Tarah Wheeler is an information security executive, social scientist in the area of international conflict, author, and poker player. She serves on the EFF advisory board, as a cyber policy fellow at Harvard, and as an International Security Fellow at New America. She was a Fulbright Scholar in Cybersecurity last year. You can find her on Twitter at @Tarah or at her website: https://tarah.org/.  

If you have any feedback on this episode, please email podcast@eff.org. 

Below, you’ll find legal resources - including important cases, books, and briefs discussed in the podcast - and a full transcript of the audio.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:

Warm Vacuum Tube  by Admiral Bob (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/admiralbob77/59533 Ft: starfrosch

Come Inside by Snowflake (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/snowflake/59564 Ft: Starfrosch, Jerry Spoon, Kara Square, spinningmerkaba

Drops of H2O ( The Filtered Water Treatment ) by J.Lang (c) copyright 2012 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/djlang59/37792 Ft: Airtone

reCreation by airtone (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/airtone/59721 

Resources

  • Consumer Data Privacy
  • Ransomware
  • Computer Fraud and Abuse Act (CFAA)
  • Electoral Security


Transcript

Tarah: So in 2010, I was getting married for the first time. And as I was walking down the street one night, I see one of the local bridal shops had its front door just hanging open in the middle of the night. There's no one around; it just looks like someone maybe thought the door was closed and left out the back, perhaps. So I, I poked my head in, I look around, Hey, is anybody in here? I closed the door, kind of latched it all the way until I could feel it rattle, locked it from the inside, pulled it shut. And I left a little note on the door saying, Hey, folks, just want to let you know your door was open, in case there's something wrong with the lock.

And, and I left. Never heard back from them again. Not a single acknowledgement, not a thank you, not anything. And that is really the place that a lot of security researchers find themselves in when they try to make a third-party report of a security vulnerability to a company, they just get ignored. And you know, it's a little annoying. 

Danny: That's Tarah Wheeler, and she's our guest this week on How to Fix the Internet. We're going to talk to her about coordinated vulnerability disclosure, what should happen if you find a flaw in software that needs to be fixed, and how people like Tarah can keep you safe.

I'm Danny O'Brien.

Cindy: And I'm Cindy Cohn. Welcome to How to Fix the Internet, a podcast from the Electronic Frontier Foundation helping you understand how we can all make our digital future better.

Danny: Welcome Tarah. Thank you so much for joining us.

Tarah: Thank you so much for having me. Cindy, it's an incredible pleasure. Thanks so much, Danny.

Cindy: Tarah, you are a cyber policy fellow at Harvard and an international cybersecurity fellow at New America, you were a Fulbright Scholar in cybersecurity last year, and to our great delight you are also a member of the EFF advisory board. Suffice it to say that you know a lot about this stuff. Off the top, you told us a story about walking past a bridal shop, doing a good deed by locking the door, and then never hearing back. Can you explain how that story connects to the coordinated vulnerability disclosure world that you live in?

Tarah: Absolutely. So coordinated vulnerability disclosure is a process that's engaged in by multiple stakeholders, which, translated into normal human terms, means there needs to be a way for the company to get that information from somebody who wants to tell them that something's gone wrong.

Well, the problem is that companies are often unaware that they should have an open door policy for third-party security researchers to let them know something's gone wrong. Security researchers, on the other hand, need to provide that information to companies without it sounding like a ransom demand, basically.

Danny: Right

Tarah: So let me give you an example. If you find that there's a vulnerability, something like, I don't know, a cross-site scripting issue in a company's website, you might try to let that company know that something's gone wrong, that they have a security vulnerability that's easily exploitable and public to the internet.

Well, the question for a lot of people is, what do you do if you don't know how to get that information to somebody at a company? The industry-standard very first step is to make sure that you have the alias security@company.com available as an email address that can take reports from third-party researchers. We in the industry and the community just sort of expect that that email alias will work.

Hopefully you can look on their site and find a way to contact somebody who's in technical support or who's on the security team and let them know something's wrong. However, a lot of companies have an issue with taking those reports in and acknowledging them, because honestly, there are two different tracks the world operates on. There's sort of the community of information security researchers, who operate on gratitude and money. And then there are corporations, which operate off of liability and public brand management. So when a company gets a third-party report of a security vulnerability, there's a triage process that needs to happen at that company.
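
A minimal sketch of the researcher's side of that first step: checking whether a site publishes a machine-readable security contact at all. This assumes Python 3 and uses example.com as a placeholder domain; the /.well-known/security.txt convention (RFC 9116) is a companion to the security@ alias described above, not a substitute for it.

import urllib.request
import urllib.error

def find_security_contact(domain: str) -> str | None:
    """Look for a published security.txt (the RFC 9116 convention)."""
    url = f"https://{domain}/.well-known/security.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, TimeoutError):
        return None
    # security.txt lists one or more "Contact:" lines for vulnerability reports.
    contacts = [line.split(":", 1)[1].strip()
                for line in body.splitlines()
                if line.lower().startswith("contact:")]
    return contacts[0] if contacts else None

# example.com is a placeholder; substitute the site you are researching.
print(find_security_contact("example.com") or "No security.txt found; try security@<domain>.")

If the file is missing and security@ bounces, that is usually the moment a researcher ends up hunting for a support form, which is exactly the friction described here.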

And I'm here to tell you, as a person who's done this both inside and outside companies, when you are a company that is receiving reports of a vulnerability, unless you can fix that vulnerability pretty quickly, you may not wish to acknowledge it. You sometimes get pressure from inside the company, especially from the lawyers, because that can be seen as an acknowledgement that the company has seen it, triaged it, and will repair the vulnerability in a timely manner. Let me assure you, a timely manner looks really different to a security researcher than to internal counsel.

Danny: If somebody finds a vulnerability and reports it to a company, and the company either blows them off or tries to cover it up, what are the consequences for the average user? Like, how does it affect me?

Tarah: How does it affect a normal person who's a user of a product if somebody who's a security researcher has reported a vulnerability to that company and the company never fixes it?

Danny: Hmmm Hmmm

Tarah: Well, I don't know about you, but I'm one of the 143 million people that lost my personal information and credit history when Equifax decided not to patch a single vulnerability in their servers.

Behind every data breach is a story. And that story is either that people didn't know something was wrong, or people knew something was wrong, but they de-prioritized the fix for it, not understanding how severely it could impact them and consumers and the people whose data they're storing.

Danny: You talked about coordinated vulnerability disclosure, so who's coordinating, and what's being coordinated?

Tarah: When we talk about multiple stakeholders in a vulnerability, one of the things we're talking about is not just the people who found it and the people who need to fix it, but also the people who are advocating for the consumers who may be affected by it.

That's how you'll get situations like the FTC stepping in to have a conversation or two with companies that have repeatedly failed to fix major vulnerabilities in their systems when they're protecting consumer data. The EFF, as a great example, tends to want to protect a larger community of people: not just the researchers, not just the people working at the company, but all the people who are impacted by a vulnerability. So if a security researcher finds something that's wrong and reports it to a company, the company's incentives need to be aligned with the idea that they should be fixing the vulnerability, not suing the researcher into silence.

Cindy: EFF's had a pretty significant role in this, and I remember the bad old days when a security researcher pretty much immediately either got a knock on the door from law enforcement or, you know, service of process for being sued, for having the audacity to tell a company that it's got a security problem.

And what I really love about the way the world has evolved is that we do have this conversation now more often than not in the software industry. But you know, computers are in everything now. They're in cars and refrigerators, they're in medical devices that are literally inside our bodies, like insulin pumps and heart monitors.

I'm wondering if you have a sense of how other industries are doing here now that they're in the computer software business too.

Tarah: I was recently on a panel on internet of things security at the Organization for Economic Cooperation and Development, and I was talking to somebody who was previously from the Australian consumer product safety commission, or their equivalent of it. And I am here to tell you that having computers in everything is a fascinating question, because that consumer product safety person's entire perspective had very little to do with whether or not there was a software vulnerability in the computers they were using, but with whether or not the product that computer had been put in dealt with temperatures safely.

So when we start talking about putting a computer in everything, we start talking about things that can kill people. When we talk about temperature differentials, altering the temperature inside refrigerators and freezers, changing whether a sous vide machine has a readout that's appropriate or not on it, that's the kind of vulnerability that can kill people.

Danny: Do you think there's something particularly different about software compared to other disciplines where we've sort of sorted out the safety problem? Like, bridges don't fall down anymore, right? Is there something we're doing with bridges that we're not doing with software, or is it just that software is hard?

Tarah: Number one, software's hard. Number two, I love an industry I'm going to bring up instead of bridges, and that is aviation. And I will tell you what aviation does differently than we do in information security: they have a blameless post-mortem to discuss how and why something occurred when it comes to accidents. They have a multi-stakeholder approach that brings in multiple different people to examine the causes of an incident. We're talking, of course, about the NTSB and the FAA, the National Transportation Safety Board and the Federal Aviation Administration. And in aviation the knowledge exchange between pilots is perpetual, ongoing, expected, regulated, and the expectation for those of us who are in the aviation community is that we will perpetually be telling other people what we did wrong. There's a culture of discussing and revealing our own mistakes and talking about how other people can avoid them, and there's no penalty for doing so. Everyone there will tell you what they've done wrong as a pilot to deeply convince you to not make the same mistakes. That's just not true in information security. We hide all of our mistakes as fast as we can, bury them under piles of liability and that dreaded email subject line: attorney-client confidential. That's where this culture of secrecy comes from, and that's why we have this problem.

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science, enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

Cindy: I think that is really a great place to pivot a little bit to how we fix it, because, you know, one of the reasons why the lawyers are so worried is because the liability risk is, well, sometimes massively overblown, but sometimes real. And the same thing is true on the other side, which is that the law is set up so that there's real risk to a security researcher just for telling somebody the truth. And then on the other side, the liability threat doesn't line up the company's incentives with what's best for society. That to me does not mean that we protect the company against any liability; when a plane falls out of the sky, we still expect the company to be held accountable for the damage that they do. But the liability piece doesn't get in the way of learning from the mistakes. And I think in the rest of the software world, we're seeing the liability piece get in the way. Some of that is changes we can make to law and policy, but some of that is, I think, changes we need to make to how these problems are lawyered.

Tarah: On the lawyering: a recent ransomware attack on a healthcare facility, I think it was in the South, just resulted in a lawsuit from a patient whose daughter died after not receiving sufficient care while the hospital was experiencing a cyber attack. She's filed suit now, and the concern from people who are watching this process occur is that people suing hospitals for not revealing that they were under cyber attack, or for not providing appropriate care during a ransomware attack, is not likely to create a situation where hospitals are more open about the fact that they're experiencing a network outage. It's likely to result in hospitals turning patients away during the middle of ransomware attacks.

Now, that's the exact wrong lesson to learn. So when I look at the way that we're thinking about liability in critical infrastructure, the incentives are totally wrong for being publicly open and honest about the fact that a hospital is experiencing a cyber attack. The hospital chose not to pay the ransom, and it's important to note that they may never have gotten the records back or their network back up anyway, or that it may not have happened in time to save this woman's daughter. But at the same time, we can't teach hospitals that the lesson they need to learn from experiencing cyber attacks and lawsuits is that they need to shut up more and pay more ransoms. We can't teach institutions that that is the right way to respond to this fear of liability.

Cindy: One of the ways that we've helped fix this in California is the data breach notification law. In California, if you have a data breach, and this impacts a lot of people around the world because so many companies are based in California, the company's liability risk goes down, not to zero, because I think it's tremendously important that people can still sue if their kids die because of a data breach. You can't remove all accountability, but the accountability shifts if you tell people in a timely manner about the data breach. So there are things we can do, and, you know, a national data breach notification law is one of the things that we could consider. We can set these incentives such that we shift the companies' risk evaluation towards talking about this stuff and away from not talking about it.

The California law is a good example, but, you know, many of the federal cybersecurity laws are about, well, you've got to tell the government, but you don't tell anybody else.

Tarah: There is a dearth of qualified senior cybersecurity professionals out there, and that is the legacy of the 1986 Computer Fraud and Abuse Act. The government did it to themselves on this one, because the people who care enough to try to follow the law, but also have the curious minds to start doing information security research, to start doing offensive security research and trying to help people understand what their vulnerabilities are, are terrified of the CFAA. The CFAA stops independent security research to a level that I think most people still don't really understand. So as a result, we end up with a culture of fear among people who are trying to be ethical and law abiding, and a situation where the people who aren't just have to evade federal law enforcement in the United States long enough to make a quick profit and then get out.

Danny: And this isn't just in the United States, right? I mean, one of the things that's been most disappointing, I think about the CFAA has been that it's been exported around the world and that you have exactly that same challenge for people being turned into criminals instead of good citizens, wherever they live.

Tarah: And we're looking at a law that should have had a sunset provision in it to begin with, a law that was created and put into place and supported by a judge who thought that you could whistle into a payphone to launch nuclear weapons. Look, people, computers are not magic sky fairy boxes full of pixie dust that can somehow change the world. I mean, unless you're mining for blockchain, 'cause we all know that that's the magical stuff. The same situation applies here: computers are not magic. There are some flaws in our legal system in the United States, and the CFAA is often sprinkled over the top to get indictments. We already have laws that describe what fraud is and what theft is, and saying that it's happening over a computer doesn't make it not fraud, doesn't make it worse than fraud. It's just a different medium of doing it.

We all know what right and wrong is, right? And so adding a law that says, if you use a magic pixie box to commit a crime, then it's somehow worse, doesn't make any sense to those of us who work with computers on an everyday basis.

Cindy: I think that one of the lessons of the CFAA is maybe we shouldn't write a law based upon a Matthew Broderick movie of the eighties, right? The CFAA was apparently passed after President Reagan saw WarGames, which is a very fun movie, but not actually realistic. So please go on.

Tarah: So the nature of the CFAA, going back to the real story here, is that it's being used by people who don't understand it to prosecute people who never intended to break laws, or who, if they did, we already have a law to cover that question. So the CFAA now is being used, mostly from what we're able to see in industry, to stop exiting employees of large corporations from setting up competing businesses.

That's the actual use, quietly behind the scenes, of the CFAA. The other very public use is to go after people who have no business being prosecuted with that law. They might be bad people, like the police officer who collaborated with criminals to harass women and used his departmental computer to look up information in the recent United States v. Van Buren case. The problem we have here is that this police officer was charged under the CFAA. Now, he wasn't a good guy, but we already have names for the laws he broke. It's abuse of the public trust; there's fraud, there's theft.

And the problem we're having here is that the crime he committed is exactly the same whether he looked up the information on these women on his laptop or by going back and looking through a file cabinet that was built entirely out of paper and wood back at the department.

So we're inventing a law to prosecute information security researchers and employees who are leaving companies, or unfairly to prosecute bad people who committed crimes we already have names for. We don't need the CFAA. We already know when people have done the right thing and the wrong thing, and we already have laws for those things, but it's very easy for a judge to be convinced that something scary is going to happen on a computer, because they don't understand how computers work.

Cindy: Let's switch a little bit into, you know, what it looks like if we get this right. We've already talked about one piece, which is that the Computer Fraud and Abuse Act isn't being used to scare people out of doing good deeds anymore, and that this idea that we have a global cooperative network of people who are all pointed towards making our networks secure is something that we embrace instead of something that we disincentivize. And we need to embrace that both on the individual level, with things like the CFAA, and on the corporate level, aligning the corporate incentives, and then on the government level, where we encourage governments not to stockpile vulnerabilities and not to be big buyers on this private market, but instead to give that information over to the companies so the companies can fix it and make things better.

What else do you think it looks like if we get it right, Tarah?

Tarah: If we get it right, security researchers who are reporting vulnerabilities to companies would be appropriately compensated. That doesn't mean that a security researcher who reports a vulnerability that hadn't been found but is a small one should be getting a Porsche every single time.

It does mean that researchers who try to help should at the very least experience some gratitude. When you find a vulnerability and report it to a company, and it's a company killer, fully critical, could take the entire system, the entire company down, you should be receiving appropriate compensation.

Cindy: We need to align the incentives for doing the right thing with the incentives for doing the wrong thing is what I'm hearing you say.

Tarah: That is correct. We need to align those incentives.

Cindy: And that's just chilling, right? Because what if those security researchers instead sold that vulnerability to somebody who wants to undermine our elections? I can't imagine something where it's more important that it be secure and right and protected than our, you know, our basic right to vote and live in a democracy that works. When you set up a situation in which you're dealing with something that is that important to a functioning society, we shouldn't have to depend on the goodwill of the security researchers to tell, you know, the good guys about this and not tell the bad guys. We need to set the incentives up so they always push in the direction of the good guys, and things like, you know, monitors for our health and the protection of our vote are where these incentives should be the strongest.

Tarah: Absolutely. That same mindset that lets you find vulnerabilities after the fact lets you see where they're being created. Unfortunately, first, there are not enough companies that do appropriate product security reviews early in the development process. Why? Because it's expensive. And two, there are not enough people who are good, qualified product security reviewers and developers. There just aren't enough of them, partially because companies don't welcome that a great deal of the time. Right at this moment, the cybersecurity field is exploding with people who want to be in cybersecurity. And yet at the most senior levels, there is simply no doubt that there is a massive lack of diversity in this field. It is very difficult for women, people of color, and queer people to see themselves at the top of companies when they don't see themselves at the top of companies, right? They don't see themselves succeeding in this field, even though I am here to tell you, comparatively, the wages are really good in cybersecurity.

So please, all of my friends out there, get into a training class, start taking computers apart, because this is a great field to be in if you like puzzles and you want to get paid well. But there's a massive lack of diversity, and to open those doors fully to the number of people that we need in this field, we have got to, got to, got to start thinking differently about what we think a cybersecurity expert looks like.

Cindy: There seems to be a lack of imagination about what a hacker looks like or should be like. Tarah, you don't look like that image. We really need to create better and a wider range of models of what somebody who cares about computer security looks like and sounds like.

Tarah: We do. What you're actually looking for is a sense of safety, a feeling of security in yourself that you hired a smart, educated person to tell you everything's going to be okay. That's this human frailty that gets introduced into cybersecurity again and again. It's hard to reassure somebody that you're an expert if you don't look like what they think an expert looks like, and that is the barrier for women and people of color in cybersecurity right now: they have to be trusted as experts, and it's just a problem to break through that barrier.

Cindy: This is a community project. To me, a piece of it is recognizing that we are on other people's computers all day long, and sometimes other people are on our computers. When somebody comes along and says, "Hey, I think you've got a security problem here," the right thing to do is to thank them. Not to attack them, much less throw the criminal law at them.

Tarah: If you are a person inside a company, I want you to send an email to security@yourcompany.com, whatever that is. And I want you to find out what happens. Does it bounce back? Does it go to some guy that doesn't work for the company anymore? Does it, as I have previously discovered, go to the chief of staff of the CEO? So, like, just take a look at where that goes, because that's how people are trying to talk to you and make your world better. And if nobody's checked that mailbox in a while, maybe open it up and, you know, stare at it with one eye closed behind one of those, like, eclipse viewers. 'Cause it's going to explode.
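
For companies that want to act on that advice, here is a minimal sketch of the kind of test note an internal team could send itself, assuming Python 3, an SMTP server reachable on localhost, and placeholder addresses at yourcompany.example:

import smtplib
from email.message import EmailMessage

# Placeholders: swap in your own mail server and addresses before running.
SMTP_HOST = "localhost"
FROM_ADDR = "it-check@yourcompany.example"
TO_ADDR = "security@yourcompany.example"

msg = EmailMessage()
msg["Subject"] = "Routine check: does anyone read this mailbox?"
msg["From"] = FROM_ADDR
msg["To"] = TO_ADDR
msg.set_content(
    "Testing that third-party security reports sent to this alias "
    "reach a human who can triage them."
)

# Send the note, then watch for bounces and ask who (if anyone) actually saw it.
with smtplib.SMTP(SMTP_HOST) as smtp:
    smtp.send_message(msg)

The interesting result is not the send itself but what happens afterwards: a bounce, a reply from a long-gone alias owner, or silence each tells you something about whether a real third-party report would reach the right people.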

Cindy: I could totally do this all day, Tarah. It's so fascinating to talk to you and to see your perspective, because you really have been in multiple different places in this conversation. And I think with people like you we could get to a place where we fixed it if we just had more people listening to Tarah. So thank you so much for coming and talking to us and giving us the kind of straight story about how this is playing out on the ground.

Tarah: It's incredibly kind of you to invite me. Cindy and Danny, I just, I want to hang out with you and, you know, drink inappropriate morning wine with you and yell about how everything's broken on the internet. I mean, it's a wonderful pastime, and at the same time it's a wonderful opportunity to make the world a little bit better, just recognizing that we are connected to each other, that the fixing of one thing in one place doesn't just impact that one thing, it impacts everybody. And it's wonderful to be with you and get a chance to make things a little better.

Cindy: Well, that was just terrific. You know, Tarah's enthusiasm and her love for computer security and security research just spill out. It's infectious. And it really made me think that we can do a lot here to make things better. What really struck me is that, you know, I have been an enemy of the Computer Fraud and Abuse Act for a very long time, but she really grounded it in how it terrifies and chills security research, and ultimately, you know, hurts our country and the world. But what she said was very specific: it's created a culture of fear among people who are trying to be ethical and law abiding. That really ought to stop us cold. And, you know, the good news is that we got a little bit of relief out of the Supreme Court case Van Buren that we talked about, but there's just so much more to go.

Danny: I think she really managed to convey the stakes here and the human impact of these sort of vulnerabilities. It's not just about your credit rating going down because personal data was leaked. It's about how a child in a hospital could die if people don't address security vulnerabilities. 

Cindy: The other thing that I really liked was Tarah focusing on aligning financial incentives, kind of on both sides: the penalties for the companies who don't fix or talk about security vulnerabilities, and compensating the security researchers who are doing us all a favor by finding them. You know, what I like about that is you talk a lot about the four levers of change that Larry Lessig first identified: law, code, norms, and markets. And this one is very focused on markets, and how we can align financial incentives to make things better.

Danny: Yeah. And I think people get very nihilistic about solving the computer security problem, so I think Tarah citing an actual, real, pragmatic inspiration for how you might go about improving it was really positive. And that was the airline industry, where you have a community that comes together across businesses, across countries, and works internationally in this very transparent and methodical way to defend against problems that have a very similar model, right? The tiniest error can have huge consequences and people's lives are on the line, so everybody has to work together. I liked the fact that there's something in the real world that we can base our utopian vision on.

Cindy: The other thing I really appreciated is how Tarah makes it so clear that we're just in a networked world now.  We spend a lot of time connected with each other on other people's computers and the way that we fix it is recognizing that and aligning everything towards a networked world. Embracing the fact that we are all connected is the way forward.

Thank you to Tarah Wheeler for joining us and giving us so much insight into her world.

Danny: If you like what you hear, follow us on your favorite podcast player. We’ve got lots more episodes in store with smart people who will tell you how we can fix the internet.

Music for the show is by Nat Keefe and Reed Mathis of BeatMower.

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology. I’m Danny O’Brien.

Cindy: And I’m Cindy Cohn. Thank you so much for joining us today. Until next time.