ROUGH EDITED COPY ELECTRONIC FRONTIER FOUNDATION (EFF) May 6, 2021 3:00 p.m. ET CART SERVICES PROVIDED BY: www.whitecoatcaptioning.com * * * * * This is being provided in a rough-draft format. Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings. * * * * *

>> CINDY COHN: Oh, hello. Hi, everybody. I am Cindy Cohn, and I am the Executive Director of the Electronic Frontier Foundation, and I can hardly contain my excitement at our event today. It's going to be so much fun, because it's combining some amazing people with a topic that I am incredibly passionate about, and that is the effect of surveillance and privacy on all of us. So welcome. This is a part of our 30th anniversary year, and what we have been doing this year is having a series of conversations that we're calling EFF30 Fireside Chats, commemorating the 30 years that EFF has been fighting for you and fighting for digital freedom. So remember that we are saving the final 15 minutes of this program for you. Drop your questions into the chat at twitch.tv/efflive, and we will pick them up and respond to as many as we can. Please also note that EFF has a code of conduct, and we want everyone to be kind to each other today, so keep that in mind. This series of live discussions looks back at some of the biggest battles in Internet history, their effects on the modern web, and what they can teach us about where we go from here. Today, we're taking a closer look at surveillance and the way that privacy, and its absence, affect each of us and our communities, and I want to talk about what it will look like if we get it right. So joining me today are three very special guests. First, Policy Analyst Matthew Guariglia. I always screw that up. I'm sorry, Matthew. Matthew works on EFF's activism team and focuses on surveillance and privacy issues at the local, state, and federal levels.
The folks at Ring can attest to his effectiveness. He's a historian of policing, and his bylines have appeared in MSNBC News, the Washington Post, Slate, Motherboard, and EFF's own blog. Thank you, Matthew. And next up is our Director of Engineering for Certbot, Alexis Hancock. Alexis is a leader of the HTTPS Everywhere web encryption project, an expert on digital freedom and on consumer technology. Her latest research focuses on online rights and our digital identities in the age of COVID‑19. Welcome, Alexis.

>> ALEXIS HANCOCK: Thank you for having me.

>> CINDY COHN: And finally, I'm pleased to introduce someone you may have heard of, maybe a little bit. His name is Edward Snowden. He is a former intelligence officer who in 2013 blew the whistle on the NSA's massive unconstitutional surveillance apparatus, which gave us standing, as well as other global spying programs affecting millions of ordinary people around the world. He is a best‑selling author, a technologist and cybersecurity expert, and the president of the board of our dear friends, the Freedom of the Press Foundation. And of course, a renowned model of hoodies and EFF stickers. Thank you so much for joining us, Ed, and for braving all time zones to be with us here today.

>> EDWARD SNOWDEN: This is really a pleasure, and the first time I can see comments. So everyone commenting on live chat, hello.

>> CINDY COHN: Today we will talk about surveillance, and we want to bring together a few strands of issues that EFF has been working on for a long time. These strands are often separated in the way that people think about them: we think about surveillance as a national security issue; we think about the things that we at EFF call street-level surveillance, like the cameras all around us these days; and we think about the technologies that help protect us against surveillance, and counteract mass surveillance especially.
So around the world, these involve separate technologies, separate bodies of law, separate law enforcement agencies and administering agencies. But to me, at bottom, they're all asking the same question of us: is it going to be possible to have a private conversation in the digital age? If so, what are we going to gain from that? And how do we get there? If not, what are we going to lose if we lose the ability to have a private conversation in the digital age? And it's very much in play right now. I also want to say that today we will focus on governmental surveillance, and not as much on corporate surveillance, but if anybody tells you that you have to choose between the two of these, or that one is more important than the other, they're kind of being phony with you, truthfully. And Matthew's work really highlights this. The surveillance done by companies is both available to and often spurred by governmental pressure and incentives. There's what I think of as an unholy alliance between the corporate trackers and the government trackers at this point. The companies build technologies that have surveillance built in, because they want to know everything you're up to, and they often sell that information into a somewhat sleazy ad market, or just use it themselves to inform the technologies they build, and I'm looking at you, Internet companies. And also because they want to track us and target ads to us, and that has become the holy grail of the Internet. And then the governments love it too. It means they can just go to a company and find out what you have been up to, and they don't have to bother with pesky things like making sure you know that they're tracking you. And they can use mass techniques like tapping into the Internet backbone, or sending a request to Google or Facebook to find everything they have. These things work together.
And anybody who says we should focus on one or the other, or that we have to pick between the two of them, is not dealing in an evidence‑based world, and I believe it's important that we do that. So, this is why EFF has long pushed not only against governmental surveillance but for baseline privacy regulation around the world. And it's why we think it's long past time for companies to start thinking about you as someone they owe an obligation to. Lawyers call this a fiduciary duty: this idea that when somebody holds your data, they need to have the same responsibility for it that your accountant does when they hold your information, or your doctor does, or your lawyer does; that they have to be primarily responsible to you for it, and can't be two‑faced about it and tell you one thing and actually do something else. So with that baseline of what we're going to talk about today: we're going to talk until about 12:45 and then switch to questions. So if you have questions, go ahead and put them in the Twitch stream, and our team is picking them up and putting them in a place where we can get to them. So let's get right to it. Oh, my goodness. I need to get to the right page. So let's start, Ed, with you. In 2013, you did a little thing, and it woke up the world about government surveillance. I'm assuming if you're tuning into an EFF livestream, you probably know a lot about what Ed did and how it helped us. But what has changed since then? What are the changes that you see, from where you sit, in governmental surveillance of the kind that you showed us all?

>> EDWARD SNOWDEN: Let me dial back just an inch from that, because there are a lot of people who are familiar with what I did, but it's been, what, 8 years now, since 2013. Some of the people who are on the stream were probably chewing on crayons eight years ago.
And for those who don't remember generally, there was a time when people didn't understand that everything you did on the Internet was watched to the degree that we know is true today. There were many people who were spying on you for different things, but if it was a company that was doing it, there was only so much they could know, and it was very limited. It was not clear that there was a very intimate embrace between corporations and government, and not just one government, not just the United States government, which is the one that was exposed, but also other governments around the world. In the United States there was what is called the Five Eyes intelligence alliance, which was the English-speaking countries: the United States, the United Kingdom, Canada, Australia, and New Zealand. This was a very close partnership, to the point where they all had their sort of spy sensors on the Internet that were watching communications as they raced past. So just think for a second: how is this stream getting to your laptop? Or your phone? Right? From these people who are on different sides of the planet. How, functionally, does that work? The Internet is not made of magic. A little app that you tap on your phone is not made of magic. And you can say it's through Wi‑Fi, or I'm connected to AT&T or Verizon, sure, but how does that work? The reality is, all of our communications today are intermediated by other people's computers and other people's infrastructure: cables buried in the ground, cell phone towers on a building down the street, or satellites, you know, that cross the sky. But all of these intermediaries are owned by someone. It can be the Starbucks that you're sitting in, back when the world was open. And the idea is, all of these different lines that you cross: the wireless access point has to be connected to a cable somewhere, the coax comes out and goes to the host, Comcast or whoever your provider is.
And Comcast has an agreement with someone else. They have a big cable somewhere that goes to another service provider, which goes to a data center, that goes across the ocean to another one and another one and another one, until it eventually lands someplace in Russia where I am, or San Francisco where Cindy Cohn is, or New York, or wherever you might be. And in this giant meshed-together network of many different providers all around the world, you didn't know what was happening. Once that communication hit the cable that goes out of your wall, or hit the radio waves as it exited your phone and reached the cell tower, all of these lines that you're riding across, the people behind them were taking notes. And communications then were largely unencrypted, meaning they were transiting this network electronically naked. There was a time when everything you typed in the Google search box not only was visible to Google, which by the way is still true today, but was also visible to everybody else who was on that Starbucks network with you, right? It was visible to your Internet service provider, who knew this person who paid for this account searched for this thing on Google. Facebook and all of these other guys, anybody who was in between your communications, could take notes on these things. And then, what is still true today, even as our communications become encrypted, meaning armored as they transit this hostile path, it's like wearing clothing or, you know, being in a car with blacked-out windows. They can still see your communication running across the network, but they can't see what is inside. This is not true of all communications, but it's an increasing fraction, with the exception of things like email, which people still don't send encrypted. So the bottom line is, our innocence was being exploited. We didn't realize how hostile the network was.
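The "electronically naked" versus "armored" distinction above can be sketched in a few lines of Python. This is a teaching toy, not the real TLS machinery: the XOR cipher stands in for proper encryption and is not secure, and the request bytes are invented for illustration.

```python
# Toy sketch of plaintext vs. encrypted traffic on the wire.
# The XOR "cipher" is a stand-in for TLS and is NOT secure cryptography.
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with a repeating key: a toy symmetric cipher."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# An unencrypted HTTP request: every hop on the path can read it verbatim.
request = b"GET /search?q=my+private+query HTTP/1.1\r\nHost: example.com\r\n\r\n"
assert b"my+private+query" in request   # the eavesdropper sees the query itself

# "Encrypted" traffic: the same bytes cross the wire, but they are opaque.
key = secrets.token_bytes(16)
ciphertext = xor_cipher(request, key)
assert ciphertext != request            # content is hidden from observers...
print(len(ciphertext) == len(request))  # ...but the size (metadata) still leaks

# Only the holder of the key can recover the plaintext.
assert xor_cipher(ciphertext, key) == request
```

Note the last point mirrors what is said above: even with real encryption, an observer still sees that a message of a certain size crossed the network at a certain time; only the content is hidden.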
I came out in 2013 and showed that not only was upstream collection of communications, the things that I described, happening, but there was also what is called the PRISM program. This is where the government, literally the NSA working with the FBI, would go to Google, to Apple, and to Facebook and say, we want to see what is in this person's account. And if you weren't American, and we have a lot of persons on the call, but 95 percent of the world's population is not American, they didn't even need a warrant for this. They could ask for it based on a reasonable articulable suspicion, which I was taught at NSA is really just a gut feeling that you can write down, and they could get everything in your account. For an American they could still get the information; there's just a slightly higher legal bar. What this means is your communications, as they crossed the wire, could be taken by anyone who was on the path. Anyone who was holding information, especially the big corporate sites, the massive Internet companies, with every "like" you put on Instagram, every link you clicked on Facebook, so on and so forth: they had these giant dossiers, which were becoming perfect records of our private lives, and they were sharing them. So Cindy asked, way back before the history lesson: since 2013, since we learned, as many of us know, those of us paying attention, that this was happening, what has changed? See, the funny thing about how this works in open societies, at least in theory, is we can't collectively decide on a response to a problem until we agree on what the problem is, until we know what is happening. And the most important thing that has happened since 2013 is that people now know we are being exploited, right? An awareness of what is happening has risen. What this means is that we can start to respond to it, bit by bit. It takes time.
Some of these require technical solutions, some of these require legal solutions, which means they require political solutions, and this is why we have groups like the EFF. If you're not a lawyer, if you're not a politician, if you're not an activist, if you're not a lobbyist or whatever, you are not going to be able to influence the policy and review process. But maybe you can help EFF or the ACLU or some other group, and they can do this on your behalf and move things forward. And this has really changed. EFF, the ACLU, many different organizations around the country have been suing the government and forcing the courts to review these programs, and with the investigations over the last few years, the courts are raising an eyebrow at what the government is doing and saying, well, they're violating the Constitution. Well, courts don't like to jump to the Constitution thing. They say, you have violated the law, and likely violated the Constitution. Soon enough they will say, you have violated the Constitution. But courts always act 10 or 20 years late. What happens in that intermediate period? Well, what people like you do. They install a different app. Maybe they stop using Facebook. Maybe they change their behavior. Maybe you don't take your phone with you if you go to a certain place where you don't want to be tracked. Maybe you start to use encrypted messengers instead of something less secure like SMS. And this is the idea. When we think about this problem of surveillance on the Internet, there are many different problems. There's communications in transit, what we described as the hostile path, where all of our stuff was naked. Once we encrypt, it becomes more difficult to see into. Mass surveillance becomes more difficult. So everybody starts trying to spy through easier and cheaper ways. What is that? It's going to the companies.
But then what if you make sure the companies have less information, by using sort of peer-to-peer communications where the company doesn't hold information that it can see? This is zero-knowledge encryption. Now they can't get information to the same degree as it crosses the Internet, and they can't get the same degree of information from the companies. You have protected yourself.

>> CINDY COHN: That's great.

>> EDWARD SNOWDEN: But what about everybody who is not a specialist? There are still so many problems, and we're still fighting to fix them, but we're not there yet. There's so much work to be done.

>> CINDY COHN: I really appreciate you calling out and explaining this kind of pathway that things travel on. That's an area where we really have made great strides, and Alexis, I would love for you to talk about Certbot and our role in encrypting the web, because it's one of the bigger changes, and I think it doesn't get the attention that it deserves, because policy people like to talk about policies. But let's talk about the tech.

>> ALEXIS HANCOCK: Yesterday I checked on the benchmark that said most web traffic is encrypted, above 95 percent, which is great news.

>> EDWARD SNOWDEN: That's a huge change.

>> ALEXIS HANCOCK: Yes. It's leaps and bounds from 2013. And 2015 was a turning point, especially since major browsers like Chrome started to implement messaging saying "this site is not secure." And sites started to scramble and say, how do we do this? This is where Certbot steps in. It's free, open-source software at EFF that we work on to automatically supply SSL certificates, or secure certificates for data in transit, automating it for websites and web servers everywhere. And millions of certificates have been issued by now.

>> EDWARD SNOWDEN: Let me jump in for a second here, for those people in the chat who are not familiar with what a certificate is and why the EFF is issuing certificates. It's not like a paper certificate.
All of the encryption that we're talking about for web traffic, this big change that she mentioned, the fact that we're up at 95 percent now, is because of the issuance of these kinds of security certificates, as they're called, which are really just a way to make the idea of public-key encryption more legible. You can look it up on Wikipedia or on YouTube, but this is the way all of your communications are protected, and this is what Alexis and the EFF are doing. Your communications are more protected because of this issuance of security certificates.

>> ALEXIS HANCOCK: Yes. Security certificates help keep data private in transit. So in your browser's URL bar, you will see HTTP; that was the norm for a while. And with HTTPS, you can look at the S as the secure part. Once you see that happening in your browser, you will know that your data is encrypted in transit, so there's a layer of protection over what travels between your request and that website's server. So that's more of an explanation there. But I'm really happy that we now have benchmarks, according to, you know, the monitors of Chrome and Firefox, saying that 95 percent of traffic is now encrypted. Firefox just last year put an HTTPS-only mode in their browser, other browsers also offer HTTPS by default as well, and browsers are starting to do things like keep the protocols up to date around TLS, the protocol that underlies HTTPS. There are other things happening too, of course, like mixed content blocking. That's more in the weeds: you will have an HTTPS-secured site, but there may still be unencrypted links on the page, so browsers are blocking those, things of that nature. So Certbot helps to automate those certificates for the website, which creates the little padlock that you will see on HTTPS sites in your browser, and using tools like that really helps move things along, where it makes HTTPS the default and makes HTTPS just work in the background for people.
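The padlock described above boils down to a couple of checks that every modern TLS client performs: the server must present a certificate, and that certificate must be signed by a trusted authority for the hostname you asked for. A minimal sketch using Python's standard `ssl` module (the actual connection to example.com is commented out because it needs network access):

```python
# Sketch: the certificate checks behind the browser padlock,
# using Python's standard-library ssl module.
import ssl

# A default context enforces the two checks the padlock represents.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # server must present a valid certificate
print(ctx.check_hostname)                    # cert must match the hostname we requested

# Connecting would look like this (commented out: requires network access):
# import socket
# with socket.create_connection(("example.com", 443)) as raw:
#     with ctx.wrap_socket(raw, server_hostname="example.com") as tls:
#         cert = tls.getpeercert()
#         print(cert["notAfter"])  # the expiry date that tools like Certbot renew
```

Both printed values are True with Python's defaults: certificate verification and hostname checking are exactly what the automated certificates discussed here make possible at scale.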
So I think it's one of those things that doesn't get talked about a lot, because it's becoming more of a normal process. There aren't as many arguments against HTTPS anymore. It used to be, there are going to be performance problems, and it's hard to deploy for users. But now we have all seen that is not actually the case, and going forward, there's a lot more accessibility to actually doing it, and Certbot helped to make that a thing.

>> CINDY COHN: HTTPS everywhere. Those are the pieces that you don't see, and shouldn't have to see unless you're setting up a website, but it gives people the same kind of security that, you know, nobody sells you a car without brakes; no one should sell you a browser without security. Let's switch gears just a little bit and talk about what we call street-level surveillance, the surveillance that people encounter in their daily lives. And I want to bring Matt into this, because this is really his area, and of course, Ed, chime in as we go. But what are the pieces of surveillance that, you know, we also may not see, but that are really tracking us every day, and why does it matter?

>> MATTHEW GUARIGLIA: Yeah, I think as we said earlier, the boundaries are blurred not just between private and public surveillance but between federal and local surveillance, with things like fusion centers, which are merging all of the information that can be collected by local police and sending it up the chain of command to the federal government. But if you are living in the United States today, you are likely walking past or carrying around with you street-level surveillance everywhere you go, and this is doubly true if you live in a concentrated urban setting or in an overpoliced community. You're likely to drive past automated license plate readers, which are going to photograph and record and timestamp your car every time you pass by a camera. Then there are private and public security cameras ‑‑ this is where it gets a little tricky.
Because even if it's an Internet-of-things doorbell camera that you put on your door, police can get a warrant, or request that footage from you or your neighbor. Even in San Francisco we have business improvement districts, as many cities do, which put up their own semi-private security cameras. And you think, so what, the store knows that I came in. But the police can often rely on this footage, requesting it or bringing warrants for it. And a lot of the manufacturers of street-level surveillance technology are counting on that as part of their marketing, because they have seen that police make effective marketers. If they build a special interface that police have access to, they're more able to go out to the community and sell it to stores or HOAs to put up, because the police then get the benefit of private surveillance that is accessible to them without going through the bureaucratic hurdles or having the city pay for it themselves. Then there's facial recognition, and ShotSpotter, which are the microphones in cities that can supposedly detect and record supposed gunshots, although now there's increasing speculation that they have a high rate of false positives for fireworks and cars backfiring. And the reason why all of this matters is because, on one hand, they trigger more police interactions. As we have seen in the news, police interactions often have violent outcomes. If ShotSpotter misfires and sends armed police to the site of what they think is a shooting, there's a higher chance of a violent outcome when they get there and kids may be setting off firecrackers but the police think they're shooting guns. And the same thing happens with facial recognition, which police claim is just an investigative lead, but they immediately go out and arrest that person before they follow up to see if they were even in the city at the time of the supposed robbery they're investigating.
The other way this affects a community: any time you put a community under the microscope, like some cities are under in terms of constant street-level surveillance on every block via geolocation or license plate readers, police are going to find a reason to arrest and harass people. Because the majority of what police do every day is not finding serial killers and terrorists. They're ticketing people for jaywalking, and they're arresting people for jumping turnstiles at the subway. So if you put any community under the microscope that these oversurveilled communities are under, you're going to find a reason to harass and arrest and ticket people. This creates a statistical problem, because surveillance is part of a self-fulfilling prophecy. The more you surveil people, the more you ticket them, and then the numbers make it look like this is an area that needs policing, and this impacts communities of color, vulnerable communities, and communities that are already overpoliced.

>> CINDY COHN: And of course, if you add machine learning onto that, you double down on the targeting. It's one of the things that I hope we have all learned at this point: biased data in, even more biased data out of machine learning systems. Ed, what do you think about this opening up of, I think, more and more awareness of more and more surveillance at the hyper-local level?

>> EDWARD SNOWDEN: Matthew talked about things like automatic license plate recognition, and we have all of these cameras everywhere that are now networked. Those are being increasingly analyzed, not necessarily at the local level, but things are sent back. When there's object recognition, it goes, this looks like a face; why don't we send that face back to the mothership. Then you have a timestamp for it: this person was at this location. Or this face, maybe we don't have an identity for it, but then we see the face pop up somewhere else, and somewhere else, and somewhere else.
And then three years later you have a perfect record of this person's movements, anywhere they interact with commerce or with a developed area. Maybe it doesn't have you at your home, but the minute you turn onto the interstate, your license plate gets read on the interstate, and you drive three states over, and you don't even have a phone with you, but your car turns up, and they know you went from this place to this place. Then you go to the local mall, you know, and you enter the parking garage. Guess what, another plate reader is there, and they know you are here and where you're at. The real question is, should we compile records that are so comprehensive? And we're not talking about the CIA here. We're talking about any company with enough money to pay a data broker to go look. We just want the fire hose, we want all of the stuff that you get, or at least the ability to query it, because, you know, we don't really care what it is; we correlate it with something else. We saw this license plate show up outside of our store at a strip mall, and we want to know how much money you have, we want to know what neighborhood you live in. We want to go, is there a public record that this license plate is associated with this house, for example? And they're starting to blur those lines now. Can we put this through an image search? Was this license plate photographed anywhere? And then you put the license plate picture from the back of your car into an image search, and sometimes you will get results. When you combine this with faces, and with the presence of your phone ‑‑ remember, your phone is starting a lot of identity fires, too. It sends off Bluetooth beacons, and it's telling Facebook what kind of wireless networks are around you, because you know how your phone always ‑‑ you pull down the shade menu and it asks which Wi‑Fi access point you want to connect to if you're not at home, even if you are at home?
Those are a proxy for location. All of these wireless access points are radiating identifiers, so they can be mapped, and are mapped. Apps on people's phones report this. If location services are enabled at the same time, the app sees both the access points and the GPS location, and then they have a GPS fix for all of these Wi‑Fi identifiers. Then someone else has their GPS turned off, but their phone can still see the Wi‑Fi, and they still know where that person is. And then you turn off all of the location sensors on your phone, but the cell network still knows where you're at. Like, how does the network know? You dial a phone number and that phone rings, wherever it is, instantly. How does the network know where you're at if you have location services turned off? You have Wi‑Fi and Bluetooth turned off and GPS turned off, but you're still connected to the network, right? Well, you're registering with the nearest cell tower, saying here I am, here I am. Your phone is constantly screaming out, here I am, is there a cell phone tower that can hear me? And the cell phone tower says, I hear you. You create a handshake, and whoever owns that cell phone tower can register you on the network. And then they share this upstream. The phone network globally knows where all phones are at all times. That's how they can signal each other wherever you are. And location services doesn't turn that off, because it can't turn that off.

>> CINDY COHN: And it shouldn't. This is where we started off talking about how, you know, there are technical issues, there are legal issues, there are policy issues. Technically, I want my phone to ring no matter where I am. You know, that's one of the great benefits of modern technology. Some people call it a curse, but that you can get a phone call no matter where you are? I want that. That's an area where there may be technical things we can do, but we need legal protections.
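The Wi-Fi-as-location-proxy mechanism described above can be sketched in a few lines. The access-point addresses and coordinates below are entirely invented for illustration; real systems hold crowdsourced databases of hundreds of millions of such entries, built from phones that reported both a GPS fix and the networks they could hear.

```python
# Sketch: locating a device with no GPS at all, purely from the Wi-Fi
# access points (BSSIDs) it can hear. All data below is made up.
bssid_db = {
    # BSSID (AP hardware address) -> (lat, lon) previously reported by other phones
    "aa:bb:cc:00:00:01": (37.7749, -122.4194),
    "aa:bb:cc:00:00:02": (37.7751, -122.4190),
}

def locate(visible_bssids):
    """Average the known fixes of whatever access points the device can hear."""
    fixes = [bssid_db[b] for b in visible_bssids if b in bssid_db]
    if not fixes:
        return None
    lat = sum(f[0] for f in fixes) / len(fixes)
    lon = sum(f[1] for f in fixes) / len(fixes)
    return (round(lat, 4), round(lon, 4))

# A device with GPS disabled, reporting only the networks around it,
# is still placed to within a city block (roughly 37.775, -122.4192).
print(locate(["aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02"]))
```

This is why "location services off" does not mean invisible: anything the phone radiates or observes that appears in someone's database becomes a position fix.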
And the good news is that the Supreme Court has started recognizing that: just because you're out in public doesn't mean that there isn't some agglomeration of information, or information about you over time, that triggers a warrant requirement. Now, those are some starter pieces, but when we think about the tools in our toolbox, the technical tools probably aren't going to get us to a place where the phone company doesn't know where your phone is, because there are ways in which we want that to be true. But the legal protections can make sure that the company is very limited in what it can do with that information, especially when the government comes knocking. So I want to shift a little in the time we have left. I wanted to talk a little bit about COVID, because we're in COVID times, and a lot of the work that Alexis has done at EFF is to balance the needs of this pandemic with the concerns about surveillance, and the recognition that, as Ed knows well, things that you do in moments of crisis don't just go away, right? You have to think carefully about what you do in a moment of crisis, especially around surveillance. We're in 2021 and still trying to stop things that were implemented after the 9/11 attacks that don't have much utility anymore, I think it's fair to say. So Alexis, let's talk a little bit about COVID and the surveillance concerns we have had, and what you and the EFF teams especially have done to try to mitigate those.

>> ALEXIS HANCOCK: So we have been tracking things since contact tracing came onto the scene. The baseline that we've been trying to establish is finding out where technology can and can't help, and when technology is being presented as a silver bullet for certain issues around the pandemic when it really should not be, when people are at the center of what is going to finally bring us out of this, right? People, and health, and access to health care.
So one of the things was contact tracing, and the other part that has become more recent is digital passports, or credentials, that are proposed as solutions for presenting whether you're vaccinated or not, in different contexts. The thing people worry about is scope creep. We do have situations, like international travel and educational institutions, where we present whether we're vaccinated or not in certain contexts, and those systems already exist. But what these proposals were introducing are contexts that don't currently exist: presenting whether you're vaccinated to go into a grocery store, or to access everyday services. That's when we started to step in and ask, okay, what exactly are we trying to solve here? Are we creating more barriers for people to access health care in a way that makes sense? And storing your medical data with private companies who want to sell private solutions to other private companies isn't necessarily a solution. So we wanted to be able to talk about that. And with contact tracing, we put parameters up, right, to have privacy in mind first and foremost, because of the fact that, you know, if you want contact tracing to be effective, it has to be widely adopted. In order to be widely adopted, it needs people to trust it. If people don't trust it, then it won't work. That's one of the things that we have watched happen and tracked closely. With vaccine passports, most solutions don't talk about what happens after all of this is, quote/unquote, over. What do we do with the data? What is the data retention policy? Will the contexts for digital credentials be expanded? We don't know. And that is the real problem that I have: the long-term consequences that can come out of this. You mentioned 9/11. We have had entire institutions created just from 9/11.
So when we treat everything as the bad guy in the room, and create permanent solutions to protect us from said bad guys, those solutions end up affecting the very people we said we were protecting and keeping secure and safe. We have DHS and ICE and all of these other institutions created out of that time of fear. And we don't want to create more institutions that are permanent and cause digital barriers and more detriment to everyday citizens out of fear. We should be creating digital solutions that help. One of my main things is creating technological solutions that are actually helpful. I'm personally tired of technology feeling so nefarious, like I make a phone call or send a text and all of these trackers come after me in all places. I'm personally tired of that. I want technology to work. I love technology. And there need to be people in place to actually think about it when they're creating new solutions and ask themselves: is this actually helping humanity? If not, you need to halt that process and not build it out. I'm hoping with the digital vaccine passport solutions that people actually think about whether they help before they deploy something else that will need to be worked out and fixed. And I don't think the pandemic is a time to roll out experimental technologies like that. >> EDWARD SNOWDEN: That's a great, great response. I just want to follow up on that because, you know, Alexis, I couldn't agree with you more. I think there's a question we stopped asking ourselves in the United States, which is really sad relative to, you know, when I was a young man. When I was in school and the teacher was being unfair, kids would say, reflexively, "It's a free country, isn't it?" It feels like people don't say that so much anymore.
Like Matthew, what you were talking about with local policing and all of these things, and Alexis, what you were talking about with more and more increasing layers of permission: more and more gates that we have to walk through in order to go to the store, in order to travel, in order to get a job. We have to prove ourselves and prove ourselves and establish ourselves. To the satisfaction of whom? Of a government? Of a corporation? Of some kind of institution? Of a person? Again and again and again. Whereas it used to be, when it was not possible to know as much, it was not required that we establish as much about ourselves. And this is something that you and I have been struggling with a lot. When you talk about privacy to other people, you know, it comes off as all very abstract and strange. But you know, we as Americans, even if it's wrong, even if it's a false identity, there are things that a lot of us believe. Growing up, we watched a show and it felt right, like we're the good guys and we're wonderful, and a lot of that frankly is wrong. But the idea can still be true. And what is liberty today? Is it being able to install whatever app you want, whether you want Facebook to screw you or Google to screw you? Liberty is freedom from permission. It's being able to act without having to ask. And what happened to that? We have begun to intermediate every connection that we have ‑‑ every connection to a relative, to someone that we love, to a friend, to the pizza store ‑‑ with a sheet of glass, right, that we stare into, that we pay for but that doesn't really belong to us, because it doesn't answer to us. Increasingly, we don't control it; it controls us. You can send this, or send a picture, or you can send whatever, but we have to go through it. Have we become more free in the last 20 years, or less free? If we're less free, why is that? And what can we do about it?
Cindy brought up something interesting in the beginning, this idea of fiduciary duty. It sounds like a crazy legal term, fiduciary, like banking or something like that. But think about it. If you go to the hospital, the doctor knows everything about you. They know the size of your feet, and they've got it in your medical charts, but they're not supposed to sell that to Nike so Nike can send you coupons for a certain sized shoe. But with the Internet and all of this data collection technology, everybody is using this very personal information they get to effectively build billboards outside of your house. If your doctor did that, or your accountant did that, we'd burn their office down. Why do we accept that it's okay here? Maybe Cindy is right. Maybe we need to establish a fiduciary duty for these guys in legislation, the same way we do for every other class that has privileged access to information about our lives. Maybe we say this data doesn't belong to them anymore. Maybe this data belongs to us, even if they hold it, because it's about us. And our devices force us to click these user consent agreements. You sign up for an account and it says click okay and continue. There's no choice like "no thanks," and some of these things you would love not to install, but you have to agree in order to get a job, in order to pay your bills, in order to buy a car. Teslas have these things on them. With more and more things, whether it's your car or your computer or your phone, you have to click "okay" to continue. And if you must do something, it's not consent. It's not a choice. We're being given the illusion of consent. And all of our laws, all of our politicians, are pretending it is equivalent to the real thing. That's why I'm glad to have this conversation with you guys today, so thank you. >> CINDY COHN: It's so great to have you on our team, Ed, and you really are. Let's go to the questions. I
promised everybody questions, but I do have a final question for everybody that I'm going to get to at the end, and I want you to think about it. I want a couple of points from each of you: what will the world look like if we get this right? What's it going to look like? >> EDWARD SNOWDEN: Let's do round robin. >> CINDY COHN: I want you to think about it, and I will ask at the end; I wanted to give you a chance to know it's coming. This is where I'm focused a lot right now. I feel like ‑‑ and some of this is of course my job ‑‑ my job is to, you know, help run an organization that tells you all the ways that you're being tracked, all of the ways that things are bad, and the fights we're in, and I feel like we need to take a moment to talk about what this is going to look like if we get it right. We're going to get it right. That's our job. We're going to get there, and we need to start envisioning that. Let's get to the questions, because we have a lot of them. We will not get through all of them; I apologize in advance. My crack team is pulling them out of the chat and organizing them. The first one, from the chat, is: Obviously facial recognition is a hot button issue, with cities taking up initiatives to ban its use by governmental entities, but facial recognition is something that necessarily piggybacks off of pervasive video surveillance, irrespective of the bias of algorithms. What emerging technologies or combinations of technology do you think pose the biggest risk of mainstreaming that at the city, state, national or international level? >> Matthew, I'm going to go to you first because I know you're deep in the facial recognition fight. >> MATTHEW GUARIGLIA: I think one of the startling things is just how casual and easy to use the technology has become.
I mean, you have police officers, or drones, or even robots to some extent, equipped with this technology, who can just whip a phone out at a protest, scan over the crowd, and get people's names and faces and everywhere their faces appear on the Internet. It has become casual and ubiquitous; you often don't need a warrant to use it. Oftentimes police enter agreements with the companies without permission from anybody. Oftentimes the companies will market it with free trials directly to the officers without going to administrators. So I think the big fear is that, you know, we're going to reach a moment where a drone can zip over a crowd at a protest and every single person in that crowd can be identified, their social media accounts can be linked to the photograph, and that opens people up to reprisal or retribution from the government for what they're protesting, especially if what you're protesting is the police force surveilling you. So it's really about how ubiquitous it is, how easy it is to acquire and use, and how few regulations there are on how to use it, which is why we advocate for banning facial recognition and why so many cities have taken steps to do it. >> CINDY COHN: This is a good time to mention the Electronic Frontier Alliance, a network of groups across the country that are really taking this local. If any of you are members of EFA groups, thank you. If you're not and you want to join one or start one in your area, please do, because a lot of this work, EFF is happy to support, but it's the local people showing up at the Board of Supervisors or the state legislature that make the difference and demonstrate to lawmakers that this is something that matters to the community. The next one is directed to Ed. It's from R. Zubare: Thank you for joining us. Constitutional protections are extremely limited and somewhat suspended at the border, airports, et cetera ‑‑ the Fourth Amendment and First Amendment respectively.
That being the case, how can we force the government to protect our digital privacy at the border, and how do we pressure the legislatures into passing the necessary pending bills that will in fact protect us? >> EDWARD SNOWDEN: It's a great question. The question is really how do we pressure legislators to do anything? The sad reality, when you look at the way our system functions today ‑‑ how it's really functioning, and this is not just me saying this, this is political scientists studying the system ‑‑ is that if you look at all of the different ways you can characterize the U.S. political system as currently working, it's either dominated by economic elites, which means the CEOs of companies and whatnot are the ones that are able to effectively get their will represented in Congress and laws passed, or by sort of a plurality of interest groups. These could be corporate or nongovernmental, but it has to be a very sophisticated sort of institutional effort in order to get Congress to do anything. And if you look at the history of U.S. legislation in the last decade, you see it doesn't really represent the public interest. This is very clear. You simply look at public polling, and then you look at the legislative history of the issue, and you see they don't align. How do we fix that? This is a Lawrence Lessig type of question. You need structural changes to the way our politics work, with changes in campaign funding, to ensure it's not just very rich people whose voices count. But ‑‑ and this is another question I saw come up in the chat, from someone who asked whether I see open source software or nonproprietary software as a primary response to this ‑‑ the answer, of course, is yes. I might say free software, but the idea here is that there are avenues the system provides for us to sort of appeal for the redress of grievances. This is sort of the proper channels argument in whistleblowing. It says go to your boss or to your manager and they will fix everything.
What if they won't? What if they can't? What I believe we have here is the responsibility to create new systemic avenues for providing for the common good when the existing ones fail us, and creating, contributing to, and using free software is a way to pursue this. It's not enough, and this is why organizations like the EFF are so important. Even if you're not part of an interest group, the EFF is the kind of organization that can at least try to provide us some kind of awareness, so we're not completely ignored. But remember, that is not enough. There's what you can do right here as a developer: maybe you can use these tools. Maybe you're interested in technology and know about it. Maybe you're not. But there's probably something else that you could do ‑‑ something that I don't know you can do, right, but something that you know you can do. If you care enough, if you look into these things, if you investigate, if you have a curious mind, you will find ways to make a difference. We all can, individually and also collectively. >> CINDY COHN: Thank you. That's such a great call to action. And of course at EFF, our tech projects are open source and we're always looking for volunteers. Our EFA groups and other groups are doing a lot of this work. When it comes specifically to the border, EFF and the ACLU have a piece of litigation where we won tremendously in the District Court and lost horribly in the Circuit Court, and now we're asking the U.S. Supreme Court to take it on and vindicate your First and Fourth Amendment rights at the border. So we're going to do the activism, we're going to do the technology work, and we're going to do the legal work, because, you know, when people ask me which of these we should do, the answer is of course always all of the above. As Ed points out, you need to look into your own heart to figure out where you best fit in trying to help make the world a better place.
But everybody has a thing that they can do to help be part of this movement. >> EDWARD SNOWDEN: I just want to underline that Cindy had a much better actual answer to the question there. Of course we're taking it to the Supreme Court. >> CINDY COHN: It's a team effort, my friend. One other question, and this is one that I think will be great for folks to hear. This is from J. Paul CVSP, on surveillance culture: given the ever increasing granular surveillance at scale, isn't living in a world without anonymity ultimately inevitable? If so, do you think it will be possible to balance power between governments and the people? And how would a potentially threatened democracy begin planning for this eventuality? More oversight and regulation? So is it "privacy is dead, get over it"? Is that the place where we need to be? I'm going to start with Matthew on this, because I know you think about it a lot, and then we will go around. >> MATTHEW GUARIGLIA: First of all, I don't think it's inevitable. I think there is a looming backlash that we see at the street level and at the federal level, of people who have had quite enough of how exhausting it is that tech is betraying you, that you are knowable all the time, and that these things have consequences, especially for criminalized populations. So I think there is a coming backlash, but I also think that there is an alternate future in which being knowable is not as scary as it is now, if there is actually trust and a feeling of benevolence between citizens and government. You know, centuries of criminalization and racialization have created a system where you're afraid to be known by your government because it's a punitive institution. So I think that future is not inevitable, but if it is, there's a way to imagine that future being less scary. >> CINDY COHN: That's great, because you segued into my last question, which is how do we imagine a better future. Alexis, do you have anything to add?
>> ALEXIS HANCOCK: Sorry, I had to unmute myself. So I always tell people during my security trainings that no technology makes you a ghost online. None of it, even the most secure tools out there. And I don't think it comes down to your own personal burden. You're not alone when it comes to that feeling, and I think people can take comfort in the fact that there's a more collective unit now noticing that burden, and it's not yours to bear. So there's that. As far as the future of our privacy goes, it will take a full fight with activism, technology and legal work. It's going to have all three of those parts. What I tell people is just know there are people fighting out there that do want to make a better future for you. And whatever technology you're using, if you feel like your ISP would rat you out, or your phone would rat you out, or you feel like AT&T or whatever telecom you're using will rat you out, then start doing that power analysis in your life, and I can guarantee you somebody is standing there fighting. I guarantee that once you start looking for who is fighting, you will find them. It's a really corny thing to lean on, but I go by a quote: if you look for the darkness, that's all you will ever see, but if you look for the light, you will often find it. That is from an animated show, The Last Airbender ‑‑ very corny, but something I live by when I do my privacy work. So I do tell people not to just lay down arms and stop fighting. Somebody is still there, and there's something you can do to help them. >> CINDY COHN: That is so great. And we have 3 minutes left. Ed, you get the last word on this. >> EDWARD SNOWDEN: I'm not going to promise I will hit 3 minutes. First, that was a great show, and it's amazing that you quoted it on stream. I just want to respond to what Matthew said earlier. The question is basically, like, isn't it true that nobody cares?
And whether it is or not, you know, can things be better? First of all, it's not true that nobody cares. I think people feel they can't do anything. I have given talks about this to all kinds of audiences, and I always get some ancient Charlie Rose-looking moderator at these conferences, very serious people, saying, isn't it true young people don't care that much? And I say no, no, it's not true. I have given the same line to an audience of an older generation and a younger generation about privacy, and younger people have a stronger response. Younger people understand the intrusion because they've been burned by it. They have a friend who put something they shouldn't have on their Facebook timeline, or they know someone who did. They see the damage, or at least they can understand it. But they feel ‑‑ and probably quite rationally ‑‑ that they can't do anything about it. The thing is, this raises the question: does it have to be that way? We talk about facial recognition, license plate readers. These biometrics are being increasingly collected. You can't change your face. You don't really get to change your license plate. That's why they do that. And that's why they want your phone number. It's a pain to change your number. They're trying to find ways to track you and make it costly for you to change. We talked about permission before, and all of the companies do this when new technology comes out: they buy the facial recognition scanners and whatnot. They don't ask for permission. We talked about a permissionless world. These governments live in a permissionless world, but our world is increasingly unfree. Does it have to be? Particularly talking about data collection and digital surveillance, I think what we need to do is think about the system and what are the points at which it can fail.
What are the principles that the operation of these systems ‑‑ of surveillance and bulk collection and whatever you call them ‑‑ relies on? What are the presumptions these systems make in order to operate, and what happens if those presumptions fail? On the corporate prong here, we're talking about things like the fact that surveillance as a practice is legal; in many cases it's lawful. They say it's legal because you clicked okay to continue, even if you couldn't read the agreement, and there are studies showing that if you tried to read all of these things, a year wouldn't give you enough time. So how can it be said that you agreed? There's no possible way you could agree. You can't even read these agreements. And if you move away from corporations to governments, we're not talking about what is legal, because in the words of Henry Kissinger, the illegal we do immediately; the unconstitutional takes a little bit longer. They do it because mass surveillance is cheap. It's very simple. It's very easy to do at scale. Can we make these capabilities costly? I think in all cases the answer is yes. And if, like Alexis said, you're looking around for other people and you can't find somebody, the answer is that the person you're looking for is you. When I was at the NSA, wearing the EFF hoodie ‑‑ remember, they can't do the work unless people like you contribute to them ‑‑ I talked to my coworkers, I talked to my colleagues, and I showed them what was happening. And nobody was going to do anything about it, because nobody felt they could. The reality is any one of us could do something, if only we dared to try. And sometimes the person you're waiting for is you. >> CINDY COHN: Well, there's no bigger mic drop moment than that one, let me tell you. Thank you so much for this conversation. I want to thank ‑‑ we have Alexis back.
Thank you, Matthew and Alexis and Edward, for joining us, and everyone who contributed to the discussion in the chat and asked such great questions, and all of you watching around the world. It's frustrating to be in COVID times, but this is an event that we couldn't have done at the scale we did if it weren't COVID times, so we look for our silver linings where we can get them. This is EFF's 30th anniversary, and I am duty bound as Executive Director to tell you that our movement to protect rights and freedoms begins with you. EFF has led the charge for privacy, free expression and innovation for over 30 years and counting, and we need your help to keep up the fight, as Ed pointed out so generously. I encourage you to become a member of EFF today at eff.org/30. And that's all the time we have today. Thank you all for joining us. I hope that you will continue these conversations about freedom and technology with your colleagues and your friends and loved ones, wherever you are. Thank you so much, and we will see you at the next EFF30 Fireside Chat. Thanks, everybody. >> Thank you, guys, for all of the work you do, and everyone for joining us. Stay free.