Episode 101 of EFF’s How to Fix the Internet
If you get pulled over and a police officer asks for your phone, beware. Local police now have sophisticated tools that can download your location and browsing history, texts, contacts, and photos to keep or share forever. Join EFF’s Cindy Cohn and Danny O’Brien as they talk to Upturn’s Harlan Yu about a better way for police to treat you and your data.
Click below to listen to the episode now, or choose your podcast player:
Today, even small-town police departments have powerful tools that can easily access the most intimate information on your cell phone.
When Upturn researchers surveyed police departments on the mobile device forensic tools they were using on mobile phones, they discovered that the tools are being used by police departments large and small across America. There are few rules on what law enforcement can do with the data they download, and not very many policies on how the information should be stored, shared, or destroyed.
Mobile device forensic tools can access nearly everything—all the data on the phone—even when they’re locked.
You can also find the Mp3 of this episode on the Internet Archive.
In this episode you’ll learn about:
- Mobile device forensic tools (MDFTs) that are used by police to download data from your phone, even when it’s locked
- How court cases such as Riley v. California powerfully protect our digital privacy, and how those protections are evaded when police get verbal consent to search a phone
- How widespread the use of MDFTs is among law enforcement departments across the country, including small-town police departments investigating minor infractions
- The roles that phone manufacturers and mobile device forensic tool vendors can play in protecting user data
- How re-envisioning our approaches to phone surveillance helps address issues of systemic targeting of marginalized communities by police agencies
- The role of warrants in protecting our digital data.
Harlan Yu is the Executive Director of Upturn, a Washington, D.C.-based organization that advances equity and justice in the design, governance, and use of technology. Harlan has focused on the impact of emerging technologies in policing and the criminal legal system, such as body-worn cameras and mobile device forensic tools, and in particular their disproportionate effects on communities of color. You can find him on Twitter @harlanyu.
If you have any feedback on this episode, please email email@example.com.
Below, you’ll find legal resources – including links to important cases, books, and briefs discussed in the podcast – as well as a full transcript of the audio.
EFF is deeply grateful for the support of the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology, without whom this podcast would not be possible.
This work is licensed under a Creative Commons Attribution 4.0 International License. Additional music is used under Creative Commons license from CCMixter and includes:
Warm Vacuum Tube by Admiral Bob (c) copyright 2019 Licensed under a Creative Commons Attribution 3.0 Unported license. http://dig.ccmixter.org/files/admiralbob77/59533 Ft: starfrosch
Come Inside by Snowflake (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/snowflake/59564 Ft: Starfrosch, Jerry Spoon, Kara Square, spinningmerkaba
Xena's Kiss / Medea's Kiss by mwic (c) copyright 2018 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/mwic/58883
Drops of H2O ( The Filtered Water Treatment ) by J.Lang (c) copyright 2012 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/djlang59/37792 Ft: Airtone
reCreation by airtone (c) copyright 2019 Licensed under a Creative Commons Attribution (3.0) Unported license. http://dig.ccmixter.org/files/airtone/59721
- Riley v. California
- EFF’s Case Page on Riley v. California
- EFF Blog Post: State v Burch
- EFF Amicus in State v Burch
- New Jersey’s State v Carty
- Minnesota’s State v Fort
- Harlan Yu of Upturn
- EFF Blog Post: So-called “Consent Searches” Harm Our Digital Rights
- 2020 Upturn report on law enforcement searching mobile phones
- 2020 Upturn written testimony to DC Council on police budget and surveillance technologies
- EFF Blog Post: Congressmembers Raise Doubts About the “Going Dark” Problem
- The Voluntariness of Voluntary Consent: Consent Searches and the Psychology of Compliance
- EFF Blog Post: Secret Court Orders Aren't Blank Checks for General Electronic Searches
- EFF Blog Post: Appeals Court Avoids Hard Questions About the “Collect It All” Approach to Computer Searches
- Access Now: What spy firm Cellebrite can’t hide from investors
Transcript of Episode 101: What Police Get When They Get Your Phone
Harlan: The fact that all of this information is collected and saved on your phone, right? Your web browsing history, your location history. This is all information that is now kept digitally in ways that we've never had records of before. And so over this past decade, smartphones have become this treasure trove for law enforcement.
Cindy: That's Harlan Yu. And he's our guest today on How to Fix The Internet. Harlan is the executive director at Upturn where he's working to advance equity and justice in the way technology is used.
Danny: Harlan's going to talk to us about some of the tools used in policing. This tech makes law enforcement much more powerful when it comes to street level surveillance, and we'll explore some of the dangers in that.
Cindy: Harlan has solutions that will make us all safer and protect our privacy. One of our central themes at EFF is that when you go online or use digital tools, your rights should go with you. Harlan is going to tell us how to get there.
Cindy: I'm Cindy Cohn, EFF's executive director.
Danny: And I'm Danny O'Brien, and this is How to Fix the Internet, a podcast of the Electronic Frontier Foundation.
Cindy: Harlan. Thank you so much for joining us. At Upturn, you have been working in the space where technology and justice meet, and I'm really excited to dig into some of this with you.
Harlan: Thanks so much for having me, Cindy.
Cindy: So let's start by giving an explanation about what kinds of tools police are using when it comes to our digital phones.
Harlan: Last year and over the past two years, my team at Upturn and I published and have been doing a lot of research on law enforcement's use of mobile device forensic tools. Now, what a mobile device forensic tool does is it's a device where law enforcement will plug your cell phone into that device. It allows law enforcement to extract and copy all of the data, so all of the emails, texts, photos, locations, and contacts, even deleted data, off of your cell phone. And if necessary, it will also circumvent the security features on the phone.

Harlan: So, for example, device-level encryption, in order to do that extraction. Once it has all of the data from your phone, these tools also help law enforcement to analyze all of that data in much more efficient ways. So imagine, you know, gigabytes of data on your phone. It can help law enforcement do keyword searches, create social graphs, make maps of all of the places that you've been. You know, so an officer who's not super tech savvy will be able to easily pore over that information. So it can help officers automatically detect photos and filter for photos that have, say, weapons or tattoos, or do text-level classification as well.
Cindy: Yeah, there were some screenshots in that report that were really pretty stunning. You know, a cute little touchscreen that lets you push a button and find out whether people are talking about drugs. Uh, another little touchscreen that lets you identify who the people are that you talk to the most often.

Cindy: You know, really user-friendly.
Harlan: These tools are made by a range of different vendors, the most popular being, uh, Cellebrite; Grayshift, which makes a tool called GrayKey; and Magnet Forensics. And, you know, there's a whole industry of vendors that make these tools. And what our report did was we submitted about 110 public records requests to local and state law enforcement agencies around the country, asking about what tools they've purchased, how they're using them, and whether there are any policies in place that constrain their use. And what we found was that almost every major law enforcement agency across the United States already has these tools, including all 50 of the largest police departments in the country and state law enforcement agencies in all 50 states and the District of Columbia.
Cindy: Wow, all across the country. How much are police using it?
Harlan: We found through our public records requests that law enforcement have been doing, you know, hundreds of thousands of cell phone searches and extractions since 2015. This is not just limited to, you know, the major law enforcement agencies that have the resources to purchase these tools. We also found that many smaller agencies can afford them. So cities and towns with, you know, under tens of thousands of residents, with maybe a dozen or two dozen officers, places like Shaker Heights in Ohio, or Lompoc in California, or Walla Walla, Washington. The breadth and availability of these tools was pretty shocking to us.
Cindy: You know, people might think that this is something that the FBI can do in national security cases, or in other situations in which we've got very serious crimes by very dangerous people. But the thing that was stunning to me about the report you guys did was just how easy it is to do this, how often, and how mundane the crimes are that are being investigated this way. Can you give me a couple more examples or talk about that a little more?
Harlan: Yeah, that's exactly right. I think one of the main takeaways from our report is just how pervasive this tool is, even for the most common offenses. You know, I think there's this narrative, especially at the national level, around encryption back doors, right? And the way that story gets told is that law enforcement will use these tools in high-profile cases, cases like terrorism and child exploitation. You know, they even use a term around exceptional or extraordinary access, which kind of indicates that access will be rare. I think what our report does is that it challenges this prevailing wisdom that law enforcement is going dark.

What law enforcement is saying is far from the entire story. As our report points out, these kinds of tools, and the law enforcement interest in accessing data on people's cell phones, come up not only in cases involving major harm. We documented in our report how, across the country, these tools are being used to investigate cases including graffiti, shoplifting, vandalism, traffic crashes, parole violations, petty theft, public intoxication, you know, the full gamut of drug-related offenses, you name it. These tools are being used every day on the streets in the United States right now.
Danny: “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science, enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.
Danny: So you say that, that these devices not only can scan for data, but also make copies. Is there any kind of understanding we have about how long those copies are kept?
Harlan: That is a really important issue. One thing that we asked law enforcement agencies to provide to us through our public records requests was whether they have any policies in place. Just about half indicated that they had no policies at all, and among those that did, only about nine had policies that we would consider detailed enough to provide any meaningful guidance to constrain what officers do.
So I think, in large part, law enforcement agencies don't have specific policies in place around the use of these tools, and that includes, you know, how long a law enforcement agency can retain and save that data. Now, maybe I'll just raise here a recent case in Wisconsin, State v. Burch, which is a case that EFF, the ACLU, and EPIC recently filed an amicus brief in. It was a case in Wisconsin where the suspect, Burch, was involved in a hit and run. So the police verbally asked Burch whether or not they could see his text messages. The suspect said yes, and the police had Mr. Burch sign kind of a vague consent form to search the phone. Right.

And rather than just searching and looking at text messages based on the vague consent form, law enforcement did a full forensic extraction of the phone and copied all of the data. Ultimately they found no evidence in that particular case, but then they stored that data. Now, months later, the Brown County Sheriff's Office was investigating a homicide, and they suspected that Mr. Burch was somehow involved.

And so, based on the extraction that a different police department did, and its retention of that data, the Brown County Sheriff's Office then was able to get a copy of that extraction and search the phone again. They found that the suspect had viewed news about the murder, and there was location data on the phone that indicated that he might have been around the location.

And in any case, he was then arrested and charged with the homicide, an entirely different case from the first extraction. So I think that case illustrates the dangers of, well, not only consent searches, which we can talk about, but the dangers of indefinite retention and the use of these tools overall.
Cindy: Oh, it's just chilling, right? I mean, essentially the police have a time machine. Right. And if they get your information at any, any point in time, then they can just go back to it later and look at it. And I think it's important to recognize that the cases that we hear about, like this case in Wisconsin, are the cases in which they found something that could be incriminating, but that says nothing about all the stories underneath the surface, where they didn't find anything, but they still did a time machine search on people.
Cindy: I want to talk about consent in a second, but I think one of the things that your report really points out is that, given the racial problems that we have in our law enforcement right now, this has very serious implications for equity, for who's going to get caught up in these kinds of broad searches and time machine searches. And I wonder if you want to talk a little bit more about that.
Harlan: Overall, we see law enforcement adoption of these tools as a dangerous expansion of their investigatory power. Given how widespread and routinely these tools are being used at the local level, and given also our history of racist and discriminatory policing practices across the country that continues to this day, it's highly likely that these tools disparately affect and are used against communities of color, the communities that are already being over-policed.
Danny: What are the kinds of things they can get from these searches?
Harlan: Mobile device forensic tools can access nearly everything, all the data on the phone, sometimes even when it's locked, right? You know, in creating mobile phone operating systems, designers have to balance security with user convenience, right? So even when your phone's locked, it'd be really nice to get notifications, to know when there's an email or an event on your calendar.
Moreover, many Americans, especially people of color and people of lower incomes rely solely on their cell phones to connect to the internet.
Harlan: And so over this past decade, smartphones have become this treasure trove for law enforcement, where, you know, the information that we store on our phones arguably contains much more sensitive information than even the physical artifacts that are in our homes, which have traditionally been perhaps the most sacred place in terms of constitutional protection from intrusion by the government.
Cindy: Now I want to talk a little bit about, you know, how the courts have been addressing this situation. We won a great victory for, uh, for privacy in a case called Riley v. California in the Supreme Court a few years ago, which basically said that you can't search somebody's phone incident to arrest without a warrant. You need to go get a warrant.
Harlan: Law enforcement is required to get a warrant to perform these kinds of phone searches, but there are many exceptions to this warrant requirement, one of them being the consent exception. This is a really common practice that we're seeing on the ground, right? When there's a consent search, those searches are then not subject to the constraints and oversight that warrants typically provide.
Now, that's not to say that warrants actually provide that many constraints in reality. And we can talk about that. We see those more as a speed bump, but even those basic legal constraints are not in place. And so this is one of the reasons why one of the recommendations in our report is to ban the use of consent searches of mobile phones, because this idea of a consent search in the policing context is essentially a legal fiction. There are several states that have banned the use of consent searches in traffic stop situations: New Jersey in 2002, Minnesota in 2003.
Earlier this year the DC police reform commission, they made the recommendation to the DC council that the DC council prohibit all consent searches, not just for mobile phones, but a blanket prohibition across the board. And if DC council takes up this recommendation, as far as I know it would be the first full ban of consent searches anywhere across the country.
And so that's where Upturn believes that, the law should go.
Cindy: Yeah. I just think that the idea of even calling them consent searches is a bit of a lie, right? You know, "either let us search your phone, or let us search your house, or we're going to take you down and book you and hold you for however many hours we possibly can." Like, that isn't consent, right?

I think that one of the things that we're doing here is we're trying to be honest about a situation in which consent is actually the wrong word for what's going on, you know. I consent to lots of things because I have free will. These are not situations like that.
Danny: And I don't think that people would necessarily understand what they were consenting to. I mean, this has been eye-opening for me and I, I feel like I track this kind of thing, but if we're talking about banning consent searches using this technology, do you think the technology as a whole should be banned, do you think police should have access to these tools at all?
Harlan: I think the goal needs to be to reduce the use of these tools and the data available to law enforcement.
Danny: So, would that be a question of limiting the use of these tools to serious crimes, or putting some constraints on how the data is used or how long it is stored?
Harlan: I mean, I, I would worry about even legitimizing the use of these tools in certain cases, right? Again, when there's a charge, it's just the accusation that a person committed a particular crime. And I think no matter what the charge is, I think people should have the same rights. And so I don't necessarily think that we should relax the rules for certain kinds of charges or not.
Cindy: It's a big step to deny law enforcement a tool, so what's the other side of that?
Harlan: Well, I think we can look toward all of the costs that our system of policing has on our society, right? When people get roped up into the criminal legal system in the United States, it's extremely hard then, you know, with a criminal record, to get a job, to have other economic opportunities. To the extent that these tools are, you know, making law enforcement more powerful in their investigative capacity, I'm just not sure that that's the direction our society needs to go. Right? The incarceration rate in the United States is already, you know, far outside the norm.
Danny: I think the way I tend to think about it is that we have this protection, as you say, in a home and possessions. But when you talk about mobile phones, you're actually getting much closer to people's internal thought processes, and it feels more like either an interrogation or, in some cases, when you can go back and forth like this, a kind of mind-reading exercise. And so if anything, these very intimate devices should have even more protections than we give to our closest living environments.
Harlan: One commentator said that the use of these tools in particular creates a window into the soul, right? These searches are incredibly invasive. They're incredibly broad. And yeah, as you're saying, you know, traditionally the home has been the most sacred place. There's an argument today that our phones should be just as sacred, because they have the potential to reveal much more about us than any physical search.
Cindy: We talked about the fourth amendment briefly, but it plays a role here too right?
Harlan: The Fourth Amendment requires warrants to describe with particularity the places to be searched and the things to be seized. But in this context, oftentimes law enforcement agents also rely on the plain view exception, which effectively allows law enforcement to do anything during these searches, right?

Harlan: This is a problem that legal scholars have wrestled with, and EFF has wrestled with, for decades. For physical searches, the plain view exception allows law enforcement to seize evidence in plain view in any place that they're lawfully permitted to be, if the incriminating character of the evidence is immediately obvious.

But for digital searches, you know, this standard makes no sense, right? This idea that digital evidence can exist in quote-unquote plain view, in the way that physical evidence can, considering how the software can display and sort seized data, I think is just incoherent. The language can vary from warrant to warrant, but they all authorize essentially an unlimited and unrestricted search of the cell phone. So I think there's a question here too, even in the search warrant context, of whether these warrants are sufficiently particular. I think in many cases, the answer has got to be clearly no.
Danny: So these tools to analyze these phones are made by companies all around the world. Do you think they're used all around the world?
Harlan: Yeah, I think human rights activists have been seeing this happen all around the world, especially for journalists who live in authoritarian countries. We're seeing, you know, lots of governments purchasing these tools and using them to limit freedom of speech and freedom of expression in many other places, in addition to here.
Cindy: So let's switch gears and talk a little bit about what the world looks like if we get this right. Unlike a lot of difficult problems, this is one where you've really clearly articulated a way that we can fix it. So let's say that we ban law enforcement use of these devices, or we ban evidence, you know, collected through the use of these devices from being admissible, some kind of extension of the exclusionary rule. How's this going to feel and work for those of us who have phones, which is, by the way, all of us?
Harlan: I think, you know, people will probably need to worry a little bit less, or less frequently, about the ways that powerful institutions like the police can have that window into your soul, an inside look at the things that you're thinking, the things that you're searching online, the things that you're curious about, the places that you're going, right, with location data being stored on the phone, whether you're going to a doctor's office or a church or another religious institution. All sorts of sensitive information will at least be accessed less frequently by law enforcement, in a way that hopefully will provide a greater sense of freedom and liberation, especially in the society that we live in here in the United States.
Cindy: The freedom and the space of privacy that we get is not just for the individual whose phone is seized. There's a broader effect here, not just for the people who, you know, find themselves pulled over by the cops. It's going to be for all the people who ever talk to, interact with, learn from, or read about, uh, the people who get pulled over by the cops.
Harlan: Yeah, that's absolutely right. Right. The photos on my phone have some pictures of me, but also of my family, also of my community. And my text messages also include, obviously, sensitive data that other people are providing to me. The contacts in my phone, right? Just my social graph.
Danny: So one of the things that I think can make people feel a little bit hopeful, in what can feel like a very oppressive story, is what they can do to change this. What is the role of individuals in transforming this story?
Harlan: I'm not sure that individual decisions are really gonna get us to the future where we want to be, right? We can't tell individuals to buy a higher-end cell phone if they don't have the resources to do so, right? Or have every individual, you know, configure their phones in just the right way. I'm not sure that that's a realistic path to get to where we want to be. I think, you know, the better approach is to look more systemically at the problems with our law, the problems in law enforcement, and the problems where, you know, we can fix things for everyone at the systemic level. And I think those are the areas of opportunity on which we should focus.
Danny: In this positive vision that we're presenting, is there a role for the phone companies themselves? Is there some capacity in which they should be playing a part, even in a sort of utopia where the laws and policies and the courts support protecting your privacy?
Harlan: Yeah. The phone manufacturers have essentially been playing a cat-and-mouse game with law enforcement over decades, right? Uh, these tools that are being created by Cellebrite and Grayshift, you know, they can break into the latest, you know, iPhones and the highest-end Samsung Android phones, with rare exception, right? And so there's this idea, too, that even in the case of a locked phone that law enforcement is having trouble getting access to, even if, you know, you just turn on the phone and there is device encryption, there's actually a significant amount of information on iPhones that remains unencrypted, outside of the encrypted portion of the phone, in what technical folks call the "before first unlock" state. Then after the first unlock, once the user unlocks the phone and it gets locked again, even more unencrypted data becomes available. Right.
Danny: Why is that?
Harlan: That's a design decision that most manufacturers make to provide users with, you know, convenient features. This is just what they believe is the right balance. And so, yeah, I think there's a role here for the phone manufacturers to continue to address vulnerabilities and to make it more difficult for law enforcement to get access.
I think there's also a potential role here to be played by the vendors of the mobile device forensic tools, right? One thing that we suggest in our report is that the vendors of these tools ought to maintain an audit log for every search, right, that details the precise steps that a law enforcement officer took when extracting and analyzing the phone. The goal here would be to better equip defense lawyers to push back and to challenge the scope of these searches. If we could, for instance, play back, using say automatic screen recording technology, exactly what an examiner looked at, or the process that the examiner took in doing the search.

This would allow the judge and the defense lawyers a chance to ask questions, and for defense lawyers to have a better chance, potentially, of suppressing over-seized information.
Cindy: What does public safety look like in this world?
Harlan: Public safety is not the same as policing, right? I think public safety means communities and individuals who have economic safety, who have economic opportunity, have stable housing, have job opportunities, have a good education, right? I think we need to, you know, as many Black feminists have laid out in the vision around defunding the police, right, the idea here isn't just to tear down the police, but it's about what we have to build up.
Cindy: I really agree with you Harlan, getting this right isn’t about whether we give or take away a particularly sophisticated law enforcement tool. It’s about shoring up the systems in communities that are too often unfairly targeted by surveillance. At EFF we say we can’t surveil ourselves to safety and I think your work really demonstrates that.
Harlan: The idea here isn't just to tear down the police, but really the process of what we need to build up to support people and their families and their communities, which is things that don't look like surveillance tools and law enforcement as we have it today, but the absence of that and the creation and the existence of other structures that are supportive of people's livelihoods and ability to thrive and to be free.
Cindy: Oh, Harlan, this has been so interesting, and we really enjoyed talking with you. And the work that you guys do at Upturn is just fabulous, right? Really bringing a deep tech lens to the tools that law enforcement is using, and recognizing how that's going to impact all of us in society, but especially the most vulnerable.
Cindy: Thank you so much for taking some time to talk with us. And, uh, let's, let's hope we move towards this, this vision of this better world together.
Danny: Thanks Harlan, it’s been great.
Harlan: Thanks so much for having me.
Cindy: Well, that was just terrific. You know, one of the things that struck me is that we've spent a lot of time on this podcast, and of course EFF has, you know, fighting for the ability for people to have strong encryption, especially on their devices. One of the things that Upturn's research demonstrates is that's just a tiny little piece of things. In general, our phones, and everything that's on our phones, and even stuff in the cloud that's accessible through our phones, are widely available to law enforcement. So it really strikes me as funny that we're focusing on this tiny little piece where law enforcement might have some small problems getting access to stuff, when in the gigantic rest of it they already have free access to everything that's on our phones.
Danny: Well, I think that there's always this framing that the world is going dark for law enforcement because of encryption. And no one talks about the fact that it's lighting up like a huge scanning display when it comes to the devices themselves and every technologist you talk to says, yeah, all bets are off once you hand a device to someone else because they can undo whatever protections that you might have on it. I think the thing that really struck me about this, though, that I hadn't realized is just how cheap and available this is. I did have it in my head that this was an FBI thing, and now we're seeing it used by really quite small local town police departments and for very low level crime too.
Cindy: Yeah, it's eye-opening. I think the other thing that's eye-opening about this work is how law enforcement is using consent, or at least the fiction of consent, to get around a very powerful Supreme Court protection that we got in a case called Riley v. California in 2014, which bans searches incident to arrest without a warrant. And the cops are simply walking right around that by getting, you know, phony consent from people.
Danny: I've been in that situation going through immigration, where I'm asked to hand over my phone, and it's very hard to say no, because you just kind of assume they're going to flick through the last few entries, and that's not what happens in these situations.
Now Harlan wants to ban these consent searches completely. Do you agree with that?
Cindy: Yeah, I really do, and the reason I do is because it's so phony. I mean, the idea that these are consensual doesn't pass the giggle test, right? Given the way that power works in these situations, and the pressure that cops put on you, to call this consent, I think, is really not true. And so I don't think we should embrace legal fictions, and the legal fiction that these searches are consensual is one that we just need to do away with, because they are not.
Danny: So while we're talking about banning consent searches, one of the more positive things I got out of this discussion is that there's no implication that we should be banning phones or forcing people to be more cautious in how they use them. Harlan called these tools essentially a window into the soul. But I think they also enhance our lives. I mean, they're not just a window into the soul. They actually give us ways to remember things that we would forget. They give us instant access to the world's knowledge. They make sure that I will never get lost again. And all of these things are things that we should be able to preserve in a free society. Despite the fact that they are so intimate and so revealing, I think that just means that they have to have the same protections that we would give to the thoughts in our head.
Cindy: I think this is one of the ways that we need to make sure that we fix things. We need to fix things so that people can still have their devices. They can still have their tools. They can still outsource their memory and part of their brain to a device that they carry around in their pockets all the time. And that is protected. The answer here isn't to limit what we can do with our devices. The answer is to lift up the protections that we get from law enforcement in society over the fact that we want to use these tools.
Danny: Thank you for joining us on How to Fix the Internet. Check out our show notes, which will include a link to Upturn's report. You can also get legal resources there, transcripts of the episode, and much more. "How to Fix the Internet" is supported by The Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology.
And the music is by Nat Keefe and Reed Mathis of BeatMower. Thanks for being with us today. And thanks again to our guest, Harlan Yu from Upturn. I'm Danny O'Brien.
Cindy: And I'm Cindy Cohn.