Writers sit watching a stranger’s search engine terms being typed in real time, a voyeuristic peek into that person’s most private thoughts. A woman lands a dream job at a powerful tech company but uncovers an agenda affecting the lives of all of humanity. An app developer keeps pitching the craziest, most harmful ideas she can imagine but the tech mega-monopoly she works for keeps adopting them, to worldwide delight.
You can also find this episode on the Internet Archive.
The first instance of deep online creepiness actually happened to Dave Eggers almost 30 years ago. The latter two are plots of two of Eggers’ many bestselling novels—“The Circle” and “The Every,” respectively—inspired by the author’s continuing rumination on how much is too much on the internet. He believes we should live intentionally, using technology when it makes sense but otherwise logging off and living an analog, grounded life.
Eggers — whose newest novel, “The Eyes and the Impossible,” was published this month — speaks with EFF’s Cindy Cohn and Jason Kelley about why he hates Zoom so much, how and why we get sucked into digital worlds despite our own best interests, and painting the darkest version of our future so that we can steer away from it.
In this episode, you’ll learn about:
- How that three-digit credit score that you keep striving to improve symbolizes a big problem with modern tech.
- The difficulties of distributing books without using Amazon.
- Why round-the-clock surveillance by schools, parents, and others can be harmful to kids.
- The vital importance of letting yourself be bored and unstructured sometimes.
Dave Eggers is the bestselling author of the memoir “A Heartbreaking Work of Staggering Genius” (2000) as well as novels including “What Is the What” (2006), “A Hologram for the King” (2012), “The Circle” (2013), and “The Every” (2021); his latest novel, “The Eyes and the Impossible,” was published May 9. He founded the independent publishing company McSweeney’s as well as its namesake daily humor website, and he co-founded 826 Valencia, a nonprofit youth writing center that has inspired over 70 similar organizations worldwide. Eggers is the winner of the American Book Award, the Muhammad Ali Humanitarian Award for Education, the Dayton Literary Peace Prize, and the TED Prize, and has been a finalist for the National Book Award, the Pulitzer Prize, and the National Book Critics Circle Award. He is a member of the American Academy of Arts and Letters.
I worked at Salon.com when they were new. So it was one of the first online magazines. This was ’94. There were only, like, six of us that worked at Salon back then, and we were having a ball. And he showed me a screen that looked like a regular search engine screen. This was in the era of, like, AltaVista. But instead of you doing a search, you were watching other people do searches. You could watch them typing in words and then getting a result and then typing in the next word. So you were seeing somebody look up, like, gonorrhea, or treatment for eczema or whatever it was. And it was real people, it wasn't just some sort of demonstration. And we watched that for five, 10 minutes. And then there's just this creeping sense of just disgust, you know, like who came up with this? Who is enabling us to look through this mirror? What mind would think of this to let somebody else access this, even if it's so-called, you know, anonymized.
And at that moment, I remember it like it was yesterday, cuz that was a turning point. And it was a little bit of a foreshadowing of just how creepy things would get, and, and that's a word that I think comes up again and again, you know, is that creepy aspect of it that should have been just pure delight and access and democracy and sharing everything from articles to cat pictures, all of these great things about it. But who decided to insert the creepy?
I think it was some very strange minds that got ahold of some of the levers of power very early on and, and had too much of it. And some of them are now billionaires. But I think that that was that moment when I thought, ‘Uh oh.’
That’s author Dave Eggers and he’s talking about an early experience with the potential for creepiness online – a topic he has since satirized in several of his novels. His recent novel “The Every” describes apps that weaponize surveillance to determine the truthfulness of your friends, the quality of your parents, how happy you are based upon the products you’re buying, and eventually a “SUMNUM” that summarizes your entire worth in a three-digit number.
I’m Cindy Cohn, the Executive Director of the Electronic Frontier Foundation.
And I’m Jason Kelley, EFF’s Associate Director of Digital Strategy.
This is our podcast series: How to Fix the Internet.
One of the things we’re trying to do with this show is to share a bunch of visions of what it looks like when we get the internet RIGHT. We’re grounding ourselves in the four levers of change articulated by Larry Lessig in his seminal work, “Code and Other Laws of Cyberspace”: law, code, markets and norms. And I’m excited to talk to Dave today because his books about the problems in our digital world focus on the norms part – how we get sucked into digital worlds despite our own best interests and why. And once we’re there, the other levers – laws, code, markets – become less powerful and can feel almost irrelevant, or easily neutralized. There’s a grain of truth there, and I wanted to challenge him to flip the script and think about what happens if we get it right.
Dave is an award-winning author, as well as an educator and activist. And in two of his recent novels, he explores, and pretty hilariously satirizes, many of the issues that we deal with here at EFF – social media monopolies, surveillance, and what happens when the problems with the internet grow so big that tackling them becomes nearly impossible.
As Cindy described, in his book “The Every,” Dave takes all the worst and scariest ideas about online life to their most horrifying conclusion – and no one ever says “ENOUGH!”
So we started with a simple question: What compels him to imagine these worst-case digital scenarios and put them on the page?
I think that we sometimes need the darkest version of where we're going as a species - we need to be able to paint that picture so that we can avoid it. We need to see how bad it could get if we don't change course. And I think that's always been the point of dystopian fiction. And typically it's written by people that are fiercely humanistic and want the best outcome for us as a group and as a species, and are maybe horrified by where we're going and maybe have an idea of like, okay, this is how bad it could get, but we have the time, we have the power to course correct.
That's my goal. You know, I think that we have so much power. So much of this is fixable. And, um, and I think that what I was trying to do is exaggerate it to the point so that it was comical, and also mix that with sheer horror. And this is the sort of way, anytime I'm experiencing digital life, it's a mixture of horror and comedy.
The thing that I really loved about the conceit of it is that there's this, you know, activists tell themselves that if everybody just saw how horrible it could get, you know, everybody would say, whoa, we shouldn't do that, and change course. And so you set your protagonist to do that, and then it doesn't work, right? Because no matter what crazy idea she comes up with – like, are your friends your friends, and your parents, were they good parents, and all of those kinds of things – everybody just adopts them, and it ends up that it's pretty hard for her to create that moment where we all recognize how bad it's gotten and we make the great leap forward to something like that. And I wondered how you thought about that because, you know, as an activist and as somebody who's often trying to bring those moments about, I shared her frustration.
I think that over the last couple decades, especially in this century, I've been surprised at how often what seems to be the worst possible ideas, that I think are not going to get out of the gate, are quickly adopted. And, and made sort of ubiquitous. I mean, Zoom is a good example of, I think crude and flawed technology that makes everyone feel hollow and exhausted. And yet everything, every meeting that everybody does now has to be done via Zoom.
I don't know if any technology has been so globally adopted so quickly, at least not in my lifetime, as Zoom. And we haven't necessarily paused and said, do we like this? Is it the right technology? Do we enjoy it? Does it make us feel better? I think that, look, everybody in their closets, everybody in their garages and their living rooms, they have to think about what their backdrop is, everybody has that fisheye technology that makes it all look like a hostage video. It's so strange to see how universally adopted it is, when I've never met anybody that likes it. And so, I don't know, that's just one of the thousand ways that I think that we quickly embrace every specious or maybe questionable or, in some cases, purely absurd new twist on digital life. It's weird. I have such infinite hope in humanity, and then sometimes I'm just kind of startled by the things that we'll embrace.
One of the main apps in the book is called, at first it's called Authenti-Friend, and it determines whether or not the person you're talking to, who's your friend, ostensibly, is being honest and sincere and, like, whether or not they are your friend, basically. And there's a moment where the character, the main character is pitching this and she says that she also thinks it could help with depression and maybe even suicidal ideation and things like that because it can detect what kind of thoughts people are really having based on what their face looks like and what they're saying. And she's intending this to be, you know, a terrible idea, which of course is immediately adopted.
And just a few years ago, my partner works in mental health, and there was a hackathon where literally that exact idea was presented. And all of the other judges, who were all VCs, were like, this is the best idea anyone's ever had. And she was like, this is a terrible idea. You can't diagnose someone with depression just with a metric. And then no follow up. There's so many reasons why that wouldn't work. And as I read this, I was like, wow. Were you at that hackathon, or is this just, like, where did these terrible ideas come from? As someone who, like you said, tries to spend less time than probably people like us at EFF looking at new tech, how did you come up with these terrible ideas? I mean, this is a weird mindset to put yourself in, but I feel like it is literally the mindset of a lot of, uh, company founders, you know?
I have the same impulse. I'm trying to think of the worst and most, um, destructive apps and platforms and algorithms possible, but also something that would have some ostensible use or some superficial appeal. Because that's where I think that the line is always interesting to me, is that there are maybe useful applications, and I always used to call it, like, a fifty-one forty-nine.
Like 49% of people might say, this is actually really useful. I'm gonna adopt this tomorrow. And then the 51 would say, this is gonna end the species as we know it. And so much of technology that we live with is exactly that. There's so many wonderful uses, but on balance, it's actually diminished us as a species.
And I think that the question is, would people feel like that's too far? To say, while I'm, you know, FaceTiming with a friend, it's gonna tell me whether they're lying to me, whether they're being truthful about what they think of my boyfriend, whether they're being truthful about where they were last night when they said that they were staying home, but in actuality, they were out with friends without me. All of these things. I think that the temptation to use that kind of interpersonal surveillance technology is so overwhelming that it would be popular and universally adopted kind of overnight, even despite the fact that it's based on mistrust of your friends. It's based on spying. You know, it requires sort of a kind of surreptitious spying on your friends. It itself is deceitful. You know what's so funny is that it's a one-way technology where you are spying on your friend to see if your friend is truthful to you, and you're not telling your friend that you're using this technology while you're using it.
But I think that anything that gives people, and this is all of us, I'm not judging because I think that we all are trying to eliminate uncertainty in our lives. And so anything that tells us or seems to answer a question that beforehand had seemed unanswerable – is this person I love and care about being truthful with me – um, we will instantly adopt that technology. And I think that that's the kind of species level pivot that we've made in those last 10, 15 years that I think is especially unsettling.
Think about how we live with credit scores, which a lot of parts of the world do not. And we have three companies that are private and opaque, and they govern most of our access to opportunity. So through a three-digit number that's based on whether you did or didn't pay a credit card bill when you were in college, or whether or not, you know, your dentist reports you, uh, to a collection agency because you're a month late on a bill, if you dip below a certain number on your credit score, you cannot rent an apartment. You can't buy a car. You can't sometimes be employed. And these companies answer to nobody. And yet it is an outrageous system. It's incredibly anti-democratic. It's most destructive to the most marginalized people.
And yet there's no legislation really that governs them effectively. There's been no pushback. We all accept it because somehow we think that they probably know what they're doing. And we, and we also trust that if there's three numbers, if there's a three digit number, there's some definitiveness to it that we implicitly trust. Like, oh, well, you know, I'm sure it's some incredibly complicated formula that, uh, really knows me. And it is outrageous. It's incredibly crude. And it is absolutely anti-human. And, you know, the countries that don't use credit scores this way are just aghast at how much it governs our lives here.
But the fact that we live with them and accept them and have for decades means that we would probably accept a much more intrusive, all-encompassing number to say, okay, let's incorporate parking tickets. Let's incorporate high school grades. Let's incorporate late fees at the library, everything together to determine your value, your validity, your worth as a human.
And I think that people would accept it because, again, of the illusory definitiveness of that number.
Yeah. We've gotten involved with the credit agencies because of their data breaches, right?
They not only collect a lot of data about us, but they also can't keep it very well. And, um, it's amazing in that context, the power of the idea that, well, you know, the American dream is based upon easy credit. That is so empowering to people that we need to tolerate, you know, not only the system, but the fact that the system breaks in ways that really hurt people, um, in the ways that you're talking about, but also, you know, with identity theft and other things when they can't keep a hold of their data.
It's an amazingly powerful argument, but I think you're right about the underlying emotional thing. The other thing I liked about your book in a similar way is, you know, this idea that you can surveil yourself to safety, right? This is something we fight in a lot of the work that we do – that the more we're watching each other and our streets, you know, and our public behavior, but even our private behavior, the safer we'll be.
I'm always sort of more interested in the average human, how we are empowering the life of 24/7 surveillance and the power of monopolies, how we are giving all this power willingly away. That's always been the most interesting thing to me.
On that note, The Jungle, as it's called in The Every, is really just an obvious stand-in for Amazon, and it is monopolies that you're kind of talking about when it comes to, um, the Circle combining with this other large sort of shopping and distribution company that is represented as The Jungle in the book. That was something that you actually tackled when you put out the book by not putting it on Amazon. And I think that that was, it sounds like, a very difficult thing to do in 2022 or whatever year, uh, we're at now – to put out a book that isn't on Amazon. It sounds like it was surprisingly complex. And I wonder, how did that happen? How did you manage that and how is it working?
Well, we, you know, we have a little publishing company here in San Francisco called McSweeney's, and right now we are five people. And so it took us maybe six months to plan out and work around all of the tendrils and arms of the Amazon Octopus because they are involved in every aspect, not just their channel, but every distribution means.
If you want to be distributed by a given distributor, they have a wraparound deal with every distributor that any book that that distributor distributes also has to be on Amazon, if that makes sense. So you can't go around them without them breaking their overall contract with the distributor that you want to go with.
So our distributor, which is a small one, had to, they had a deal that any book that they put out has to go through Amazon. So we actually had to go through like a, a weird guerilla subset of that distributor to get the book out and not have it go through the same metadata, to have the same system and everything that, that every other book goes through. Because Amazon is the default, they distribute all the data about every book in existence more so than any other system. So to go around that because they are the keeper of all of that metadata, uh, was exceedingly hard. And you sort of have to be vigilant about it every hour and every day so that it doesn't end up there because there's also second, third party sellers that could very quickly pick up a box of the books and put it on Amazon in their own way.
And so it was so hard and such a pain in the ass, but we felt like we had to try it. And we first tried this in 2002 with my second book, You Shall Know Our Velocity. We put it out without Amazon and it was infinitely easier then, but they hadn't taken over so much of the industry. You know, whether it's Goodreads or AbeBooks or, you know, all of these different aspects of the publishing industry they have swallowed.
And so back then it wasn't all that hard. We could quickly, you know, direct distribute to 500 stores and direct bill them and everything, but it's been so much harder now. And I think that this is a monopoly to end all monopolies. I think that you just don't realize just how much power they have and it's, it's power that we gave them and it's power that's very hard to take back at this point. Anyway, it mirrored very closely some of the dystopian, uh, outcomes in the book to see in, in real life how hard it is to work around them.
I want to jump in here for a mid-show break to say thank you to our sponsor.
“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology, enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians. So a tip of the hat to them for their assistance.
And now back to our conversation with Dave Eggers. Even though he spends a lot of time sounding alarm bells about the dark side of digital life – much like us at EFF – he’s not anti-technology. Actually, back in the ‘90s, he was a self-described early adopter.
I was a pretty early adopter. I had the first, the second, the third Apple, you know, the Mac. I learned desktop publishing because of how they sort of democratized access for those of us that can't, uh, do any math. You know, like, computers before the Apple computers were completely inaccessible to a mind like me, but suddenly, uh, because Apple made it aesthetically inviting and sort of, you know, created an interface that, you know, the humanities people could, uh, understand, it made me a publisher. It made me a graphic designer. It gave so much access in so many ways. And so I was such a devotee way back when. And, uh, I think because I was so enamored with so much of this technology, I got really disappointed when it took so many dark turns, like the conglomeration of wealth and power, the surveillance aspect of it. Every new twist just made me really sad because, uh, because I love so much of it. As an example, the laptop I work on on a daily basis is 19 years old. And all my software I bought then, I own it. And I know that EFF talks about this a lot. Like, uh, I can repair it, you know, so it gets into right to repair. The software, I never have to update because I bought it. I own it. I shouldn't have to have an ongoing relationship with this company. I own the stuff. I know how to use it.
And I think that that relationship to technology is sort of how I like it. Like, you buy something, you buy a great tool that brilliant engineers put together, and that's it. That's the relationship. I say, see you later. Thank you very much. I admire what you've done. Now leave me alone. Let me use the tool that you created. But I think that there's this relationship now where you can never be free, where, uh, you have to report back to them, where they essentially own what you bought.
This machine that you thought you bought is sort of on loan in a way or on tether to the mothership I think is very disturbing, and very unfree. And very much different than the relationship that we had or in the early days to these, uh, these great machines and technologies, right?
Yeah, absolutely. I mean, I, I really, this, this really sets up one of the main things I wanted to talk about with you, which is what does it look like to you if we get it right, what are the kinds of things that we would experience in our, our day-to-day lives involving technology? But how would it, how would it feel different?
What would, what would it look like if we, if we, if we came up out of this dystopian into something that really worked, and already I'm hearing a piece of it, which is, you know, we, at EFF we say, you know, you bought it, you own it, right? Like the, the, the tools that you are using every single day are yours. And you can tinker with them and play with them and decide whether you, you know, how you wanna interact with them and they're not tethered to a, to a mothership for that. That's a beautiful one. I'm wondering about that, but also if there's any other things that you think of.
Yeah, I think none of these tools should require an internet connection, you know? I mean, until recently I didn't have one at home, and, um, I went intentionally to a library or something to do email or any sort of interaction I needed to do there. And it was great.
It was like intentional. That was my hour online. But the fact that so many of these things require being online at all times to use them, um, and that when I see kids, like even if they're writing a paper, they have to get online to access their Google Doc or something, just seems so absurd. But, um, so I think that one, that too, you know, you should have the option of doing any kind of work offline.
And overall, I think we know that every additional minute we spend on a screen, we're less happy. I mean, every sociologist, every psychologist, every study, every everything points to that. Especially when we talk with young people, you know? Um, we know that there's stratospheric levels of teen depression, especially among girls. And, uh, we know that this goes up and has gone up with and parallel with the rise of social media.
So how do we as a species, allow ourselves to live an analog life and choose an hour here, an hour there to be connected, but not to channel everything that we do through a screen, through a phone. How do we make sure that we live intentionally and make those choices and, um, and use the technology when it makes sense and avoid it when it doesn't make sense. We're at a weird inflection point right now where much of education is being channeled through screens and that, you know, accelerated a thousand fold during the pandemic.
But we need to come back and say, we know it's unhealthy for kids to spend so much time on screens. So the educational system has got to be at the forefront of getting them off screens. So you shouldn't have to be turning in your homework on a screen, you shouldn't be reading on a screen unnecessarily. There needs to be a diversity of experiences where paper books are used in the mix in the classroom, and three-dimensional sort of in-person experiments and tinkering and outdoor learning – all of these different things, with screens being, what, 10, 15% of that experience, you know, or whatever that healthy balance is.
There's something almost very American about this idea. Like, if a little is good, a lot's gonna be better, right? We continue to hear from a lot of kids that having, uh, a digital life that isn't tracked by their parents, um, can be a real lifeline. But we have as a society doubled down and tripled down and quadrupled down on the fact that everything is tethered, everything is centralized. Of course, everything tracks you, which is a lot of the work that Jason does with student privacy. Um, because the other side of this, it's not just that the kid is being forced to be online, it's that the kid is being watched all the time.

But we take something where a little of it might be great, and the next thing you know, everybody has pie for dinner every night, you know?
Right. I mean, that's exactly it. And you know, when it comes to, like, parental and student surveillance, it's all coming from pretty much a good place. We love our kids, parents love their kids. Teachers wanna make sure, you know, administrators wanna make sure everybody's safe. And so there's cameras in every classroom, and parents track their kids through, you know, Find My Phone because they care about them. They want to know that they're somewhere. But at what cost? You know, for a young person that's been surveilled every day of their life since they're 10 years old, it's a very different experience. And it's no wonder that so many students, young people, have trouble with independence and problem solving and dealing with, uh, so-called adulting, because they've been, you know, living in a state of constant surveillance since they've been cognizant.
And I think that we have got to be comfortable with some mystery, some nuance. When your, when your teenager goes out, they tell you where they're gonna go, and you have to trust them, you know, and then they come back, which is the way it was for 10,000 years. But I think that we, you know, we've gotta think about the way that we're using these technologies and the way that we are, I don't know, I guess accelerating this very rapid change in the species to be one that, uh, is comfortable living in a, in a panopticon as opposed to one that is truly free.
I look forward to the utopia that you write about as the sequel to The Every. Not that you're planning on doing that, but we would love, I think, to hear all the good things that could happen instead of all of the terrible things that do happen.
You know, it's so easy. I mean, I know I sound like such a grump, but, uh, it's so easy to live in balance. I just think you have to make choices. I can't have a smartphone because I find them too addictive. So I have a flip phone. I've always had a flip phone, and to me, that's the right balance. And even then, I can get this one weird little newsfeed on my flip phone and I check that too much. But I know what's too much for me, and it took me a lot of years to find that balance. I knew if I had internet access all day, I wouldn't work. And so I had to make that choice, to say, I'm not finishing the day where I want to finish it. I don't feel good.
And I think if we all, and employers allowed this, if schools allowed it, parents allowed everybody to make that intentional choice of how they best feel at the end of the day, then we're a happier species.
We have to remember that we are animals, you know, , we are, we, we do really benefit from being outside and being unstructured and being bored sometimes and being totally free and not being, you know, checking in on, you know, the mothership all the time and just being completely sort of untethered. And I think that's what's missing and that's what's making us sick. You know, that's why there's a societal malaise so many people feel, and that kind of empty feeling that they're feeling is because they're not giving themselves all of these things that for, you know, 10,000 years we've needed as a species.
And I think we just have to remember what we need to feel good to feel fully human. And if we can put all those things in balance and use these wonderful tools, you know, in balance, then I think, uh, we're gonna be okay. But we gotta think about it.
It was really fascinating to talk to Dave Eggers because his new novel really articulated a lot of the concerns that EFF has. And also he's someone whose books I've been reading for, frankly, decades. So I was really excited to speak with him and learn, you know, how he came to these conclusions. And in that conversation I learned a lot about what he sees as the best solutions that people can take to fix some of the problems he’s described with our digital lives. Cindy, what parts of the conversation really struck you as being something that you'll come away with?
I mean, I think the thing that Dave is, um, a master of is kind of thinking about, again, what Larry Lessig called social norms, but more like what inside us is drawing us to build some of these bad tools that are not in our interest, and kind of excavating that a little bit. And I think it's important, both because it's really good storytelling and it really brings things to life, but also, you know, recognizing the role that we as individuals or as a society play in, you know, going towards surveillance as a solution to every problem. And believing in the modern day phrenology that you can tell, you know, what's going on in somebody's head from their facial expressions or their clicks on a keyboard or something like that. Um, I think that that's his job. Um, I don't think that that's the only way to think about solutions. I think that law and the way we code and markets can all play a role in getting out of this hole.

But I also think he points out that once you get one company that is very, very powerful, those other tools seem less and less available. And you know, we have a story in The Every about a legislator, um, who's trying to bring some balance into the world.
And, you know, the fact that this company is surveilling everybody means that they can bring out whatever, um, piece they need to bring out to neutralize them. So the other three levers become less important or less available to us, the more we give power to one company or one set of leaders to do all the rest of it.
So I think that, I think he just does a beautiful job of bringing that to light. And then it's our job, you know, as the activists, to try to figure out how to take that and combat it. But it's important to recognize that there are powerful forces that are leading a lot of people to choose things that really aren't the right things for themselves.
I think that's 100% right. You know, I was really struck by his idea of the right to an analog life and the hope that we could build systems that allow people the ability to step away from tech if they want to. What did you think about that?
Yeah, I think that it's really right. And I think for, you know, for some people that's a matter of personal choice, but for others it's a matter of building systems that really allow it, especially in the context of, say, school, and he talked about the need for kids to be able to not have to be in front of screens the whole time that they are in school and having this balance and this mix.
And I think that's really right. And I, again, I think there are individual choices here too, but there's also societal choices that facilitate that. And, and the more you know, the more marginalized you are in society, and the less power you have in society, the fewer of those choices you have.
And the right to an analog life is really just a sort of interpretation of another idea he brought up, which was our ability to have a balance. An hour a day, if you will. I'm sure it's a lot more for most of us, and that's fine, but an hour a day of using devices, for example. Um, I think that ability to be able to set up that balance for ourselves is really important and part of that right to have that analog life or at least have a part of your life, be analog.
One of the things that he talked about there really, you know, lands in the code question, which is that all of our tools are now tied to being online all the time. And, you know, he's in a position where he can refuse to do that and use older versions of software. But all the rest of us should be able to do that as well. And that's a huge piece of what EFF is trying to do – to urge people to build tools that support choices other than being online all the time, especially in the context of the surveillance business model. I appreciate that Dave has basically been able to build for himself, uh, a more analog life, and I think we wanna figure out ways in which we as a society can support that choice for more people.
That’s it for this episode of How to Fix the Internet.
Thank you so much for listening. If you want to get in touch about the show, you can write to us at email@example.com or check out the EFF website to become a member or donate. We are a member-supported organization, so we appreciate all the help that you can give so that we can protect digital rights.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. You can find their names and links to their music in our episode notes, or on our website at eff.org/podcast.
Our theme music is by Nat Keefe of BeatMower with Reed Mathis.
How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.
This is the final episode of this season – thanks for listening, and we’ll talk to you again soon.
I’m Jason Kelley…
And I’m Cindy Cohn.
This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by its creators:
CommonGround by airtone featuring simon littlefield.
Additional beds and alternate theme remixes by Gaëtan Harris.