Dr. Seuss wrote a story about a Hawtch-Hawtcher Bee-Watcher whose job it is to watch his town’s one lazy bee, because “a bee that is watched will work harder, you see.” But that doesn’t seem to work, so another Hawtch-Hawtcher is assigned to watch the first, and then another to watch the second... until the whole town is watching each other watch a bee. 


You can also find this episode on the Internet Archive and on YouTube.

To Federal Trade Commissioner Alvaro Bedoya, the story—which long predates the internet—is a great metaphor for why we must be wary of workplace surveillance, and why we need to strengthen our privacy laws. Bedoya has made a career of studying privacy, trust, and competition, and wishes for a world in which we can do, see, and read what we want, living our lives without being held back by our identity, income, faith, or any other attribute. In that world, all our interactions with technology —from social media to job or mortgage applications—are on a level playing field. 

Bedoya speaks with EFF’s Cindy Cohn and Jason Kelley about how fixing the internet should allow all people to live their lives with dignity, pride, and purpose.

In this episode, you’ll learn about: 

  • The nuances of work that “bossware,” employee surveillance technology, can’t catch. 
  • Why the Health Insurance Portability and Accountability Act (HIPAA) isn’t the privacy panacea you might think it is. 
  • Making sure that one-size-fits-all privacy rules don’t backfire against new entrants and small competitors. 
  • How antitrust fundamentally is about small competitors and working people, like laborers and farmers, deserving fairness in our economy. 

Alvaro Bedoya was nominated by President Joe Biden, confirmed by the U.S. Senate, and sworn in May 16, 2022 as a Commissioner of the Federal Trade Commission; his term expires in September 2026. Bedoya was the founding director of the Center on Privacy & Technology at Georgetown University Law Center, where he was also a visiting professor of law. He has been influential in research and policy at the intersection of privacy and civil rights, and co-authored a 2016 report on the use of facial recognition by law enforcement and the risks that it poses. He previously served as the first Chief Counsel to the Senate Judiciary Subcommittee on Privacy, Technology and the Law after its founding in 2011, and as Chief Counsel to former U.S. Sen. Al Franken (D-MN); earlier, he was an associate at the law firm WilmerHale. A naturalized immigrant born in Peru and raised in upstate New York, Bedoya previously co-founded the Esperanza Education Fund, a college scholarship for immigrant students in the District of Columbia, Maryland, and Virginia. He also served on the Board of Directors of the Hispanic Bar Association of the District of Columbia. He graduated summa cum laude from Harvard College and holds a J.D. from Yale Law School, where he served on the Yale Law Journal and received the Paul & Daisy Soros Fellowship for New Americans.


One of my favorite Dr. Seuss stories is about this town called Hawtch-Hawtch. So, in the town of Hawtch-Hawtch, there's a town bee and, you know, they presumably make honey, but the Hawtch-Hawtchers one day realize that the bee that is watched will work harder, you see? And so they hire a Hawtch-Hawtcher to be on bee-watching watch. But then, you know, the bee isn't really doing much more than it normally is doing. And so they think, oh, well, the Hawtch-Hawtcher is not watching hard enough. And so they hire another Hawtch-Hawtcher to be on bee-watcher-watcher watch, I think is what Dr. Seuss calls it. And so there's this wonderful drawing of twelve Hawtch-Hawtchers, you know, each one either on watcher-watching watch or – actually, you know, the first one's watching the bee – and the whole thing is just completely absurd.

That’s FTC Commissioner Alvaro Bedoya describing his favorite Dr. Seuss story – which he says works perfectly as a metaphor for why we need to be wary of workplace surveillance, and strengthen our privacy laws.

I’m Cindy Cohn, the executive director of the Electronic Frontier Foundation.

And I’m Jason Kelley. EFF’s Associate Director of Digital Strategy. This is our podcast, How to Fix the Internet.

Our guest today is Alvaro Bedoya. He’s served as a commissioner for the Federal Trade Commission since May of 2022, and before that he was the founding director of the Center on Privacy & Technology at Georgetown University Law Center, where he was also a visiting professor of law. So he thinks a lot about many of the issues we’re also passionate about at EFF – trust, privacy, competition, for example – and about how these issues are all deeply intertwined

We decided to start with our favorite question: What does the world look like if we get this stuff right?

For me, I think it is a world where you wake up in the morning, live your life, and your ability to do what you want to do, see what you wanna see, read what you wanna read, and live the life that you want to live is unconnected to who you are – in a good way.

In other words, what you look like, what side of the tracks you're from, how much money you have. Your gender, your gender identity, your sexuality, your religious beliefs, that those things don't hold you down in any way, and that you can love those things and have those things be a part of your life. But that they only empower you and help you. I think it's also a world… we see the great parts of technology. You know, one of the annoying things of having worked in privacy for so long is that you're often in this position where you have to talk about how technology hurts people. Technology can be amazing, right?

Mysterious, wonderful, uh, empowering. And so I think this is a world where those interactions are defined by those positive aspects of technology. And so for me, when I think about where those things go wrong, sorry, falling into old tropes here, but thinking about it positively, increasingly, people are applying for jobs online. They're applying for mortgages online. They are doing all these capital letter decisions that are now mediated by technology.

And so this world is also a world where, again, you are treated fairly in those decisions, and you don't have to think twice about: hold on a second, I just applied for a loan, I just applied for a job, I just applied for a mortgage. Is my zip code going to be used against me? Is my social media profile, you know, that reveals my interests, gonna be used against me? Is my race gonna be used against me? In this world, none of that happens, and you can focus on preparing for that job interview and finding the right house for you and your family, finding the right rental for you and your family.

Now, I think it's also a world where you can start a small business without fear that the simple fact that you're not connected to a bigger platform or a bigger brand will be used against you, where you have a level playing field to win people over.

I think that's great. You know, leveling the playing field is one of the original things that we were hoping, you know, that digital technologies could do. It also makes me think of that old New Yorker thing, you know, on the internet, no one knows you're a dog.

(Laughs) Right.

In some ways I think that is the vision of the internet. You know, again, I don't think that people should leave the other parts of their lives behind when they go on the internet. Your identity matters, but the fact that you're a dog doesn't mean you can't play. I'm probably butchering that poor cartoon too much.

No, I don't think you are, but it reminded me of one other thing, which is that in this world, you go to work – whether it's at home in your basement like I am now, you know, or in your car, or at an office, at a business – and you have a shot at working with pride and dignity, where every minute of your work isn't measured and quantified, where you have the ability to focus on the work rather than the surveillance of that work and the judgments that other people might make around that minute surveillance, and you can focus on the work itself. I think too often people don't recognize the strangeness of the fact that when you watch TV, when you watch a streaming site, when you watch cable, when you go shopping, all of that stuff is protected by privacy law. And yet most of us spend a good part of our waking hours working, and there are really no federal worker privacy protections. That, for me, is one of the biggest gaps in our sectoral privacy system that we've yet to confront.

But the world that you wanted me to talk about definitely is a world where you can go to work and do that work with dignity and pride, without minute surveillance of everything you do.

Yeah. And I think inherent in that is this, you know, this, this observation that, you know, being watched all the time doesn't work as a matter of humanity, right? It's a human rights issue to be watched all the time. I mean, that's why when they build prisons, right, it's the panopticon, right? That's where that idea comes from, is this idea that people who have lost their liberty get watched all the time.

So that has to be a part of building this better future, a space where, you know, we’re not being watched all the time. And I think you're exactly right that we kind of have this gigantic hole in people's lives, which is their work lives where it's not only that people don't have enough freedom right now, it's actually headed in the other direction. I know this is something that we think about a lot, especially Jason does at EFF.

Yeah, I mean, we write quite a bit about bossware. We've done a variety of research into bossware technology. I wonder if you could talk a little bit about some concrete examples that you've seen where that technology is sort of coming to fruition, if you will – it's being used more and more – and why we need to tackle it, because I think a lot of people listening to this aren't as familiar with it as they could be.

And at the top of this episode we heard you describe your favorite Dr. Seuss tale – about the bees and the watchers, and the watchers watching the watchers, and so on to absurdity. Now can you tell us why you think that’s such an important image?

I think it's a valuable metaphor for the fact that a lot of this surveillance software may not offer as complete a picture as employers might think it does. It may not have the effect that employers think it does, and it may not ultimately do what people want it to do. And so I think that anyone who is thinking about using the software should ask hard questions about ‘is this actually gonna capture what I'm being told it will capture? Does it account for the other 20% of my workers' jobs?’ So, you know, there's always an 80/20 rule, and as with most work, most of what you do is one thing, but there's usually 20% that's another thing. And I think there are a lot of examples where that 20% – like, you know, occasionally using the bathroom – isn't accounted for by the software. And so it looks like the employee's slacking, but actually they're just being a human being. And so I would encourage people to ask hard questions about the sophistication of the software and how it maps onto the realities of work.

Yeah, that's a really accurate way for people to start to think about it, because I think a lot of people really feel that if they can measure it, then it must be useful.


In my own experience, before I worked at EFF, I worked somewhere where, eventually, a sort of bossware-type tool was installed, and it had no connection to the job I was doing.

That’s interesting.

It was literally disconnected.

Can you share the general industry?

It was software. I worked in marketing for a software company, and I was remote – and it was remote way before the pandemic. So, you know, I think bossware has probably increased during the pandemic. I think we've seen that, because people are worried that if you're not in the office, you're not working.


There's no evidence – bossware can't give evidence that that's true. It can just give evidence on, you know, whether you're at your computer –

Right. Whether you're typing.

Whether you're typing. Yeah. And what happened in my scenario, without going into too much detail, was that it mattered what window I was in. And it didn't always – at first it was just, are you at your computer for eight hours? And then it was, are you at your computer in these specific windows for eight hours? And then it was, are you typing in those specific windows for eight hours? The screws kept getting twisted, right, until I was actually at my computer for 12 hours to get eight hours of ‘productive’ work in, as it was called.

And so, yeah, I left that job. Obviously, I work at EFF now for a reason. And it was one of the things that I remember when I started at EFF – part of what I like about what we do is that we think about people's humanity in what they're doing and how that interacts with technology.

And I think bossware is one of those areas where it doesn't, because it is so common for an employer to sort of disengage from the employee and think of them as a tool. It's an area where it's easy to install something, or try to install something, where that happens. So I'm glad you're working on it. It's definitely an issue.

Well, I'm thinking about it, you know, and it's certainly something I care about. My hope is that – you know, the pandemic was horrific, is horrific – my hope is that one of the realizations coming out of so many people going remote is that, particularly for some jobs, a lot of us are lucky to have these jobs where a lot of our time turns on being able to think clearly and carefully about something, and that's a luxury. But particularly for those jobs – my suspicion is for an even broader range of jobs – this idea of a workday where you sit down, work eight hours, and get up, and that is the ideal workday: I don't think that's a maximally productive day. And I think there are some really interesting trials around the four-day work week, and my hope is that, you know, when my kids are older, there will be a recognition that working harder, staying up later, getting up earlier, is not the best way to get the best work from people. People need time to think. They need time to relax. They need time to process things. And so that is my hope, that that is one of the realizations around it. But you're exactly right, Jason, that one of my concerns around this software is this idea that if it can be measured, it must be important. And I think you use a great example, speaking in general here, of software that may presume that if you aren't typing, you're not working, or if you're not in a window, you're not working, when actually you might be doing the most important work – you know, jotting down notes, organizing your thoughts – that lets you do the best stuff, as it were.

Music transition

I want to jump in for a little mid-show break to say thank you to our sponsor.

“How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science and Technology, enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians. So a tip of the hat to them for their assistance.

Now back to our conversation with Alvaro Bedoya.

Privacy issues are of course near and dear to our hearts at EFF and I know that's really the world you come out of as well. Although your perch is a little, a little different right now. We came to the conclusion that we can't address privacy if we don't address competition and antitrust issues. And I think you've come someplace similar perhaps, and I'd love for you to talk about how you think privacy and questions around competition and antitrust intertwine.

So I will confess, I don't know if I have figured it out, but I can offer a few thoughts. First of all, I think that a lot of these claims are not what they seem to be. When companies talk about how important it is to have gatekeeping around app stores because of privacy – and this is one of the reasons I support the bill, I think it's the Blumenthal-Blackburn bill, to change the way app stores are run and kick the tires on that gatekeeping model – I am skeptical about a lot of those pro-privacy, anti-antitrust claims. That is one thing. On the other hand, I do think we need to think carefully about the rules that are put in place backfiring against new entrants and small competitors. And I think a lot of legislators and policymakers in the US and Europe appreciate this and are getting this right, and institute a certain set of rules for bigger companies and different ones for smaller ones. I think one of the ways this can go wrong is when it's just about the size of the company rather than the size of the user base.

I think that if you are, you know, suddenly at a hundred million users, you're not a small company, even if you have a small number of employees. But I do think that those concerns are real, and that policymakers and people in my role need to think about the costs of privacy compliance in a way that does not inadvertently create an unlevel playing field for small competitors.

I will confess that sometimes things that appear to be antitrust problems are privacy problems, in that they reflect legal gaps in the sectoral privacy framework that unfortunately has yet to be updated. I can give one example: there was the recent merger of Amazon and One Medical, and, well, I can't go into the antitrust analysis that may or may not have occurred at the Commission, but I wrote a statement on the completion of the merger which highlighted a gap that we have around the de-identification rule in our health privacy law. For example, people think that HIPAA is the Health Information Privacy Act. It's not; it's actually the Health Insurance Portability and Accountability Act. And I think that little piece of common wisdom speaks to a broader gap in our understanding of health privacy. So I think a lot of people think HIPAA will protect their data, and that it won't be used in other ways by their doctor, by whoever it is that has their HIPAA-protected data. Well, it turns out that in 2000, when HHS promulgated the Privacy Rule in good faith, it had a provision that said, hey, look, we want to encourage the improvement of health services, we want to encourage health research, and we want to encourage public health. And so we're going to say that if you remove these, you know, 18 identifiers from health data, it can be used for other purposes. And if you look at the rule that was issued, the justification for it is that they want to promote public health.

Unfortunately, they did not put a use restriction on that. And so now, any doctor's practice, anyone covered by HIPAA – and I'm not gonna go into the rabbit hole of who is and who isn't, but if you're covered by HIPAA – all they need to do is remove those identifiers from the data.

And HHS is unfortunately very clear that you can essentially do a whole lot of things that have nothing to do with healthcare as long as you do that. And what I wrote in my statement is that would surprise most consumers. Frankly, it surprised me when I connected the dots.

What I'm hearing here, which I think is really important is, first of all, we start off by thinking that some of our privacy problems are really due to antitrust concerns, but what we learn pretty quickly when we're looking at this is, first of all, privacy is used frankly, as a blocker for common sense reforms that we might need, that these giants come in and they say, well, we're gonna protect people's privacy by limiting what apps are in the app store. And, and we need to look closely at that because it doesn't seem to be necessarily true.

So first of all, you have to watch out for the kind of fake privacy argument, or the argument that the tech giants need to be protected because they're protecting our privacy, and we need to really interrogate that. And at the bottom of it, it often comes down to the fact that we haven't really protected people's privacy as a legal matter, right? We ground ourselves in Larry Lessig's four pillars of change, right? Code, norms, laws, and markets. And, you know, what they're saying is, essentially, that the tech giants – that markets – will protect privacy, and so therefore we can't introduce more competition. And I think at the bottom of this, what we find a lot is that the law should be setting the baseline, and then markets can build on top of that. But we've got things a little backwards. And I think that's especially true in health. It's very front and center for those of us who care about reproductive justice, who are looking at the way health insurance companies are now part and parcel of other data analysis companies. And the Amazon/One Medical merger is another one of those cases where, unless we get the privacy law right, it's gonna be hard to get at some of these other problems.

Yeah. And those are the three things that I think a lot about. First, that those pro-privacy arguments that seem to cut against competition concerns are often not what they seem.

Second, that we do need to take into account how one-size-fits-all privacy rules could backfire in a way that hurts small companies, small competitors, who are the lifeblood of innovation and employment, frankly. And lastly, sometimes what we're actually seeing are gaps in our sectoral privacy system.

One of the things that I know you've, you've talked about a little bit is, um, you're calling it a return to fairness, and that's specifically talking about a piece of the FTC’s authority. And I wonder if you could talk about that a little more and how you see that fitting into a, a better world.

Sure. One of the best parts of this job was having this need and opportunity to immerse myself in antitrust. As a Senate staffer, I did a little bit of work on the Comcast-NBC merger – against that merger – for my old boss, Senator Franken. But I didn't spend a whole lot of time on competition concerns. And so when I was nominated, I quite literally ordered antitrust treatises and read them cover to cover.


Well, sometimes it's wonderful and sometimes it's not. But in this case it was. And what you see is this complete two-sided story, where on the one hand you have this really anodyne, efficiency-based description of antitrust, where it is about enforcing abstract laws and maximizing efficiency, and the saying, you know, that antitrust protects competition, not competitors. And you so quickly lose sight of why we have antitrust laws and how we got them.

And so I didn't just read treatises on the law. I also read histories. And one of the things that you realize when you read those histories is that antitrust isn't about efficiency; antitrust is about people. And yes, it's about protecting competition, but the reason we have it is because of what happened to certain people. And so, you know, the Sherman Act – you listen to those floor debates, it is fascinating, because first of all, everyone agrees as to what Congress wanted to do. Congress wanted to rein in the trusts. They wanted to rein in John Rockefeller, J.P. Morgan, the beef trust, the sugar trust, the steel trust – not to mention, you know, Rockefeller's oil trust. The most common concern on the floor of the Senate was what was happening to cattlemen because of concentration in meatpacking plants, and the prices they were getting when they brought their cattle to processors and to market. And then you look at 1914, the Clayton Act. Again, there was outrage, true outrage, about how those antitrust laws had been used: ten out of the first twelve antitrust injunctions in our country post-Sherman were targeted at workers. And not just any workers – they were targeted at rail car manufacturers in Pullman, where it was an integrated workforce, and they were working extremely long hours for a pittance in wages, and they decided to strike.

And some of the first injunctions we saw in this country were used to break their strike. Or how it was used against – I think they're called drayage men, or draymen – in New Orleans, port workers and dock workers in New Orleans, who again were working these 12-hour days for next to nothing in wages. And this beautiful thing happened in New Orleans, where the entire city went on strike.

It was, I think, 30 unions. It included the typographical workers' unions – and if you think that refers to people typing on keyboards, it does. From the people typing on mechanical typewriters to the people unloading ships in the port of New Orleans, everyone went on strike. And they had this organization called the Amalgamated Working Men's Council. And they wanted a 10-hour workday, they wanted overtime pay, and they wanted union shops. They got two out of those three things. But I think it was the board of trade that was so unhappy with it that they persuaded federal prosecutors to sue under Sherman.

And it went before Judge Billings. And Judge Billings said, absolutely, this is a violation of the antitrust laws. And the curious thing about Judge Billings' decision – one of the first Sherman decisions in a federal court – is that, for the proposition that the strike was a restraint on trade, he didn't cite to restraint-of-trade law. He cited to much older decisions about criminal conspiracies and unions to justify his decision.

And so what I'm trying to say is over and over and over again, whenever, you know, you look at the actual history of antitrust laws, you know, it isn't about efficiency, it's about fairness. It is about how small competitors and working people, farmers, laborers, deserve a level playing field. And in 1890, 1914, 1936, 1950, this was what was front and center for Congress.

It's great to end with a deep dive into the original intent of Congress to protect ordinary people and fairness with antitrust laws, especially in this time when history and original intent are so powerful for so many judges. You know, it’s solid grounding for going forward. But I also appreciate how you mapped the history to see how that Congressional intent was perverted by the judicial branch almost from the very start.

This shows us where we need to go to set things right but also that it’s a difficult road. Thanks so much Alvaro.

Well, it's a rare privilege to get to complain about a former employer directly to a sitting FTC commissioner. So that was a very enjoyable conversation for me. It's also rare to learn something new about Dr. Seuss and a Dr. Seuss story, which we got to do. But as far as actual concrete takeaways go from that conversation, Cindy, what did you pull away from that really wide ranging discussion?

It’s always fun to talk to Alvaro. I loved his vision of a life lived with dignity and pride as the goal of our fixed internet. I mean, those are good solid north stars, and from them we can begin to see what it means to use technology in a way that, for example, allows workers to just focus on their work. And honestly, while that gives us dignity, it also stops the kinds of mistakes we’re seeing, like tracking keystrokes or eye contact as secondary trackers that feed all kinds of discrimination.

So I really appreciate him articulating, you know, what kinds of lives we wanna have. I also appreciate his thinking about the privacy gaps that get revealed as technology changes – the story of healthcare and how HIPAA doesn't protect us in the way that we'd hoped it would, in part because I think HIPAA didn't start off in a very good place. But as things have shifted – say, you know, One Medical being bought by Amazon – suddenly we see that the presumption of who your insurance provider was, and what they might use that information for, has shifted a lot, and that the privacy law hasn't kept up.

So I appreciate thinking about it from, you know, both of those perspectives, both, you know, what the law gets wrong and how technology can reveal gaps in the law.

Yeah, that really stood out for me as well, especially the parts where Alvaro was talking about looking into the law in a way that he hadn't had to before. Like you say, that is kind of what we do at EFF, at least part of what we do. And it's nice to hear that we are sort of on the same page, and that there are people in government doing that, there are people at EFF doing that, there are people all over, in different areas, doing that. And that's what we have to do, because technology does change so quickly and so much.

Yeah, and I really appreciate the deep dive he's done into antitrust law, revealing that fairness is a deep, deep part of it, and that this idea that it's only about efficiency – and especially efficiency for consumers only – is ahistorical. And that's a good thing for us all to remember, since we especially these days have a Supreme Court that, you know, likes history a lot, and grounds and limits what it does in history. The history's on our side in terms of bringing competition law, frankly, to the digital age.

Well that’s it for this episode of How to Fix the Internet.

Thank you so much for listening. If you want to get in touch about the show, you can write to us at podcast@eff.org or check out the EFF website to become a member or donate, or look at hoodies, t-shirts, hats or other merch.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. You can find their names and links to their music in our episode notes, or on our website at eff.org/podcast.

Our theme music is by Nat Keefe of BeatMower with Reed Mathis.

And How to Fix the Internet is supported by the Alfred P. Sloan Foundation's program in public understanding of science and technology.

We’ll see you next time.

I’m Jason Kelley…

And I’m Cindy Cohn.


This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by its creators:

Lost track by airtone
Common ground by airtone
Probably shouldn’t by J Lang