Today almost everything is connected to the internet - from your coffeemaker to your car to your thermostat. But the “Internet of Things” may not be hardwired for security. Window Snyder, computer security expert and author, joins EFF hosts Cindy Cohn and Danny O’Brien as they delve into the scary insecurities lurking in so many of our modern conveniences—and how we can change policies and tech to improve our security and safety.

Window Snyder is the founder and CEO of Thistle Technologies. She’s the former Chief Security Officer of Square, Fastly and Mozilla, and she spent five years at Apple focusing on privacy strategy and features for OS X and iOS. Window is also the co-author of Threat Modeling, a manual for security architecture analysis in software.

Click below to listen to the episode now, or choose your podcast player:



You can also listen to this episode on the Internet Archive.

In this episode, Window explains why malicious hackers might be interested in getting access  to your refrigerator, doorbell, or printer. These basic household electronics can be an entry point for attackers to gain access to other sensitive devices on your network.  Some of these devices may themselves store sensitive data, like a printer or the camera in a kid’s bedroom. Unfortunately, many internet-connected devices in your home aren’t designed to be easily inspected and reviewed for inappropriate access. That means it can be hard for you to know whether they’ve been compromised.

But the answer is not forswearing all connected devices. Window approaches this problem with some optimism for the future. Software companies have learned, after an onslaught of attacks, to  prioritize security. And we can bring the lessons of software security  into the world of hardware devices. 

In this episode, we explain:

  • How it was the hard costs of addressing security vulnerabilities, rather than the sharp stick of regulation, that pushed many tech companies to start prioritizing cybersecurity. 
  • The particular threat of devices that are no longer being updated by the companies that originally deployed them, perhaps because that product is no longer produced, or because the company has folded or been sold.
  • Why we should adapt our best current systems for software security, like our processes for updating browsers and operating systems, for securing newly networked devices, like doorbells and refrigerators.
  • Why committing to a year or two of security updates isn’t good enough when it comes to consumer goods like cars and medical technology. 
  • Why it’s important for hardware creators to build devices so that they will be able to reliably update the software without “bricking” the device.
  • The challenge of covering the cost of security updates when a user only pays once for the device – and how  bundling security updates with new features can entice users to stay updated.

If you have any feedback on this episode, please email

Below, you’ll find legal and technical resources as well as a full transcript of the audio.


Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower.

This podcast is licensed Creative Commons Attribution 4.0 International, and includes the following music licensed Creative Commons Attribution 3.0 Unported by their creators:

  • Drops of H2O (The Filtered Water Treatment ) by J.Lang Ft: Airtone.
  • Warm Vacuum Tube by Admiral Bob Ft: starfrosch
  • Xena's Kiss / Medea's Kiss by mwic
  • reCreation by airtone


Firmware updates:

Internet of Things:

Hacking vulnerabilities through printers:

Cyber attacks on hospitals:

Privacy Harms through Smart Appliances:

Right to Repair:


Window: I bought a coffee mug that keeps my coffee at like 133 degrees, which I'm delighted by. But the first thing I did when I took it out of the package is it wanted a firmware update. I was like, "Yes, awesome." I had it for like two and a half weeks and it wanted another firmware update. 

I don't even know if it was a security issue; it could be a functionality issue, maybe they're making my battery performance last longer. I don't know what the updates do. It's completely opaque. But at least there's an opportunity for that to also include security issues being resolved, if that's a problem for that specific device. So I think there are some folks that are making space for it; they recognize that security updates are a critical path to developing a resilient device that will support the actual lifespan of that device. I mean, how long do you expect to be able to use a cup?

Cindy: That's Window Snyder, and she'll be joining us today to walk us through her ideas about how we can build a more secure world of connected devices without, hopefully, having to start over from scratch. I'm Cindy Cohn, EFF's executive director.

Danny: And I'm Danny O'Brien, special advisor to the EFF. Welcome to How to Fix the Internet, a podcast of the Electronic Frontier Foundation, where we bring you big ideas, solutions, and hope that we can fix the biggest problems we face online.

Cindy: Window, we are so excited to have you join us on How to Fix the Internet. You are someone who is always working towards a real, concrete tech world that supports everyone. And I'm so happy to have you here to share your ideas.

Window: Thanks so much, Cindy. I'm really glad to be here.

Cindy: So we now have internet connected computers in so many things. They're in our thermostats, our doorbells, our fridges, our TVs. Now, you've been thinking about security in our devices for a very long time. So what about this keeps you up at night?

Window: One of the things that I've seen over the years is that while we've built so many different security mechanisms into general purpose operating systems like Windows, or Linux on a server, or OS X on your Mac, we've not seen the same kind of investment in those kinds of security mechanisms in devices. There are good reasons for that. These devices are very often kind of minimal in their functionality. They're trying to do something that's specific to a purpose, and very often they're optimized for performance or for interoperability with different hardware components. And so very often they haven't spent the time to invest in the kinds of security mechanisms that make some of the general purpose OSes or even mobile device OSes more resilient. And so we've kind of got a problem now, because we took those devices and then we attached them to the internet, and all that attack surface is now exposed to the world. Without the same sort of investment in making those devices more resilient, we've got a growing problem.

Cindy: I think some people have seen the images of hackers taking over the cameras in kids' bedrooms and telling kids what to do. I mean, this is, I think, the kind of problem that you have when you've got systems that are really not internet ready or internet protected, that then get connected to the internet.

Window: Exactly.

Danny: So what are the incentives for someone to hack into these things? I mean, we can talk about like those sort of prank or threatening things, but what are the people breaking into this at such a large scale trying to do with this technology?

Window: Well, very often they're opportunistic and someone who finds a vulnerability in your refrigerator and then uses it to get onto your network, they're not trying to spoil your food by changing the temperature in your refrigerator, they're using your refrigerator as a launch point to see if there are any other interesting devices on your network. And some of the attacks that you can deploy on a local network are different than the attacks that you would otherwise have to deploy from outside of the network.

Window: So what they're leveraging is access, very often. Some of these devices actually have access to all kinds of things. So if it's in a corporate network and that embedded device happens to be a printer, right, that printer is basically a data store where you send all your most important documents to be printed, but they're still stored on that printer. And then the printer is mostly uninspectable by the administration team. So it's a great place to camp out if you're an attacker and set up shop and then use that access to launch attacks against the rest of the infrastructure.

Danny: So it's effectively that they're the weakest point in your home network and it's like the entry way for anything else that they want to hack.

Window: Sometimes it's the weakest point, but it's also that it's like a deep dark corner that's difficult to shine a light into, and so it's a great place to hide.

Cindy: The stakes in this can be really high. One of the things that we've heard is these taking over of hospital networks can end up really harming people. And even things like the ability to turn up the heat or turn down the heat or those kinds of things, they can end up being, not just pranks, but really life threatening for folks.

Window: The problem that we described about the refrigerator in your home is really different if we're talking about a refrigerator at a hospital that's intended to keep blood at a certain temperature, for example, right, or medicine at a certain temperature.

Danny: So you said that general purpose computers, the laptops, and to certain extent the phones that we have today, have had 20 years of security sort of concentration. What caused those companies to kind of shift into a more defensive posture?

What encouraged them to do that? Was it regulation?

Window: That would be amazing if we could use regulation to just fix everything, but no, you can't regulate your way out of this. Basically it was pain, and it was pain directed at the wallet. Microsoft was feeling a lot of pain with malware and with worms. I don't know if you guys remember Slammer and Melissa and ILOVEYOU. Their customers were feeling a lot of pain around these viruses and saying, "Hey, Microsoft, you need to get your house in order." And so Bill Gates sent out this memo saying we're going to do something about security. That was around the time that I joined Microsoft. And honestly, we had a tremendous amount of work to do. It was an attempt to boil the ocean. And I was very lucky to be in a situation there where I had experience with it, this is what I came to do, and now I had the support of executives all the way up.

But how do you take a code base that is so rich and has so many features and so much functionality and get to a place where it's got a more modern security posture? So for a general purpose operating system, they needed to reduce attack surface, they needed to get rid of functionality, make it optional so that if there was a vulnerability that was present in one of those components, it didn't impact the entire deployment of... Back then it was hundreds of millions. At this point, it'd be a billion plus devices out there. You want to make sure that you are compartmentalizing access as much as possible. They deployed modern memory mitigation mechanisms that made it difficult to exploit memory corruption issues.

Window: And then they worked at it and it's been 20 years and they're still working at it. There's still problems, but there are not the same kind of problems that you saw in 2002.

Cindy: You said you wished regulation could do something about that. Do you think that is something that's possible? I think oftentimes we worry that the pace of regulation, the pace of legislation is so slow compared to the pace of technological advancement that we will... It can't keep up, and in some ways it shouldn't try to keep up because we want innovation to be racing ahead. I don't know, from where you sit, how do you think about the role of law and regulations in this space?

Window: I think regulations are great for regulating responsibility. Like for example, saying that when a security issue has been identified and it has these kinds of characteristics, let's say it's critical in that it's reachable from the network, and without any special access an attacker's able to achieve arbitrary code execution, that means they can basically do whatever they want with the system at that point, right? Then a security update needs to be made available. That's something they could regulate because it's specific enough and it's about responsibility. This is actually one of the significant differences between the security problems on software systems and hardware devices.

When there's a problem in, let's say, your web browser, and you go to do your update, if the update fails, you can try it again pretty easily. You can figure out for yourself how to get back to that known good state. So with a failure rate of like 3%, 4%, 5%, for the most part users can help themselves out, and they're able to get back to work pretty quickly. But if these low-level components need to be updated and they have a failure rate of like 1%, and that device doesn't come back up, there's no interface. What's the user going to do at this point? They can't even necessarily initiate it to try it again. That might just be a completely bricked device, completely useless at this point.

But if it's a car, now it has to go back to the dealership. And if it's a phone, it has to come back into the shop. But if it's a satellite, that's just gone forever. If it's an ATM, maybe somebody has to physically come out with a USB drive and plug it in and then try and update the firmware for that ATM. For physical devices in the world, it gets really different. And then here's the other end of this. Software developers get away with saying, "Oh, we'll send you security updates for a year or two," or maybe they don't say at all, and it's just kind of at their whim, because there is no regulatory requirement to ship security updates for any period of time. We're all just at the whim of the folks who produce these technology components that we rely on.

But if it's an MRI in a county hospital and they're not getting security updates, but they're vulnerable to something, they're not going to go buy a new MRI, right? We expect these devices to be in use for a lot longer than even a phone or a general purpose computer, a laptop, a web browser. For sure, those things get updated every 10 minutes, right? Both the difficulty of building a highly reliable update mechanism and also the lifespan of these devices completely change the story. So instead of saying it's sufficient to deliver security updates for a year or two years, you now get to this place where it's just like, "Well, how long do you expect a car to be useful?" Right? I expect to be able to drive my car for 10 years, and then I sell it to somebody else, and then you can drive it for 10 years. And until it's in bits, like that car should still be functional, right?

Danny: “How to Fix the Internet” is supported by The Alfred P. Sloan Foundation’s Program in Public Understanding of Science. Enriching people’s lives through a keener appreciation of our increasingly technological world and portraying the complex humanity of scientists, engineers, and mathematicians.

Cindy: I wanted to circle back, as the lawyer in the house, to what I heard you saying about the role that regulation and law can play in this. It's really about liability: you need to build ongoing security into your tool, and if you don't, you need to be responsible for the harm that comes. The liability does work a little that way now, and I think one of the problems we have is that some of the harms that happen are privacy harms, or other kinds of harm that the law is struggling with. But I really like this idea that the way that you have a lever on these companies is to basically make them responsible for the long tail of problems that these things can create, not just the moment of sale.

Window: Absolutely. That's exactly right. But on my end of things, I'm not thinking about liability, because that feels like something that someone like you can probably contribute to that conversation better. In terms of how do we get there? Well, having an update mechanism that is robust enough that you're able to ship a security update, or any sort of update, with confidence that that device is going to come back up. Because that's one of the reasons that it's hard to update devices: if you're worried that the device might not come back up, even if it's like a 1% failure rate, then you don't want to ship updates unless you absolutely have to, because the cost of addressing that issue is potentially more significant than the security issue might be. And they have to make a judgment about that for all the different ways their device might be used.
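One common way device makers build the kind of update mechanism Window describes is a dual-slot ("A/B") scheme: the new firmware image is verified and written to the inactive slot, and the old image is only replaced once the new one boots successfully. This is an editor's illustrative sketch of that idea, not any particular vendor's implementation; the slot names, class, and boot signal are invented for illustration:

```python
import hashlib

class ABUpdater:
    """Sketch of a dual-slot (A/B) firmware update with rollback.

    The device boots from `active`; updates are staged in the other
    slot and only promoted after the new image boots successfully,
    so a failed update can never brick the device.
    """

    def __init__(self):
        self.slots = {"A": None, "B": None}   # slot -> (image, checksum)
        self.active = "A"

    def inactive(self):
        return "B" if self.active == "A" else "A"

    def stage_update(self, image: bytes, expected_sha256: str) -> bool:
        # Verify the download before touching the inactive slot.
        digest = hashlib.sha256(image).hexdigest()
        if digest != expected_sha256:
            return False                      # corrupt image: refuse
        self.slots[self.inactive()] = (image, digest)
        return True

    def try_boot_staged(self, boot_ok: bool) -> str:
        """Attempt to boot the staged slot; keep the old one on failure."""
        staged = self.inactive()
        if self.slots[staged] is None:
            return self.active                # nothing staged
        if boot_ok:
            self.active = staged              # promote the new image
        # On failure, `active` is untouched: automatic rollback.
        return self.active
```

Because the known-good image is never overwritten, even a 1% boot-failure rate becomes recoverable without a technician visit, which is exactly what makes shipping updates frequently less risky.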

But on the updates side of things, since, as you described, they only make their money when you buy the device, and after that it's out the door, having to continuously pay to support security updates is kind of difficult. But there are other ways to support it without actually being responsible for shipping the security update yourself. If you make the code open source, then the community can potentially support it. If you make the code available to a third party, then that third party can provide security updates for those issues.

But even for the device manufacturer themselves, getting to a place where you have a highly reliable security update mechanism could be used, not just to deliver security updates, but functional updates. And then you could potentially have an ongoing relationship with that party who purchased the device by selling them new functionality once the device is already out the door, like they could sell new features for those existing devices. And Tesla has really embraced that, right? They're doing great with it, that you buy a car and then later you can buy a new functionality for your car. Fantastic.

Cindy: So to get to a world where our devices are really secure, I am hearing three things: a lot more open source, a lot more interoperability, and in general a lot more ability for third parties to repair or update or give us more security than we have now. Is that right?

Window: I think actually the most critical component is going to be leveraging existing security mechanisms that have been built for resilience and incorporating those into these devices, which is actually what I'm building right now. That's what Thistle Technologies is doing, we're trying to help companies get to that place where they've got modern security mechanisms in their devices without having to build all the infrastructure that's required in order to deliver that. 

So the industry is in agreement, for the most part, that you should not implement your own cryptographic libraries, right? That you should leverage an existing cryptographic library that is tested, that was implemented by folks who understand cryptography and, more importantly, understand how cryptographic implementations can fail, especially when they're being attacked. This is actually true for security mechanisms way beyond cryptography. And that's why I think that building these security-sensitive mechanisms in one place and letting folks pick and choose and incorporate those into their devices makes sense. And I think this is actually how devices are going to get there. Maybe some of those will be open source projects, and maybe some of those will be commercial projects like mine. But the point is that we don't all have to go it alone and reinvent the wheel over and over again; we can get to a place where we've got security-sensitive systems that are built once and incorporated into all these different kinds of systems that don't have them yet.

Danny: So a lot of what you're describing seems to be like building or slotting in robust software that is built with security in mind. But one thing I hear from security researchers all the time is that security is a process. And is there some way that a small hardware manufacturer, right, someone who just makes light bulbs or just makes radios, if they still exist, what is part of the process there? What do they have to change in their outlook?

Window: So it's the same for any small development team: the most important things you want to do are still true for software and for hardware, and that is to reduce the attack surface. If there's functionality that's only in use by a small number of folks in your deployment, make it modular, so that those folks can have the functionality but not everybody has to have all of the risk. And move to memory-safe languages, higher-level languages, so that memory management is not handled by the developers, because that reduces the ability for an attacker to take advantage of the problems that can result in memory corruption.
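One concrete reading of "make it modular" is a design where optional features are off by default and never even instantiated unless a deployment enables them, so a bug in a rarely used handler is unreachable on devices that don't use it. This is an editor's sketch of that pattern; the registry, feature names, and handlers are all invented for illustration:

```python
class Device:
    """Sketch: features are registered centrally but only the ones a
    deployment enables are ever instantiated, so each device's attack
    surface is limited to the functionality it actually uses."""

    _registry = {}                          # feature name -> handler factory

    @classmethod
    def register(cls, name):
        def deco(factory):
            cls._registry[name] = factory
            return factory
        return deco

    def __init__(self, enabled_features):
        # Instantiate only what this deployment asked for.
        self.handlers = {
            name: self._registry[name]() for name in enabled_features
        }

    def handle(self, feature, payload):
        handler = self.handlers.get(feature)
        if handler is None:
            # Disabled features are unreachable, not just unused.
            raise PermissionError(f"feature {feature!r} not enabled")
        return handler(payload)

@Device.register("echo")
def make_echo():
    return lambda payload: payload

@Device.register("legacy_parser")       # risky, rarely needed
def make_legacy():
    return lambda payload: payload.decode("latin-1")

# Most deployments ship with the minimal feature set.
minimal = Device(enabled_features=["echo"])
```

A vulnerability in `legacy_parser` then only matters for the small number of deployments that opted into it, which is the risk split Window describes.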

Danny: And when you say attack surface here, you're sort of describing the bits of this technology which are vulnerable and are kind of exposed, right? You're just talking about making them less exposed and less likely to damage everything else if they break.

Window: Yeah. So if you think about your body as having some sort of attack surface, like as we're walking around in the world we can get infections through our mucus membranes, like our eyes, our nose, our mouth, and so on. So it reduces our risk if we wear a mask, it reduces, let's say, the risk for a healthcare worker if they're also wearing like a face shield to prevent somebody coughing on them and having it get in through their eyes, etc. So reducing your attack surface means providing a filter or a cover.

The attacker has a harder time coming in through the wall; they're going to come in through the doors. And so if you think of these services where you're listening as doors, then you want to make sure that you have authentication really early on in your protocol, so that there's less opportunity for them to say something that could be interpreted the wrong way by your computer, and now they're executing code on your system, for example. And then that same kind of idea, but applied through all the different components in the system.
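"Authentication really early on in your protocol" can be made very literal: verify a message authentication code over the raw bytes before any parsing happens, so unauthenticated input never reaches complex (and potentially buggy) code. A minimal editor's sketch of that ordering, assuming a pre-shared key and an invented wire format (32-byte HMAC-SHA256 tag followed by a JSON payload):

```python
import hashlib
import hmac
import json

SECRET_KEY = b"device-provisioned-secret"   # hypothetical shared key

def seal(payload: bytes) -> bytes:
    """What a legitimate sender does: tag the payload, then send it."""
    return hmac.new(SECRET_KEY, payload, hashlib.sha256).digest() + payload

def handle_message(wire: bytes):
    """Authenticate first, parse second.

    The JSON parser (the 'door' into complex code) only ever sees
    bytes that have already passed authentication.
    """
    if len(wire) < 32:
        raise ValueError("short message")
    tag, payload = wire[:32], wire[32:]
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    # Constant-time comparison, before any parsing at all.
    if not hmac.compare_digest(tag, expected):
        raise PermissionError("unauthenticated message dropped")
    return json.loads(payload)              # complex code runs only now
```

The design point is the ordering: an attacker probing the parser for bugs never gets past the cheap, well-understood check at the front of the protocol.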

Cindy: That's great. I love this idea that you're trying to shrink down 25 years worth of work in operating systems into a little box that somebody can plug into their internet connected thing. I think that's awesome.

Window: It's better than trying to wait 25 years for everyone else to catch up, right?

Cindy: Absolutely. So what are the values we're going to get if we get to this world? What is it going to feel like, what is it going to seem like, when we have a world in which we've got more protected devices?

Window: I think first we'll feel some pain, and then we'll have devices that we're able to have more confidence in, that we might feel more comfortable sharing very personal information with, because we're able to evaluate what they're going to do with that data. Maybe the company that's building this thing has a very clear policy that is easy to understand, that doesn't require 10 pages of legal language designed to be as conservative as possible and reserve every possible right for the company to do whatever they want with your information. When folks understand that, then they're more able to use it. One of the things that I'm thinking about constantly, about every device I bring into my home, is: how is this increasing my attack surface? Do I need a separate network to manage these devices in my house? Yes, I do, apparently.

Window: But is that reasonable? No, it's not reasonable. People should be able to bring home a device and use it and not worry that the microphone on their television is listening to them, or that an attacker could leverage a kid's baby camera to capture pictures of the inside of their house. People want to feel comfortable and safe when they use these things. And that's just consumers. On the enterprise side, folks want to be able to understand the risk for their organization, make reasonable trade-offs, and deploy their resources in things that build their business, not just things that allow the business to function. And security is one of those things where, if you have to spend money securing your infrastructure, then you're not spending money creating all the functionality that the infrastructure exists to serve.

Danny: So we have this sort of utopia where we have all these devices and they talk to one another and before I get home, my stove has set the water boiling and it's completely safe and okay. Then we have kind of another vision that I think people have of a secure world requiring this sort of bureaucracy, right? You know everything is locked down, I maybe have to go and sign out something at work, or there is someone who tells me, "I can't install this software." 

In order to feel safe, do we have to head to that second future? Or do we still get all our cool gadgets in your vision of how this plays out?

Window: That's the problem, right? We want to be able to install the software and have it be safe, so that it's not going to create a new vulnerability in your corporate network. When you tell folks, "Oh, be careful clicking links in email," or whatever, it's like, why on earth should that be a problem? If your email allows you to click on a link to launch another application or a web browser, then that should be safe. The problem's not with a user randomly clicking on things; they should be able to randomly click on things and have it not compromise their device. The problem is the web browser, which, for all the functionality that the web has brought us, is honestly a terrible idea: "I'm going to take this software. I'm going to put all my passwords and credit cards in it. And then I'm going to go visit all these different servers, and I'm going to take code from these servers and execute it on my device locally, and hope it doesn't create problems on my system."

That is a really difficult model to secure. So you should be able to go anywhere and install software from wherever. That would be the ideal that if we can get to a place where we have a high degree of compartmentalization, we could install software off the internet, it runs in a sandbox that we have a high degree of confidence is truly a high degree of compartmentalization away from everything else that you care about in the system. You use that functionality, it does something delightful, and you move on with your life without ever having to think about like, "Is this okay?" But right now you have to spend a lot of time thinking about like, "Do I want to let it in my house? What is it going to do?" So the ideal version is you just get to use your stuff and it works.

Danny: That is a vision of a future that I want.

Cindy: Yeah, me too. Me too. And I really love the embrace of like, we should have this cool stuff, we should have this cool functionality, new things shouldn't be so scary, right? It doesn't need to be that way. And we can build a world where we get all the cool stuff and we have this ongoing innovation space without all the risk. And that's the dream. Sometimes I talk to people and they're like, "Just don't do it. Don't buy-

Danny: Get off the internet.

Cindy: Yeah. Get off the internet. Don't buy a smart TV, don't buy this, don't buy that, don't buy that. And I totally understand that impulse, but I think that you're talking about a different world, one where we get all the cool stuff, but we also get our security too.

Window: Yeah. Wouldn't you love to be able to connect with your friends online, share pictures of the family and feel like no one is collecting this to better create a dossier about how to better advertise to you? And then where does this sit and how long does it sit there for, and who's it shared with? And what's it going to mean? Are they identifying who's influential in my network so they can tell me that so and so really enjoyed this new brand of cookware, right? I would love to be able to communicate freely in public or in forums and only worry about humans instead of like all the different ways this data is going to be collected and hashed and rehashed and used to create a profile of me that doesn't necessarily represent who I am today or who I might be 10 years from now.

Cindy: Well, and the nice thing about fixing the security is that that puts that control back into our hands, right? A lot of the things that are consumer surveillance stuff are really side effects of the fact that these systems are not designed to serve us, they're not designed to secure us. And so some of these business models kind of latch onto that and ride along with that. So the good thing about more security is that we not only get security from the bad guys, we also get maybe some more control over some of the companies who are riding along with the insecure world as part of their business models.

Danny: Well, thank you Window for both making us safer in the present day and building this secure and exciting future.

Cindy: Yeah. Thank you so much, Window.

Window: Thanks for having me, guys. It's definitely been a lot of fun.

Cindy: That was just great. And what I really love about Window is that sometimes security researchers are all about, no, don't do this, don't do that, this is dangerous, this is scary. And Window is firmly on the side of: we need to have all our cool devices, even our coffee mug that connects to the internet for some strange reason, but we need them to be secure too.

Danny: Yeah. She's always very modest, and I think she actually has been the calm, collected voice at Apple and Microsoft as they slowly try and fix these things. I think one of the things I took away from that is that it is a different world, right, but we have got some knowledge that we can drop into that new place.

Cindy: Yeah. And I love the idea of kind of taking all that we've learned in the 25 years of trying to make operating systems secure and shrink it down into modules that people can use in their devices in ways that will stop us from having to spend the next 25 years figuring out how to do security in devices.

Danny: Something else that struck me was that the thing that prompted the solution, or the attempt to fix the security problems of big PCs and desktops and later phones, was this Bill Gates memo. And it struck me that the challenge here is that there is not one company, there is not a monopoly, and there is no Bill Gates to write the memo to the Samsungs and the Ankers of this world. So I don't know how you do it, but I have a feeling Window does.

Cindy: Well, I think she's working on it, and I think that's great, but she also pointed to a place that regulation might come in as well, and specifically on the idea of liability, right? Like making sure that the companies that put this out are accountable, not just at the point of sale, but over the long run for the risks that they create. And then hopefully that will help spur a kind of long relationship where, not just the company that sold you the device, but a whole bunch of other companies and people, and hobbyists and other people can come in and help you keep your device secure over the long run. And also, as she pointed out, maybe even give you some additional features. So once again, we see interoperability being key to this better future, even in this other place where we're talking about just simply making our devices not so dangerous.

Danny: Yeah. The solution to not having one big company is not to put one big government or one new big company in charge, but to share the knowledge and communicate that round. And if somebody can't, or doesn't have the resources to take that responsibility, there's someone else who can represent the consumer or the hospital, and step in and fix and repair those problems.

Cindy: I think it was interesting to me how this conversation kind of started off as a hardcore security modeling kind of thing, but we ended up again with adversarial interoperability and right-to-repair being so central to how we fix this problem. And I really appreciate how these things are starting to connect together in a single story.

Danny: Music for How to Fix the Internet was created for us by Reed Mathis and Nat Keefe of BeatMower. This podcast is licensed Creative Commons Attribution 4.0 International, and includes music licensed Creative Commons Attribution 3.0 Unported by their creators. You can find their names and links to their music in our episode notes or on our website, where you'll also find more episodes, can learn about these issues, and can donate to become a member of EFF, as well as lots more. Members are the only reason we can do this work. Plus you can get cool stuff like an EFF hat, an EFF hoodie, or an EFF camera cover for your laptop camera. How to Fix the Internet is supported by the Alfred P. Sloan Foundation's Program in Public Understanding of Science and Technology.

Danny: I'm Danny O'Brien.
Cindy: And I'm Cindy Cohn. Thanks for listening.