The latest version of the Kids Online Safety Act (KOSA) is focused on removing online information that people need to see—people of all ages. Letting governments—state or federal—decide what information anyone needs to see is a dangerous endeavor. On top of that, this bill, supposedly designed to protect our privacy, actually requires tech companies to collect more data on internet users than they already do.
EFF has long supported comprehensive privacy protections, but the details matter. KOSA gets the details consistently wrong, and that’s why we’re calling on members of Congress to oppose this bill.
Although KOSA has been revamped since lawmakers introduced it in February, and improved slightly, it’s still a dangerous bill that presents censorship and surveillance as a solution to some legitimate, and some not-so-legitimate, issues facing young internet users today.
KOSA is a sweeping update to the Children's Online Privacy Protection Act, also known as COPPA. COPPA is the reason that many websites and platforms ask you to confirm your age, and why many services require their users to be older than 13—because the laws protecting data privacy are much stricter for children than they are for adults. Legislators have been hoping to expand COPPA for years, and there have been good proposals to do so. KOSA, for its part, includes some good ideas: more people should be protected by privacy laws, and the bill expands COPPA's protections to include minors under 16. That would do a world of good, in theory: the more people we can protect under COPPA, the better. But why stop with protecting the privacy of minors under 16? EFF has long supported comprehensive data privacy legislation for all users.
Another good provision in KOSA would compel sites to allow minor users to delete their account and their personal data, and restrict the sharing of their geolocation data, as well as provide notice if they are tracking it. Again, EFF thinks all users—regardless of their age—should have these protections, and expanding them incrementally is better than the status quo.
But KOSA’s chief focus is not to protect young people’s privacy. The bill’s main aim is to censor a broad swath of speech in response to concerns that young people are spending too much time on social media, and too often encountering harmful content. KOSA requires sites to “prevent and mitigate mental health disorders,” including the “promotion or exacerbation” of “self-harm, suicide, eating disorders, and substance use disorders.” Make no mistake: this is a requirement that platforms censor content.
This sweeping set of content restrictions wouldn’t just apply to Facebook or Instagram. Platforms covered by KOSA include “any online platform that connects to the internet and that is used, or is reasonably likely to be used, by a minor.” As we said before, this would likely encompass everything from Apple’s iMessage and Signal to web browsers, email applications and VPN software, as well as platforms like Reddit, Facebook, and TikTok—platforms with wildly different user bases and uses, and with hugely varying abilities, and expectations, to monitor content.
A huge number of online services would thus be forced to make a choice: overfilter to ensure no one encounters content that could even arguably be construed as harmful, or raise the age limit for users to 17. Many platforms may well do both.
Let’s be clear about the dangerous consequences of KOSA’s censorship. Under its vague standard, adults and children alike will lose access to medical and health information online. This is because it will be next to impossible for a website to make case-by-case decisions about which content promotes self-harm or other disorders and which provides necessary health information and advice to those suffering from them. This will disparately impact children who lack the familial, social, financial, or other means to obtain health information elsewhere. (Research has shown that a large majority of young people have used the internet for health-related research.)
Another example: KOSA also requires these services to ensure that young people do not see content that exacerbates a substance use disorder. On the face of it, that might seem fairly simple: just delete content that talks about drugs, or hide it from young people. But how do you find and label such content? Put simply: not all content that talks about drugs exacerbates their use.
There is no realistic way to find and filter only that content without also removing a huge swath of content that is beneficial. For just one example, social media posts describing how to use naloxone, a medication that can reverse an opioid overdose, could be viewed either as promoting self-harm—because it could be seen to minimize the dangers of drug use—or as providing necessary health information. But KOSA’s vague standard means that a website owner is in a better position legally if they remove that information, which avoids a potential claim later that the information is harmful. That will reduce the availability of important and potentially life-saving information online. KOSA pushes website owners toward government-approved censorship.
To ensure that users are the correct age, KOSA compels vast data collection efforts that perversely result in even greater potential privacy invasions.
KOSA would authorize a federal study on creating a device- or operating-system-level age verification system, “including the need for potential hardware and software changes.” The end result would likely be an elaborate age-verification system, run by a third party, that maintains an enormous database of all internet users’ data.
Many of the risks of such a program are obvious. It would require every user—including children—to hand private data over to a third party simply to use a website, if that user ever wants to see beyond the government’s “parental” controls.
Moreover, the bill lets Congress decide what’s appropriate for children to view online. This verification scheme would make it much harder for actual parents to make individual choices for their own children. Because it’s so hard to differentiate between minors having discussions about many of these topics in a way that encourages them, as opposed to a way that discourages them, the safest course of action for services under this bill is to block all discussion and viewing of these topics by younger children and teenagers. If KOSA passes, instead of allowing parents to make the decision about what young people will see online, Congress will do it for them.
A recent study on attitudes toward age verification showed that most parents “are willing to make an exception or allow their child to bypass the age requirement altogether, but then require direct oversight of the account or discussions about how to use the app safely.” Many also fudge the numbers a bit, to ensure that websites don’t have the specific birthdays of their children. With the hard-wired, national age verification system imagined by KOSA, it will be much harder, if not impossible, for parents to decide for themselves what sites and content a young person can encounter. Instead, the algorithm will do it for them.
KOSA also fails to recognize the reality that some parents do not always have their children’s best interests in mind, or are unable to make appropriate decisions for them. Those children suffer under KOSA’s paternal regime, which requires services to set parental controls to their highest levels for users under thirteen.
KOSA is a Poor Substitute for Real Privacy Online
KOSA’s attempt to improve privacy and safety will in fact have negative impacts on both. Instead of using super-powered age-verification to determine who gets the most privacy, and then using that same determination to restrict access to huge amounts of content, Congress should focus on creating strict privacy safeguards for everyone. Real privacy protections that prohibit data collection without opt-in consent address the concerns about children’s privacy while rendering age-verification unnecessary. Congress should get serious about protecting privacy and pass legislation that creates a strong, comprehensive privacy floor with robust enforcement tools.