Numerous state laws passed this year, and bills proposed in Congress, would set onerous new restrictions on what young people can do online, depriving teenagers of their First Amendment rights to express themselves, access protected speech, engage in anonymous speech, and participate in online communities. They also enforce a presumption that parents of minors do not want them accessing social media. These laws would require people under a certain age, usually 18, to obtain parental consent before making an account on some of the most popular platforms on the planet, many of which are useful for young people to access educational resources, community, political speech, and more. In doing so, they would also violate the same core First Amendment rights of people of all ages by requiring identification to access important global platforms. 

Utah and Arkansas have already passed such laws. When they take effect in March 2024 and September 2023, respectively, anyone under 18 will be required to obtain parental consent before accessing social media. The same would be true nationally under the Protecting Kids on Social Media Act, which was recently introduced in Congress. And though it doesn’t directly require parental consent for social media, the Kids Online Safety Act, too, would require platforms to implement “parental supervision” tools that would force them to verify child-parent relationships. Once the Utah and Arkansas laws are in effect, young people will not be able to create or log in to social media accounts without a complicated approval process that would require parents and guardians to share their private information with social media platforms or third-party verification services. 

There are some differences among these laws. The Arkansas law applies only to new accounts, for example, while Utah’s law also creates dangerous privacy invasions by in some cases requiring parents to have access to children’s social networks and private messages. Still other laws, like one proposed but not yet passed in Connecticut, would apply to those under 16 instead of 18. But parental consent is an aspect of each of these laws, and likely of others that may pass this year. 

How Would Parental Consent Laws Work? 

Parental consent laws like these would force parents to offer some sort of proof that they consent to a child’s use of social media. None of the laws outlines any procedure for how this would be done; they all simply establish consent requirements as if verification were an easy thing to implement. 

But if the laws are meant to have any teeth, an adult would have to prove not only that they approve of a minor’s use of social media, but that they are a parent or guardian of the child. Confirming the identity of a user online is a complex issue. Doing so while maintaining the user’s anonymity is even harder. There is no obvious system for doing this currently in place on social media, nor is there one that would not be onerous to a parent. Put simply: if you had to prove who your parents were to an online platform, how would you do it? The systems that currently exist are often used to satisfy requirements under COPPA (the Children's Online Privacy Protection Act) that protect the privacy of those under 13. Though this list is not comprehensive, current methods include: 

  • Providing a consent form to be signed by the parent and returned by mail, fax, or scan
  • Requiring a parent to use a credit card
  • Having a parent call a telephone number or join a video conference
  • Having a parent provide a form of government-issued ID that is checked against a database
  • Asking knowledge-based challenge questions that would be difficult for someone other than the parent to answer
  • Verifying a picture of a driver's license or other photo ID submitted by the parent and comparing it to a photo of the parent

There are significant problems with each of these methods. First, these systems don’t necessarily take into account a large number of non-traditional families. It’s unclear whether they will function for children who have different last names than a parent, those in foster care, or those whose guardians are other relatives. Second, some of these methods rely on devices or payment cards that not all families have. And third, all of these methods likely require parents to share personal information with third parties or with the platforms themselves. In that way, parental consent laws run into the same privacy problems as age verification laws.

The end result of laws requiring parental consent would be a huge number of young people—particularly the most vulnerable—losing access to these platforms, solely because they cannot satisfy these arbitrary requirements. Still other minors would lose access because their parents do not have the time to complete the verification process, or because their parents do not wish to share more private information with the companies. 

And though it is painful to think about, there are many scenarios where a parent might not wish their child to access social media because they do not have their best interests in mind. In Utah, where the first social media parental consent law was passed earlier this year, there were 9,695 children confirmed as victims of abuse and neglect in 2022, with the largest percentage, 36%, being 12 to 17 years of age. The majority of those responsible were parents. Social media can play a critical role for young people to access resources and support in these circumstances, and laws like these may block them from access entirely. (Utah’s law, which also requires that parents have access to young people’s accounts and messages, creates even more concern in these situations.)

Parental Consent Laws For Social Media Deprive Young People and Adults of their First Amendment Rights

As mentioned, some parental consent requirements already exist in the law. COPPA protects the privacy of people under 13 by requiring parental consent before certain private information is collected about them. COPPA’s concern, however, is with privacy and the exploitation of children’s data. Under these newer parental consent laws, the government’s interest is in protecting children from viewing harmful content—but the overwhelming majority of that content is legal speech, and thus these laws implicate First Amendment rights.

The Supreme Court has repeatedly recognized that states and Congress cannot use concerns about children to ban them from expressing themselves or accessing information. Most recently, in Brown v. Entertainment Merchants Association, the Court ruled that while the State might have “the power to enforce parental prohibitions—to require, for example, that the promoters of a rock concert exclude those minors whose parents have advised the promoters that their children are forbidden to attend, . . . it does not follow that the state has the power to prevent children from hearing or saying anything without their parents' prior consent” (564 U.S. 786, 795 n.3 (2011)). In other words, although states and Congress can give parents tools to help, the state cannot substitute itself for parents and prohibit all minors from engaging in First Amendment activity.

To cite just one example of the type of important legal speech that minors could be blocked from engaging in or receiving, look no further than the most recent activism by young people against police brutality and gun violence. A Nashville protest against police brutality organized by six teenage activists who met on Twitter drew an estimated 20,000 participants in 2020. The Enough! National School Walkout, organized primarily online, involved students walking out of their classes for exactly 17 minutes at over 3,000 schools, and had nearly one million student participants. And the March for Our Lives—also organized primarily via social media—was among the biggest youth-led protests since the Vietnam War era, with somewhere between 200,000 and 800,000 participants, and all of its speakers were high school age or younger. 

But it is not only young people whose rights will be harmed by these laws. All users who do not verify their age to a platform could be blocked from using it. Forcing websites to require visitors to prove their age by submitting information such as government-issued identification would potentially deprive the tens of millions of Americans who do not have government-issued ID of access to much of the internet, deprive all users of anonymous access, and force everyone to share more private data or be blocked from the service. 

Parental Consent Laws For Social Media Deprive Parents of their Right to Raise Children Without Governmental Interference

All of these laws rest on a single assumption: parents should not be allowed to raise their children as they see fit without governmental interference, including by deciding whether their kids may use social media without a ban or parental consent requirement. Though these laws are often described as protecting or upholding “parental rights,” they do the exact opposite: they remove parents’ ability to decide for themselves what they will allow their children to access online, the vast majority of which is legal speech.

Though not perfect, platforms already have safety features that accomplish many of the goals of these laws. Instagram allows parents to supervise the accounts of young people. TikTok allows parents to pair their accounts with their child’s account to set safety features. These features may not stop young people from accessing the platforms, but they do offer an option for parents who are interested in supervising their children online. 

What Should Lawmakers Do?

Laws restricting young people’s access to social media are in vogue in part because of a series of recent studies that some have interpreted to indicate that social media use can cause mental health issues in young people. But other studies show just the opposite: Common Sense Media found that teens who are already at risk or dealing with mental health challenges are more likely to have negative experiences with social media, but those same teens are also more likely to value the benefits of social media, like finding resources, community, or support. 

In another study often cited by lawmakers, conducted by Meta and leaked by whistleblower Frances Haugen, twice as many respondents reported that Instagram alleviated suicidal thinking as said it worsened it; three times as many said it made them feel less anxious as said it made them feel more so; and nearly five times as many reported that Instagram made them less sad as said it made them sadder. 

And just this week many of the country's leading experts on children and social media use released a report declaring that social media “is not inherently beneficial or harmful to young people.” The outcomes of social media use depend on the individual child, and parents must make their own decisions for that use. Given that “the effects of social media likely depend on what teens can do and see online, teens’ preexisting strengths or vulnerabilities, and the contexts in which they grow up,” it is certainly not the place of Congress to set a rigid and universal standard. 

Social media’s toxicity is a real issue. But young people are not the only ones affected, and solutions that limit their rights in egregious ways are not solutions at all. Laws that insert the state into a family’s decisions about how much independence a young person has, and that block young people from accessing legal speech, will not solve these complex social problems, which exist both online and offline. 

According to a 2016 Pew report, less than half of parents used parental controls, and only 16% used monitoring tools on their teen’s phone. Whether that’s because most parents choose to allow their teens freedom online, or because they are unaware of the options, is unclear. But EFF supports the development of easy-to-use tools that let parents and young people decide what young people can access online. We also support laws that strengthen competition and interoperability, which would give us all more options to better control our own experiences online. 

In addition, we support privacy for everyone—including young people—online. Comprehensive data privacy legislation would remove some of the worst incentives that social media platforms have to capture the data and attention of every user in dangerous ways. Laws that force all users to share even more data with platforms move us in the wrong direction. 

In the meantime, EFF will continue to fight for the rights of all people to both share and access legal speech online.