This post was co-authored by legal intern Kate Prince.

Jump to our detailed report about GoGuardian and student monitoring tools.

GoGuardian is a student monitoring tool that watches over twenty-seven million students across ten thousand schools, but what it does exactly, and how well it works, isn’t easy for students to know. To learn more about its functionality, accuracy, and impact on students, we filed dozens of public records requests and analyzed tens of thousands of results from the software. Using data from multiple schools in both red and blue states, what we uncovered was that, by design, GoGuardian is a red flag machine—its false positives heavily outweigh its ability to accurately determine whether the content of a site is harmful. This results in tens of thousands of students being flagged for viewing content that is not only benign, but often, educational or informative. 

We identified multiple categories of non-explicit content that are regularly marked as harmful or dangerous, including:

- college application sites and college websites
- counseling and therapy sites
- sites with information about drug abuse
- sites with information about LGBTQ issues
- sexual health sites
- sites with information about gun violence
- sites about historical topics
- sites about political parties and figures
- medical and health sites
- news sites
- general educational sites

To illustrate the shocking absurdity of GoGuardian's flagging algorithm, we have built the Red Flag Machine quiz. Built from real GoGuardian data, the quiz presents visitors with websites that were flagged and asks them to guess which keywords triggered the alert. We have also written a detailed report on our findings, available online here (and downloadable here).
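To see why keyword-based flagging produces so many false positives, consider a minimal sketch of naive substring matching. This is our own illustration, not GoGuardian's actual code, and the keyword list is hypothetical: any blocklist term appearing anywhere in a page's text triggers a flag, regardless of context.

```python
# Illustrative sketch only -- not GoGuardian's actual algorithm.
# A hypothetical blocklist of "harmful" terms:
FLAGGED_KEYWORDS = {"drug", "gun", "suicide", "sex"}

def naive_flag(page_text):
    """Return the blocklist terms found anywhere in the page text."""
    text = page_text.lower()
    return {kw for kw in FLAGGED_KEYWORDS if kw in text}

# A benign health-class page still trips three keywords
# ("sex" matches inside "sexual"):
page = ("This lesson covers drug abuse prevention, sexual health "
        "education, and suicide awareness resources for teens.")
print(sorted(naive_flag(page)))  # ['drug', 'sex', 'suicide']
```

Context-blind matching like this is exactly how an educational page about suicide prevention or drug abuse ends up flagged as dangerous.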

A screenshot of the front page of the red flag machine quiz and website.

But the inaccurate flagging is just one of the dangers of the software. 

How Does Student Monitoring Software Work? 

Along with apps like Gaggle and Bark, GoGuardian is used to proactively monitor primarily middle and high school students, giving schools access to an enormous amount of sensitive student data which the company can also access. In some cases, this has even let teachers view webcam footage of students in their own homes without their consent. In others, this sort of software has mischaracterized ordinary student behavior as dangerous or even outed students to their families.

Though some privacy invasions and errors may be written off as the unintentional costs of protecting students, even commonly used features of these monitoring apps are cause for concern. GoGuardian lets school officials track trends in student search histories. It identifies supposedly “at risk” students and gathers location data on where and when a device is being used, allowing anyone with access to the data to create a comprehensive profile of the student. It flags students for mundane activity, and sends alerts to school officials, parents, and potentially, police.

These companies tout their ability to make the lives of administrators and teachers easier and their students safer. Instead of having to dash around a classroom to check everyone's computers, teachers can sit down and watch a real-time stream of their students' online activity. They can block sites, get alerts when students are off task, and directly message those who might need help. And during the pandemic, many of these tools were offered to schools free of charge, exacerbating the surveillance while minimizing students' and parents' opportunity to push back. Along with the increased use of school-issued devices, which are more common in marginalized communities, this has created an atmosphere of hypercharged spying on students.

This problem isn’t new. In 2015, EFF submitted a complaint to the FTC that Google’s Apps for Education (GAFE) software suite was collecting and data mining school children’s personal information, including their Internet searches. In a partial victory, Google changed its tune and began explicitly stating that even though they do collect information on students’ use of non-GAFE services, they treat that information as “student personal information” and do not use it to target ads.

But the landscape has shifted since then. The use of “edtech” software has grown considerably, and with monitoring-specific apps like GoGuardian and Gaggle, students are being taught by our schools that they have no right to privacy and that they can always be monitored.  

Knowing how you're being surveilled—and how accurate that surveillance is—is the first step to fighting back and protecting your privacy. This post is a rundown of some of the most common GoGuardian features.

GoGuardian Admin is God Mode for School Administrators

School administrators using GoGuardian's "Admin" tool have nearly unfettered access to huge amounts of data about students, including browsing histories, documents, videos, app and extension data, and content filtering and alerts. It's unclear why so much data is available to administrators, but GoGuardian makes it easy for school admins to view detailed information about students that could follow them for the rest of their lives. (The Center for Democracy and Technology has released multiple reports indicating that student monitoring software like GoGuardian is primarily used for disciplinary, rather than safety, reasons.) Administrators can also set up "alerts" for when a student is viewing "offensive" content, though it's unclear what counts as "offensive," or to whom. These alerts could be used to stifle a student's First Amendment right to receive information. For example, if a school decides that anything from an opposing political party or anything related to the LGBTQ community is harmful, it can prevent students from viewing it. These flags and filters can be applied to all students or individualized for specific students, and allow administrators to see everything a student looks at online.

GoGuardian claims that it de-identifies data before sharing it with third parties or other entities. But this data can easily be traced back to an individual. This means advertisers could target students based on their internet usage, something explicitly prohibited by federal law and by the Student Privacy Pledge taken by many edtech companies.

GoGuardian Teacher: A 24/7 Lesson in Surveillance

GoGuardian gives teachers a real-time feed of their students' screens and allows them to block any website for individuals or groups. Students have no way of opting out of these monitoring sessions (which GoGuardian calls "scenes"), and are never asked to confirm that they know they are being monitored. The only indication is the appearance of an extension in their browser.

This monitoring can happen whether or not a student is on school grounds. "Scenes" can last for eight hours and can be scheduled in advance to start at any time of day or night, and if a teacher schedules each scene to start immediately after the previous one ends, they could monitor a student 24/7. During a scene, GoGuardian collects minute-by-minute records of what is on a student's screen and what tabs they have open, all of which can be easily viewed in a timeline.

GoGuardian takes no responsibility for these potential abuses of its technology, instead putting the onus on school administrators to anticipate abuse and put systems in place to prevent it. In the meantime, GoGuardian is still accessing and collecting the data.

GoGuardian Beacon: Replacing Social Workers with Big Brother

GoGuardian's "Beacon" tool supposedly uses machine learning and AI to monitor student behavior for flagged key terms, track browsing history, and assess how likely a student is to harm themselves or others. GoGuardian claims it can detect students who are at risk and "identify students' online behaviors that could be indicative of suicide or self-harm." Instead of investing in social workers and counselors, people who are trained to detect this same behavior, GoGuardian claims that schools can rely on its algorithms.

GoGuardian touts anecdotal evidence of the system working, but our research suggests that Beacon's flagging may not be much more accurate than the company's other flagging features. And while schools can determine to whom Beacon sends alerts, if those staffers are not trained in mental health, they may not be able to judge whether an alert is accurate. This could lead to inappropriate interventions by school administrators who erroneously believe a student is in the "active planning" stage of a harmful act. If a student is accused of planning a school shooting when in reality they were researching weapons used during historical events, or of planning a suicide when they were not, that student will likely not trust the administration in the future and will feel their privacy has been violated. You can learn more about GoGuardian Beacon from this detailed documentary by VICE News.

Protecting Students First

Schools should be safe places for students, but they must also be places where students feel safe exploring ideas. Student monitoring software not only hinders that exploration, but endangers those who are already vulnerable. We know it will be an uphill battle to protect students from surveillance software. Still, we hope this research will help people in positions of authority, such as government officials and school administrators, as well as parents and students, push the companies that make this software to improve it, or push schools to abandon these tools entirely.

Take the Red Flag Machine quiz.

Learn more about our findings in our detailed report.  
