Today we are launching TOSsed Out, a new iteration of EFF’s longstanding work in tracking and documenting the ways that Terms of Service (TOS) and other speech-moderation rules are unevenly and unthinkingly applied to people by online services. As a result of these practices, posts are deleted and accounts banned, harming those for whom the Internet is an irreplaceable forum to express ideas, connect with others, and find support.

TOSsed Out continues in the vein of Onlinecensorship.org, which EFF launched in 2014 to collect reports from users in an effort to encourage social media companies to operate with greater transparency and accountability as they regulate speech. TOSsed Out will highlight the myriad ways that all kinds of people are negatively affected by these rules and their uneven enforcement.

Last week the White House launched a tool for people to report incidents of “censorship” on social media, following the President’s repeated allegations that these companies apply their rules with a bias against conservatives. In reality, commercial content moderation practices negatively affect all kinds of people, especially people who already face marginalization. We’ve seen everything from Black women flagged for sharing their experiences of racism to sex educators whose content is deemed too risqué. TOSsed Out will show that trying to censor social media at scale ends up removing legal, protected speech that should be allowed on platforms.

TOSsed Out’s debut today is the result of brainstorming, research, design, and writing work that began in late 2018, after we saw an uptick in takedowns resulting from increased public and government pressure, as well as the rise of automated tools. A diverse group of entries is being published today, including a Twitter account parodying Beto O’Rourke that was deemed “confusing” or “deceptive,” a gallery focused on creating awareness of the diversity of women’s bodies, a Black Lives Matter-themed concert, and an archive aimed at documenting human rights violations.

These examples, and the ones added in the future, make clear the need for companies to embrace the Santa Clara Principles. We helped create the Principles to establish a human rights framework for online speech moderation, require transparency about content removal, and specify appeals processes to help users get their content back online. We call on companies to make that commitment now, rather than later.

People rely on Internet platforms to share experiences and build communities, and not everyone has good alternatives to speak out or stay in touch when a tech company censors or bans them. Rules need to be clear, processes need to be transparent, and appeals need to be accessible.

TOSsed Out Entries Launched Today: