The online conversations that bring us closer together can help build a world that’s more free, fair, and creative. But talking to each other only works when everyone’s human rights are respected, including the right to speak privately. The best tool to defend that right in the digital world is end-to-end encryption.

In 2022, we fought back against large-scale attempts by governments to undermine secure and private online speech. The U.S. Senate introduced a new version of the toxically unpopular EARN IT Act, a bill that would push companies to drop strong encryption by threatening the removal of key legal protections for websites and apps. EFF supporters spoke up, and the bill was stopped in the Senate once again, though not before an unfortunate committee vote endorsing it.

In the U.K., Parliament debated an Online Safety Bill that would mandate that tech providers use “accredited software” to constantly scan user content for illegal material. And an even larger threat emerged in the European Union, where the European Parliament is debating a regulation that could lead to mandatory government scanning of every private message, photo, and video.

All three of these proposals are pushed by law enforcement agencies in their respective jurisdictions, and all rest on the same reasoning: preventing child abuse. But constant surveillance doesn’t keep adults or kids safer. Minors, too, need to be able to have private conversations with trusted adults, and a device with a built-in backdoor can’t offer that.

We’ll continue to confront and oppose these proposals everywhere they pop up. We know that a world of broken encryption—whether it gets called a “ghost,” key escrow, or client-side scanning—is a world with less security and less privacy for everyone.

On Privacy, EU Leaders Consider a Huge Step Backwards

The European Commission, the executive branch of the European Union, is pushing ahead with a proposal that would compel tech companies to inspect user messages, including messages currently protected by encryption. If this proposal passes, tech platforms flagged by police will be subject to “detection orders” requiring them to scan private user messages and photos and turn matches over to governments.

We’re working with European partners to fight against this potentially disastrous “Chat Control” proposal. You can learn more on our joint website, Stop Scanning Me. Following our advocacy, the proposed regulation has already been rejected by the Austrian Parliament, and the German Federal Commissioner for Data Protection has called the proposal “incompatible with European values and data protection.” 

This proposal also has privacy-violating provisions that conflict with other aspects of European law, including the Digital Services Act. 

The debate in the European Parliament is still in its early stages. As more EU residents learn about this proposal, they’ll see that it’s wholly incompatible with their values. EFF will continue to lobby Members of the European Parliament (MEPs) to do the right thing. 

Law enforcement in democratic societies should not have access to unlimited, perpetual records of human conversation. The citizens they are sworn to protect do not want to live in a never-ending virtual line-up that does far more harm than good.

Learning More About Broken Scanning Systems 

People’s files shouldn’t be constantly scanned, especially when they aren’t reasonably suspected of a crime. One reason is that the scanners don’t work right. In August, the New York Times reported on two fathers who were falsely accused of child abuse based on Google’s scanning software. Google didn’t back down or reinstate their accounts, even after the fathers were cleared by police.

Because of Google’s false accusations, police could have chosen to investigate these fathers for unrelated crimes, like drug possession or even copyright infringement. False accusations can be even more harmful when they are sent to a community or nation with a corrupt or biased police force, or are leveled against a member of a disfavored minority group. 

Evidence is mounting that even scanning systems limited to searching against authorities’ databases of known images of child abuse don’t work right. LinkedIn and Facebook have both examined material that their automated systems flagged as child sexual abuse material, and found accuracy rates of less than 50%. Another recent report shows that only about 20% of the reports that U.S. authorities referred to Irish police as child abuse material were accurate.
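
Low accuracy is what basic probability predicts when a scanner, however well-tuned, is pointed at billions of innocent messages. The back-of-the-envelope sketch below illustrates the base-rate effect; every number in it is an illustrative assumption, not a figure from the reports above.

```python
# Illustrative base-rate calculation: all rates below are assumptions
# chosen for the example, not figures from any real scanning system.

messages_scanned = 1_000_000_000   # messages scanned per day (assumed)
prevalence = 1e-6                  # fraction that is actually abusive (assumed)
sensitivity = 0.99                 # share of true matches the scanner catches (assumed)
false_positive_rate = 1e-4         # share of innocent messages wrongly flagged (assumed)

true_matches = messages_scanned * prevalence
true_flags = true_matches * sensitivity
false_flags = (messages_scanned - true_matches) * false_positive_rate

precision = true_flags / (true_flags + false_flags)
print(f"Flags per day: {true_flags + false_flags:,.0f}")
print(f"Share of flags that are correct: {precision:.1%}")
# With these assumptions, roughly 100,000 of the ~101,000 daily flags
# are false: precision is about 1%, even though each individual error
# rate sounds impressively low.
```

When nearly everything scanned is innocent, even a tiny false-positive rate swamps the true matches, and most of the people flagged have done nothing wrong.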

Towards a More Private and Secure World

The solution isn’t more backdoors, more scanning, or endless suspicionless searches. It’s real privacy and security, including end-to-end encrypted services. Law enforcement agencies around the world should strive to do their critical work while co-existing with secure and private online services.

We are seeing important steps forward. Last year, Apple threatened to set up a client-side scanning system that would have constantly scanned users’ photos and reported matches back to law enforcement. Those plans were dropped after a public outcry, and this month, Apple said definitively that it won’t revive them.

What’s more, Apple has agreed to implement end-to-end encryption for iCloud backups, a demand EFF has been making for more than three years. When EFF supporters speak up and work together, we can win big victories like this one.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2022.