The European Parliament had an important decision to make this week about the Digital Services Act (DSA). After months of considering amendments, members oscillated between several policy options on how to regulate online platforms, including the dystopian idea of mandating dominant platforms act as internet police, monitoring content on behalf of governments and collecting user information to keep the internet "safe."
European Parliament Got Many Things Right...
In today's vote, the EU Parliament made the right choice. It rejected the idea of a made-in-Europe filternet, and refrained from undermining pillars of the e-Commerce Directive that are crucial to a free and democratic society. Members of Parliament (MEPs) followed the lead of the key Internal Market and Consumer Protection Committee (IMCO) and opted against upload filters and unreasonable takedown obligations, made sure that platforms don't risk liability just for reviewing content, and rejected both unworkably tight deadlines for removing potentially illegal content and proposals to interfere with private communication. Further analysis is required but, on the whole, the EU Parliament avoided following in the footsteps of prior controversial and sometimes disastrous EU internet rules, such as the EU copyright directive.
This is the right approach to platform governance regulation. It was a victory for civil society and other voices dedicated to making sure that all users are treated equally, including the Digital Services Act Human Rights Alliance, a group of civil society organizations from around the globe advocating for transparency, accountability, and human rights-centered lawmaking. For example, the Parliament rejected an unworkable and unfair proposal to make some media content unblockable so that publishers could profit under ancillary copyright rules. The Parliament also decided to step up efforts against surveillance capitalism by adopting new rules that would restrict the data-processing practices of big tech companies. Under the new rules, Big Tech will no longer be allowed to engage in targeted advertising if it is based on users' sensitive personal data. A "dark patterns" provision also forbids companies from using misleading tabs and obscuring functions to trick users into doing something they didn't mean to do.
But It Also Got Some Things Wrong
The DSA strengthens the right of users to remain anonymous online and promotes options for users to use and pay for services anonymously wherever reasonable efforts can make this possible. However, the DSA also mandates cell phone registration for pornographic content creators, posing a threat to digital privacy. Also, no further improvements were made to ensure the independence of "trusted" flaggers of content, which can be law enforcement agencies or biased copyright industry associations.
Even worse, non-judicial authorities can order the removal of problematic content and request that platforms hand over sensitive user information without proper fundamental rights safeguards. The fact that recent calls by EFF and its partners to introduce such safeguards failed to win majority support shows that lawmakers are either oblivious to, or unconcerned about, the perils of law enforcement overreach felt acutely by marginalized communities around the globe.
Negotiations: Parliament Must Stand Its Ground
It is clear that the DSA will not solve all the challenges users face online, and we have a long way to go if we wish to rein in the power of big tech platforms. However, the EU Parliament's position, if it becomes law, could change the rules of the game for all platforms. During the upcoming negotiations with the European Council, whose positions are markedly less ambitious than those of the Parliament, we will be working to ensure that Parliament stands its ground and that any changes only further protect online expression, innovation, and privacy.