The DSA and the DMA have important tools to make the internet a fairer place, but there are implementation challenges ahead
The European Union reached another milestone in July 2022, when the European Parliament approved the “Digital Services Act package.” Its combined Digital Services Act (DSA) and Digital Markets Act (DMA) are intended to foster a safer and more competitive digital space.
Digital Services Act
However, the DSA is not a panacea for users, and the final deal isn’t all good news: It gives way too much power to government agencies to flag and remove potentially illegal content and to uncover data about anonymous speakers. The DSA obliges very large platforms to assess and mitigate systemic risks, but there is a lot of ambiguity about how this will turn out in practice. Much will depend on how social media platforms interpret their obligations under the DSA, and how European Union authorities enforce the regulation. Respect for the EU’s fundamental rights charter and inclusion of civil society groups and researchers in the implementation process will be crucial to ensure that the DSA becomes a positive model for legislation outside the EU.
What is going to change for users? Selected aspects.
- More transparency online: The DSA emphasizes transparency. Users should be empowered to understand why content is taken down and what they can do about it. Platforms' content moderation practices must be explained in easy-to-understand language within the terms of service, including descriptions of the use of automated decision-making and human review. Notably, users should not only better understand how content decisions, which should be non-arbitrary, are made, but also enjoy a right to reinstatement if platforms make mistakes. Today, users might wonder why they see certain products or articles on their favorite social media feeds. The DSA demystifies this by requiring platforms to explain their content curation algorithms in more detail, in user-friendly language. Platforms must also give users the option to choose a content curation algorithm that is not based on profiling.
- Limits on targeted ads and ads transparency: The DSA restricts platforms' ability to capitalize on sensitive user information, such as ethnicity or sexual orientation. Advertisements can no longer be targeted based on such data. For minors, targeted advertisements based on personal data will be prohibited altogether. More broadly, the DSA increases transparency about the ads users see on their feeds: platforms must place a clear label on every ad, with information about the buyer of the ad and other details. Other aspects of the DSA are underwhelming, such as the half-hearted ban on manipulative interface designs, commonly known as "dark patterns."
- Fast-track procedure for law enforcement authorities: The DSA makes it easier for law enforcement authorities to uncover data about anonymous speakers, without adequate procedural safeguards. These authorities can also order a broad range of providers to remove allegedly illegal content, and can take on the role of "trusted flagger." If they label content as illegal, the platform must quickly remove it.
- Ban of mandated user monitoring and limited liability for user content: The final deal preserved the EU's system of limited liability for online intermediaries and steered clear of filtering mandates and other unreasonable takedown obligations, making sure that platforms don't risk liability just for reviewing content, and rejecting tight deadlines to remove potentially illegal content. However, new due diligence standards could still encourage platforms to over-remove content to avoid being held liable for it.
- Risk assessment and risk mitigation measures: The DSA does not take lightly the strong role of very large online platforms in today's society. Very large online platforms and search engines (with 45 million users or more), so-called VLOPs, will be subject to independent audits and will have to analyze risks stemming from the design and use of their services in the Union. Such risks cover a variety of aspects, including the dissemination of illegal content or a potential negative impact on fundamental rights, with close attention to regional and linguistic variations. VLOPs must also take "effective" actions to mitigate those risks, but it is unclear what role regulators and civil society will play.
Digital Markets Act
The European Union’s Digital Markets Act (DMA) aims to bring more competition and fairness to online markets. The DMA is complex and has many facets, but its overall approach is to place new requirements and restrictions on online “gatekeepers”: the largest tech platforms, which control access to digital markets for other businesses. These requirements are designed to break down the barriers businesses face in competing with the tech giants. If a gatekeeper violates the new rules, it risks a fine of up to 10% of its total worldwide revenues.
The DMA’s threshold for "gatekeepers" is very high: companies will only be hit by the rules if they have annual revenues of €7.5 billion within the EU, or a worldwide market valuation of €75 billion. Gatekeepers must also have at least 45 million monthly individual end-users and 100,000 business users. Finally, gatekeepers must control one or more “core platform services” such as “marketplaces and app stores, search engines, social networking, cloud services, advertising services, voice assistants and web browsers.” In practice, this will almost certainly include Meta (Facebook), Apple, Alphabet (Google), Amazon, and possibly a few others.
The DMA restricts gatekeepers in several ways, including:
- limiting how data from different services can be combined,
- banning forced single sign-ons, and
- forbidding app stores from conditioning access on the use of the platform’s own payment systems.
Other parts of the DMA make it easier for users to freely choose their browser or search engine, and force companies to make unsubscribing from their “core platform services” as easy as subscribing was in the first place.
One section of the DMA requires gatekeepers to make their person-to-person messaging systems (like WhatsApp and iMessage) interoperable with competitors’ systems on request. Interoperability is an important tool to promote competition and prevent monopolists from shutting down user-empowering innovation. Many platforms act as gatekeepers to most of our social, economic, and political interactions online. Interoperability helps by giving users more choice and control over the services and products they use.
EU lawmakers initially considered several proposals relating to interoperability, including rules that would cover gatekeepers’ social networking services as well as messaging apps. The final compromise among EU lawmakers includes an interoperability requirement only for messaging apps, however, and messaging is a tough place to start: In particular, messaging systems raise a unique set of concerns surrounding how to preserve and strengthen end-to-end encryption. That’s why we’ve advised the EU Commission to broaden its security exceptions in practice, and show flexibility in enforcing the DMA’s interoperability mandate to make sure that there’s sufficient time to resolve all significant technical and policy hurdles.