A growing number of people are experimenting with federated social media alternatives like Mastodon, either by joining an “instance” hosted by someone else or by creating their own instance by running the free, open-source software on a server they control. (See more about this movement and joining the fediverse here.)

The fediverse isn’t a single, gigantic social media platform like Facebook or YouTube. It’s an expanding ecosystem of interconnected sites and services that let people interact with each other no matter which one of these sites and services they have an account with. That means people can tailor and better control their own experience of social media and be less reliant on a monoculture developed by a handful of tech giants.

For people hosting instances, however, it can also mean some legal risk. Fortunately, there are some relatively easy ways to mitigate that risk – if you plan ahead. To help people do that, this guide offers an introduction to some common legal issues, along with a few practical considerations.

Two important notes: (1) This guide is focused on legal risks that flow from hosting other people’s content, under U.S. law. In general, the safe harbors and immunities discussed below will not protect you if you are directly infringing copyright or defaming someone. (2) Many of us at EFF are lawyers, but we are not YOUR lawyers. This guide is intended to offer a high-level overview of U.S. law and should not be taken as legal advice specific to your particular situation.

Copyright

Copyright law gives the rightsholder substantial control over the use of expressive works, subject to several important limits such as fair use. If some of your users share infringing material via your instance, and you are found responsible for that infringement under doctrines of “secondary liability,” the resulting damage awards could be ruinous.

However, the Digital Millennium Copyright Act, 17 USC § 512, creates a "safe harbor" immunity from copyright liability for service providers – including instance admins – who "respond expeditiously" to notices claiming that they are hosting or linking to infringing material. Taking advantage of the safe harbor protects you from having to litigate the complex question of secondary liability and from the risk you would ultimately be found liable.

The safe harbor doesn’t apply automatically. First, the safe harbor is subject to two disqualifiers: (1) actual or “red flag” knowledge of specific infringement; and (2) profiting from infringing activity if you have the right and ability to control it. The standards for these categories are contested; if you are concerned about them, you may wish to consult a lawyer.

Second, a provider must take some affirmative steps to qualify:

  1. Designate a DMCA agent with the Copyright Office.

This may be the best $6 you ever spend. A DMCA agent serves as an official contact for receiving copyright complaints, following the process discussed below. Note that your registration must be renewed every three years, and if you fail to register an agent you may lose the safe harbor protections. You must also make the agent’s contact information available on your website, such as via a link to a publicly viewable page that describes your instance and policies.

  2. Have a clear DMCA policy, including a repeat infringer policy, and follow it.

To qualify for the safe harbors, all service providers must “adopt and reasonably implement, and inform subscribers and account holders of . . . a policy that provides for the termination in appropriate circumstances of . . . repeat infringers.” There’s no standard definition for “repeat infringer” but some services have adopted a “three strikes” policy, meaning they will terminate an account after three unchallenged claims of infringement. Given that copyright is often abused to take down lawful speech, you may want to consider a more flexible approach that gives users ample opportunity to appeal prior to termination. Courts that have examined what constitutes “reasonable implementation” of a termination process have stressed that service providers need not shoulder the burden of policing infringement.
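If you want to make a repeat infringer policy concrete in your own tooling, the sketch below shows one possible approach. It is only an illustration: the strike threshold, the choice to ignore strikes resolved by counter-notice, and the decision to flag accounts for human review rather than terminate automatically are policy choices of this hypothetical example, not legal requirements.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a "repeat infringer" tracker for an instance admin.
STRIKE_THRESHOLD = 3  # e.g., a "three strikes"-style policy; pick your own

@dataclass
class Strike:
    notice_id: str
    received_at: datetime
    resolved_by_counter_notice: bool = False  # successful appeals don't count

@dataclass
class AccountRecord:
    account: str
    strikes: list[Strike] = field(default_factory=list)

    def active_strikes(self) -> int:
        return sum(1 for s in self.strikes if not s.resolved_by_counter_notice)

    def needs_review(self) -> bool:
        # Flag for admin review instead of terminating automatically,
        # giving the user a chance to appeal first.
        return self.active_strikes() >= STRIKE_THRESHOLD

record = AccountRecord(account="@example_user")
record.strikes.append(Strike("notice-001", datetime.now(timezone.utc)))
print(record.needs_review())  # False until the threshold is reached
```

Routing terminations through an admin review step, rather than automating them, is one way to give users the kind of opportunity to appeal described above.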

Hosting services, which are the most likely category for a Mastodon instance, must also follow the “notice and takedown” process, which requires services to remove allegedly infringing material when they are notified of it. To be valid under the DMCA, the notice must include the following information:

  • The name, address, and physical or electronic signature of the complaining party
  • Identification of the infringing materials and their internet location (e.g., a URL)
  • Sufficient information to identify the copyrighted works
  • A statement by the copyright holder of a good faith belief that there is no legal basis for the use complained of
  • A statement of the accuracy of the notice and, under penalty of perjury, that the complaining party is authorized to act on the behalf of the copyright holder

Providers are not required to respond to a DMCA notice that does not contain substantially all of these elements. Copyright holders are required to consider whether the targeted use may be a lawful fair use before sending notices.
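For admins who triage incoming notices with scripts, a rough check like the one below can help flag notices that appear to be missing required elements before you act on them. The field names are invented for illustration; real notices usually arrive as unstructured email and still need human judgment.

```python
# Hypothetical sketch: checking whether a takedown notice contains the
# elements listed above. Field names are made up for this example.
REQUIRED_FIELDS = (
    "complainant_name",
    "complainant_address",
    "signature",                          # physical or electronic
    "infringing_material_url",            # identification + internet location
    "copyrighted_work",                   # enough info to identify the original
    "good_faith_statement",
    "accuracy_and_authority_statement",   # under penalty of perjury
)

def missing_elements(notice: dict) -> list[str]:
    """Return the required elements the notice appears to lack."""
    return [f for f in REQUIRED_FIELDS if not notice.get(f)]

notice = {
    "complainant_name": "Jane Doe",
    "infringing_material_url": "https://example.social/@user/12345",
}
print(missing_elements(notice))  # everything else is still missing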

If users think they’ve been unfairly targeted, they can respond with a counter-notice, which you should forward to the rightsholder. At that point, the copyright claimant has 10 to 14 business days to file a lawsuit. If they don’t, you can restore the material and still remain immune from liability.

A proper counter-notice must contain the following information:

  • The user's name, address, phone number, and physical or electronic signature [512(g)(3)(A)]
  • Identification of the material and its location before removal [512(g)(3)(B)]
  • A statement under penalty of perjury that the material was removed by mistake or misidentification [512(g)(3)(C)]
  • Consent to local federal court jurisdiction, or, if overseas, to an appropriate judicial body [512(g)(3)(D)]

To help the process along, it’s good practice to forward the original takedown notice to the user, so they can understand who’s complaining and why.
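If you track takedowns in your own tooling, a rough sketch of the put-back timeline might look like the following. It assumes the 10-to-14-business-day window described above, ignores holidays, and uses hypothetical helper names; it is not a legal calculator.

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    # Count forward skipping weekends only (holidays are not handled here).
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday through Friday
            days -= 1
    return current

def put_back_window(counter_notice_received: date) -> tuple[date, date]:
    """Earliest and latest dates to restore material, absent notice of a lawsuit."""
    return (
        add_business_days(counter_notice_received, 10),
        add_business_days(counter_notice_received, 14),
    )

earliest, latest = put_back_window(date(2024, 3, 1))
print(earliest, latest)
```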

Finally, service providers must “accommodate and not interfere with standard technical measures…used by copyright owners to identify or protect copyrighted works.” In order to qualify as a “standard technical measure,” the measure must have been developed “pursuant to a broad consensus of copyright owners and service providers in an open, fair, voluntary, multi-industry standards process,” and not impose “substantial costs” on service providers. As of 2022, nothing appears to qualify.

State Laws and Federal Civil Claims, or Why Section 230 Isn't Just a "Big Tech" Protection

Thanks to Section 230 of the Communications Decency Act, online intermediaries that host or republish speech are protected against a range of state laws, such as defamation, that might otherwise be used to hold them legally responsible for what their users say and do. Section 230 applies to basically any online service that hosts third-party content, such as web hosting companies, domain name registrars, email providers, social media platforms – and Mastodon instances.

Section 230 says that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider" (47 U.S.C. § 230). This protects the provider from liability for what users say in a wide variety of legal contexts. It also provides immunity from liability that might arise from a service’s removal of users’ speech or other moderation decisions. Unlike the DMCA, Section 230 does not require service providers to take any affirmative steps to qualify for protection.

In 2018, however, Congress passed FOSTA/SESTA, which created new civil and criminal liability for anyone who “owns, manages, or operates an interactive computer service” and creates content (or hosts third-party content) with the intent to “promote or facilitate the prostitution of another person.” The law also expands criminal and civil liability to classify any online speaker or platform that allegedly assists, supports, or facilitates sex trafficking as though they themselves were participating “in a venture” with individuals directly engaged in sex trafficking.

EFF represents several plaintiffs who are challenging the constitutionality of FOSTA/SESTA. As of this writing, the law is still on the books.

Privacy and Anonymity

Your users may register using pseudonyms, and people who object to what they post may ask you to reveal any personally identifying information you have about them. They may seek to use that information as part of a legal action, or simply to retaliate in some other way. Law enforcement may also seek information from you as part of a criminal investigation or prosecution.

If you receive a subpoena or other legal document requiring you to produce that information, consider consulting a lawyer to see if you are required to comply. The best practice is to notify the user as soon as possible so that they can challenge the subpoena in court. Many such challenges have been successful, given the strong First Amendment protections for anonymous speech. You may delay notification for an emergency, a gag order, or when providing notice would be futile, but best practice is to publicly commit to providing notice after the emergency is over or the gag has expired.

Consider publishing a regularly updated Transparency Report that includes data about how many times governments sought user data and how often you complied. Even if you receive few or no government requests, publishing a report showing zero requests gives your users useful information.
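A transparency report doesn’t need elaborate infrastructure. The sketch below shows one minimal way to tally requests as they arrive; the categories are illustrative, and you should adapt them to the kinds of demands you actually receive.

```python
from collections import Counter

requests = Counter()

def log_request(kind: str, produced_data: bool) -> None:
    # Record one incoming government request and whether any data was handed over.
    requests[kind] += 1
    if produced_data:
        requests[f"{kind}:data_produced"] += 1

log_request("subpoena", produced_data=False)

report = {
    "government_requests_received": requests["subpoena"] + requests["warrant"],
    "requests_where_user_data_was_produced": sum(
        count for key, count in requests.items() if key.endswith(":data_produced")
    ),
}
print(report)  # even an all-zeros report is worth publishing
```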

In addition, consider whether you are collecting and/or retaining more information than necessary, such as logging IP addresses, datestamps, or reading activity. If you don’t have it, they can’t get it from you. It is also a best practice to publish a law enforcement guide explaining how you respond to data demands from the government, including what information you cannot provide.
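Data minimization can be as simple as discarding detail before it ever hits disk. As one illustration (not a recommendation specific to Mastodon’s own logging), the sketch below truncates IP addresses before they are written to logs, so there is less to produce later; shorter retention windows are another option.

```python
import ipaddress

def truncate_ip(addr: str) -> str:
    """Zero the host bits: keep a /24 for IPv4 and a /48 for IPv6."""
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    network = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(network.network_address)

print(truncate_ip("203.0.113.87"))         # 203.0.113.0
print(truncate_ip("2001:db8:abcd:12::1"))  # 2001:db8:abcd::
```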

Child Sexual Abuse Material (CSAM)

Service providers are required to report any CSAM on their servers to the CyberTipline operated by the National Center for Missing and Exploited Children (NCMEC), a private, nonprofit organization established by the U.S. Congress, and can be criminally prosecuted for knowingly facilitating its distribution. NCMEC shares those reports with law enforcement. However, you are not required to affirmatively monitor your instance for CSAM.

Other legal issues

In our litigious society, there are many "causes of action" — reasons for initiating a lawsuit — which a creative and determined plaintiff can dream up. Aggressive plaintiffs will sometimes use dubious claims to try to squelch protected speech. If you get a threat or request for information that seems inappropriate, feel free to reach out to EFF (info@eff.org) and we will help if we can.