When Mexico's Congress rushed through a new copyright law as part of its adoption of Donald Trump's United States-Mexico-Canada Agreement (USMCA), it largely copy-pasted the US copyright statute, with some modifications that made the law even worse for human rights.
The result is a legal regime that has all the deficits of the US system, and some new defects that are strictly hecho en Mexico, to the great detriment of the free expression rights of the Mexican people.
Mexico's Constitution has admirable, far-reaching protections for the free expression rights of its people. Mexico's Congress is not merely prohibited from censoring its people's speech -- it is also banned from making laws that would cause others to censor Mexicans' speech.
Mexico’s Supreme Court has ruled that Mexican authorities and laws must recognize both Mexican constitutional rights law and international human rights law as the law of the land. This means that the human rights recognized in the Constitution and international human rights treaties such as the American Convention on Human Rights, including their interpretation by the authorized bodies, make up a “parameter of constitutional consistency," except that where they clash, the most speech-protecting rule wins. Article 13 of the American Convention bans prior restraint (censorship prior to publication) and indirect restrictions on expression.
As we will see, Mexico's new copyright law falls very far from this mark, exposing Mexicans to grave risks to their fundamental human right to free expression.
While the largest tech companies in America have voluntarily adopted algorithmic copyright filters, Article 114 Octies of the new Mexican law says that "measures must be taken to prevent the same content that is claimed to be infringing from being uploaded to the system or network controlled and operated by the Internet Service Provider after the removal notice." This makes it clear that any online service in Mexico will have to run algorithms that intercept everything posted by a user, compare it to a database of forbidden sounds, words, pictures, and moving images, and, if it finds a match, it will have to block this material from public view or face potential fines.
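The mechanism the law demands can be sketched in a few lines. This is a toy illustration, not any real system: actual filters (such as YouTube's Content ID) match perceptual fingerprints of audio and video rather than exact hashes, and the claim database and works named here are hypothetical. But the decision logic is the same: compare every upload against a blocklist of claimed works, and block on any match, before any human sees it.

```python
# A minimal sketch of an upload filter, assuming a hypothetical claim
# database. Real systems use perceptual fingerprints; a hash stands in
# here. Note what is missing: nothing verifies that claimants actually
# own the works they register, and nothing distinguishes a lawful
# quotation or a user's own performance from an infringing copy.

import hashlib

# Hypothetical fingerprints submitted by claimants (unverified).
claimed_fingerprints = {
    hashlib.sha256(b"beethoven-sonata-recording").hexdigest(),
    hashlib.sha256(b"ambient-birdsong-clip").hexdigest(),
}

def fingerprint(upload: bytes) -> str:
    """Stand-in for a perceptual fingerprint; here, just a hash."""
    return hashlib.sha256(upload).hexdigest()

def filter_upload(upload: bytes) -> str:
    """Block anything matching the claim database, prior to publication."""
    if fingerprint(upload) in claimed_fingerprints:
        return "blocked"  # no human review, no context, no appeal yet
    return "published"

# A pianist's own recording that happens to match a claimed work is
# blocked exactly as if it were an infringing copy.
print(filter_upload(b"beethoven-sonata-recording"))  # blocked
print(filter_upload(b"my-original-song"))            # published
```

The sketch makes the core problem visible: the only question the algorithm can answer is "does this match?", never "is this infringement?"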
Requiring these filters is an unlawful restriction on freedom of expression. “At no time can an ex ante measure be put in place to block the circulation of any content that can be assumed to be protected. Content filtering systems put in place by governments or commercial service providers that are not controlled by the end-user constitute a form of prior censorship and do not represent a justifiable restriction on freedom of expression." Moreover, they are routinely wrong. Filters often mistake users' own creative works for copyrighted works controlled by large corporations and block them at the source. For example, classical pianists who post their own performances of public domain music by Beethoven, Bach, and Mozart find their work removed in an eyeblink by an algorithm that accuses them of stealing from Sony Music, which has registered its own performances of the same works.
To make this worse, these filters amplify absurd claims about copyright — for example, the company Rumblefish has claimed copyright in many recordings of ambient birdsong, with the effect that videos of people walking around outdoors get taken down by filters because a bird was singing in the background. More recently, humanitarian efforts to document war crimes have run afoul of automated filtering.
Filters can't tell when a copyrighted work is incidental to a user's material or central to it. For example, if your seven-hour scholarly conference's livestream captures some background music playing during the lunch break, YouTube's filters will wipe out all seven hours' worth of audio, destroying the only record of the scientific discussions during the rest of the day.
For many years, people have toyed with the idea of preventing their ideological opponents' demonstrations and rallies from showing up online by playing copyrighted music in the background, causing all video-clips from the event to be filtered away before the message could spread.
This isn’t a fanciful strategy: footage from US Black Lives Matter demonstrations is vanishing from the Internet because the demonstrators played amplified music during their protests.
No one is safe from filters: last week, CBS's own livestreamed San Diego Comic-Con presentation was shut down due to an erroneous copyright claim made on CBS's own behalf.
Filters can only tell you if a work matches or doesn't match something in their database — they can't tell if that match constitutes a copyright violation. Mexican copyright contains "limitations and exceptions" for a variety of purposes. While this is narrower than the US's fair use law, it nevertheless serves as a vital escape valve for Mexicans' free expression. A filter can't tell if a match means that you are a critic quoting a work for a legitimate purpose or an infringer breaking the law.
As if all this weren't bad enough: the Mexican filter rule does not allow firms to ignore those with a history of making false copyright claims. This means that if a fraudster told Twitter or Facebook — or a Made-In-Mexico alternative — that they owned the works of Shakespeare, Cervantes, or Juana Inés de la Cruz, the companies could ignore those particular claims if their lawyers figured out that the sender did not own the copyright, but would have to continue evaluating each new claim from this known bad actor. If a fraudster included just one real copyright claim amidst the torrent of fraud, the online service provider would be required to detect that single valid claim and honor it.
This isn't a hypothetical risk: "copyfraud" is a growing form of extortion, in which scammers claim to own artists' copyrights, then coerce the artists with threats of copyright complaints.
Algorithms work at the speed of data, but their mistakes are corrected in human time (if at all). If an algorithm is correct an incredible, unrealistic 99 percent of the time, that means it is wrong one percent of the time. Platforms like YouTube, Facebook and TikTok receive hundreds of millions of videos, pictures and comments every day — one percent of one hundred million is one million. That's one million judgments that have to be reviewed by the company's employees to decide whether the content should be reinstated.
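The arithmetic in the paragraph above is worth making explicit. Both figures are the article's own round numbers, not measurements of any particular platform:

```python
# Back-of-the-envelope math: even an implausibly accurate filter
# produces a flood of wrong judgments at platform scale.
daily_uploads = 100_000_000  # "hundreds of millions" of items per day
error_rate = 0.01            # a generous 99%-accurate filter

wrong_per_day = int(daily_uploads * error_rate)
print(f"{wrong_per_day:,}")  # 1,000,000 mistaken judgments per day
```

And that is per day: every one of those million errors is a human appeal waiting in a queue.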
The line to have your case heard is long. How long? Jamie Zawinski, a nightclub owner in San Francisco, posted an announcement of an upcoming performance by a band at his club in 2018, only to have it erroneously removed by Instagram. Zawinski appealed. 28 months later, Instagram reversed its algorithm's determination and reinstated his announcement — more than two years after the event had taken place.
This kind of automated censorship is not limited to nightclubs. Your contribution to your community's online discussion of an upcoming election is just as likely to be caught in a filter as Zawinski's talking about a band. When (and if) the platform decides to let your work out of content jail, the vote will have passed, and with it, your chance to be part of your community's political deliberations.
Not only are filters terrible at their job; they are also very expensive. YouTube's "Content ID" filter has cost the company more than $100,000,000, and this flawed and limited filter accomplishes only a narrow slice of the filtering required under the new Mexican law. Few companies have an extra $100,000,000 to spend on filtering technology, and while the law says these measures “should not impose substantial burdens" on implementers, it also requires them to find a way to achieve permanent removal of material following a notification of copyright infringement. Filter laws mean even fewer competitors in the already monopolized online world, giving the Mexican people fewer places where they may communicate with one another.
Section 1201 of America's Digital Millennium Copyright Act (DMCA) is one of the most catastrophic copyright laws in history. It provides harsh penalties for anyone who tampers with or disables a "technical protection measure" (TPM): massive fines or, in some cases, prison sentences. These TPMs — including what is commonly known as "Digital Rights Management" or DRM — are the familiar, dreaded locks that stop you from refilling your printer's ink cartridge, using an unofficial App Store with your phone or game console, or watching a DVD from overseas in your home DVD player.
You may have noticed that none of these things violate copyright — and yet, because you must remove a digital lock in order to do them, you could be sued in the name of copyright law. DMCA 1201 does not provide the clear, unambiguous protection that would be needed to protect free expression. One appellate court in the United States has explicitly held that you can be liable for a violation of Section 1201 even if you’re making a fair use, and that is the position adopted by the U.S. Copyright Office. Other courts disagree, but the net effect is that you engage in these non-infringing uses and expressions at your peril. The US Congress has failed to clarify this law and tie liability for bypassing a TPM to an actual act of copyright infringement — “you may not remove the TPM from a Netflix video to record it and put it on the public Internet (a copyright infringement), but if you do so in order to make a copy for personal use (not a copyright infringement), that's fine."
The failure to clearly tie DMCA 1201 liability to infringement has had wide-ranging effects for repair, cybersecurity and competition that we will explore in later installments of this series. Today, we want to focus on how TPMs undermine free expression.
TPMs give unlimited power to manufacturers. An ever-widening constellation of devices are designed so that any modifications require bypassing a TPM and incurring liability. This allows companies to sell you a product but dictate how you must use it — preventing you from installing your own apps or other code to make it work the way you want it to.
The first speech casualty of TPM rules is the software author. This person can write code -- a form of speech — but they cannot run it on their devices without permission from the manufacturer, nor can they give the code to others to run on their devices.
Why might a software author want to change how their device works? Perhaps because it is interfering with their ability to read literature, watch films, hear music or see images. TPMs such as the global DVB CPCM standard enforce a policy called the "Authorized Domain" that defines what is — and is not — a family. Authorized Domain devices owned by a standards-compliant family can all share creative works, allowing parents and children to pass media back and forth.
But an "Authorized Domain family" is not the same as an actual family. The Authorized Domain was designed by rich people from the global north working for multinational corporations, whose families are far from typical. The Authorized Domain will let you share videos between your boat, your summer home, and your SUV — but it won't let you share videos between a family whose daughter works as a domestic worker in another country, whose son is a laborer in another state, and whose parents are migrant workers who are often separated (there are far more families in this situation than there are families with yachts and second homes!).
Even if your family meets with the approval of an algorithm designed in a distant board-room by strangers who have never lived a life like yours, you still may find yourself unable to partake in culture that you are entitled to. TPMs typically require a remote server to function, and when your Internet goes down, your books or movies can be rendered unviewable.
It's not just Internet problems that can cause the art and culture you own to vanish: last year, Microsoft became the latest in a long list of companies that switched off their DRM servers because they decided they no longer wanted to be a bookstore. Everyone who ever bought a book from Microsoft lost their books.
Mexico's Congress did nothing to rebalance its version of America's TPM rules. Indeed, Mexico's rules are worse than America's. Under DMCA 1201, the US Copyright Office holds hearings every three years to grant exemptions to the TPM rule, granting people the right to remove or bypass TPMs for legitimate purposes. America's copyright regulator has granted a very long list of these exemptions, having found that TPMs were interfering with Americans in unfair, unjust, and even unsafe ways. Of course, that process is far from perfect: it’s slow, skewed heavily in favor of rightsholders, and illegally restricts free expression by forcing would-be speakers to ask the government in advance for permission through an arbitrary process.
Mexico's new copyright law mentions a possible equivalent proceeding but leaves it maddeningly undefined — and certainly does nothing to remedy the defects in the US process. Recall that USMCA is a trade agreement, supposedly designed to put all three countries on equal footing — but Americans have the benefit of more than two decades' worth of exemptions to this terrible rule, while Mexicans will have to labor under its full weight until (and unless) they can use this undefined process to secure a comparable list of exemptions. And even then, they won’t have the flexibility offered by fair use under US law.
Section 512 of the US DMCA created a "notice and takedown" rule that allows rightsholders or their representatives to demand the removal of works without any showing of evidence or finding of fact that their copyrights were infringed. This has been a catastrophe for free expression, allowing the removal of material without due care or even through malicious, fraudulent acts (the author of this article had his New York Times bestselling novel improperly removed from the Internet by careless lawyers for Fox Entertainment, who mistook it for an episode of a TV show of the same name).
As bad as America's notice and takedown system is, Mexico's is now worse.
In America, online services that honor notice and takedown get a "safe harbor" — meaning that they are not considered liable for their users' copyright infringements. However, online services in the US that believe a user's content is noninfringing may ignore a takedown notice, and they are only liable at all if they meet the tests for “secondary liability" for copyright infringement, something that is far from automatic. If the rightsholder sues, the service may end up in court alongside their user, but the service can still rely on the safe harbor in relation to other works published by other users, provided they remove them upon notice of infringement.
The Mexican law makes it a strict requirement to remove content. Under Article 232 Quinquies (II), providers must honor all takedown demands by copyright owners, even obviously overreaching ones, or they face fines of 1,000 to 20,000 UMA (Unidad de Medida y Actualización, Mexico's inflation-indexed fine unit).
Further, Article 232 Quinquies (III) of the Mexican law allows anyone claiming to be an infringed-upon rightsholder to obtain the personal information of the alleged infringer. This means that gangsters, thin-skinned public officials, stalkers, and others can use fraudulent copyright claims to unmask their critics. Who will complain about corrupt police, abusive employers, or local crime-lords when their personal information can be retrieved with such ease? We recently defended the anonymity of a person who questioned their religious community, when the religious organization tried to use the corresponding part of the DMCA to identify them. In the name of copyright, the law gives new tools to anyone with power to stifle dissent and criticism.
This isn't the only "chilling effect" in the Mexican law. Under Article 114 Octies (II), a platform must comply with takedown requests for mere links to a web page that is allegedly infringing. Linking, by itself, is not an infringement in the United States or Canada, and its legal status is contested in Mexico. There are good reasons why linking is not infringement: it's important to be able to talk about speech elsewhere on the Internet and to share facts, which may include the availability of copyrighted works whose license or infringement status is unknown. Besides that, web pages change all the time: if you link to a page that is outside of your control and it is later updated in a way that infringes copyright, you could be the target of a takedown request.