Everyone should be able to choose how they use the Internet, including being able to screen out material they don’t want and protect themselves from malicious software. This principle is core to empowering users and to ensuring that technology works for all of us.
But a recent decision by the U.S. Court of Appeals for the Ninth Circuit threatens Internet users’ ability to tailor their online experiences by increasing legal liability for companies that build Internet filtering tools. That’s why EFF filed a friend-of-the-court brief asking the court to reconsider its decision in Enigma Software Group USA, LLC v. Malwarebytes, Inc.
The case involves two software companies that compete with one another to sell products that screen Internet traffic for malware and other threats. Enigma filed suit against Malwarebytes alleging violations of state and federal law, arguing that Malwarebytes had engaged in anti-competitive behavior by configuring its software to block users from downloading Enigma’s software. Enigma argued that this behavior diverted potential customers away from its products and toward Malwarebytes’ tools.
The trial court ruled that a provision of Section 230 (47 U.S.C. § 230(c)(2)(B)) that provides immunity for parties that build tools to block material online applied and dismissed the case. A three-judge panel of the Ninth Circuit disagreed, ruling that Section 230 immunity does not apply when there are allegations that the defendant blocked the plaintiff’s software for anticompetitive purposes, as Enigma alleged against Malwarebytes.
EFF disagrees with the Ninth Circuit’s interpretation of Section 230: there is no anticompetitive exception to Section 230. The law’s language indicates that providers can subjectively decide what material to screen or filter without facing legal liability from parties that disagree with those decisions. But beyond reaching the wrong legal conclusion, the court’s decision is problematic because it will discourage the development of new filtering tools for Internet users.
As our amicus brief explains, most filtering tools—whether they target malware, spam, offensive content, or other objectionable material—operate either by using block lists or by following a set of rules or heuristics that flag potentially objectionable material. In the case of rules-based filters, content or software may be flagged or blocked inadvertently, resulting in false positives. That activity does not necessarily evidence any ill motive; it may simply be a mistake.
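To illustrate how a rules-based filter can produce an innocent false positive, here is a minimal sketch. The rules, keywords, and threshold below are hypothetical, invented for illustration; they do not reflect any real product's detection logic. The point is that a legitimate security tool can trip the very heuristics designed to catch malware, because both use similar techniques:

```python
# Hypothetical rules-based filter: each heuristic adds to a suspicion
# score, and anything at or above the threshold gets flagged.
SUSPICIOUS_KEYWORDS = {"keylog", "inject", "hidden_install"}
SCORE_THRESHOLD = 2

def score_file(name: str, imported_apis: set) -> int:
    score = 0
    # Heuristic 1: suspicious strings in the file name.
    if any(k in name.lower() for k in SUSPICIOUS_KEYWORDS):
        score += 1
    # Heuristics 2-3: APIs often used by malware -- but also by
    # legitimate security software, which is how false positives arise.
    if "SetWindowsHookEx" in imported_apis:
        score += 1
    if "WriteProcessMemory" in imported_apis:
        score += 1
    return score

def is_flagged(name: str, imported_apis: set) -> bool:
    return score_file(name, imported_apis) >= SCORE_THRESHOLD

# A competitor's anti-malware scanner legitimately uses these same APIs,
# so the heuristic flags it -- a false positive, not an ill motive.
print(is_flagged("competitor_scanner.exe",
                 {"SetWindowsHookEx", "WriteProcessMemory"}))  # True
print(is_flagged("notepad.exe", set()))                        # False
```

Under the panel's reading of Section 230, a mistaken block like this one could be recast as an "anticompetitive" act and litigated accordingly.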
The Enigma decision, however, elevates those innocuous mistakes into potential legal liability, as a party whose material is blocked can allege that it was done for an anticompetitive purpose. And the party accused of that behavior would have to face an expensive and time-consuming lawsuit to disprove the claim.
Faced with this new legal exposure, online filtering providers may decide not to screen certain material or to adjust their rules-based screening to let material through that they previously would not have. Some would-be competitors may not even enter the filtering tool market in the first place. This will result in less useful filtering products and fewer companies offering filtering tools.
Yet Congress passed Section 230 to broadly protect filtering providers’ decisions about what material to block precisely because it wanted to encourage the development of robust screening products offered by a diversity of providers. As EFF’s amicus brief argued:
Filtering tools give Internet users choices. People use filtering tools to directly protect themselves and to craft the online experiences that comport with their values, by screening out spyware, adware, or other forms of malware, spam, or content they deem inappropriate or offensive. Platforms use filtering tools for the same reasons, enabling them to create diverse places for people online.
The amicus brief also shows the court how its decision in Enigma would harm EFF directly. Our tool Privacy Badger helps users take privacy into their own hands by using heuristics to block third-party trackers. Privacy Badger relies on Section 230’s protections against claims based on improper blocking decisions.
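Privacy Badger's actual heuristic is more involved, but the core idea can be sketched simply (the threshold and class below are assumptions for illustration, not Privacy Badger's real implementation): when the same third-party domain is observed following a user across several distinct first-party sites, it gets blocked.

```python
from collections import defaultdict

# Assumed threshold for illustration: block a third party once it has
# been seen tracking across this many distinct first-party sites.
SITE_THRESHOLD = 3

class TrackerBlocker:
    """Simplified sketch of heuristic, behavior-based tracker blocking."""

    def __init__(self):
        # Maps each third-party domain to the set of first-party
        # sites on which it has been observed.
        self.sites_seen = defaultdict(set)

    def observe(self, first_party: str, third_party: str) -> None:
        if third_party != first_party:
            self.sites_seen[third_party].add(first_party)

    def should_block(self, third_party: str) -> bool:
        return len(self.sites_seen[third_party]) >= SITE_THRESHOLD

blocker = TrackerBlocker()
for site in ("news.example", "shop.example", "blog.example"):
    blocker.observe(site, "tracker.example")
print(blocker.should_block("tracker.example"))  # True
print(blocker.should_block("cdn.example"))      # False
```

Because the blocking decision is behavioral rather than based on a curated list, a blocked party can always dispute it—which is exactly why a tool like this depends on Section 230's immunity for good-faith filtering decisions.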
Additionally, the panel’s decision undermines EFF’s efforts to eradicate the spyware used to perpetuate domestic violence, stalking, and harassment. EFF has worked with filtering tool providers to push them to identify and block tracking software that is surreptitiously installed on victims’ digital devices, often by a vindictive or abusive romantic partner. EFF’s brief argued:
EFF fears that providers of filtering tools will no longer cooperate with EFF’s requests to block stalkerware if doing so would expose them to potential lawsuits alleging that they have somehow acted in “bad faith” by blocking these spyware products, especially if stalkerware companies claim these products are actually legitimate.
We hope that the Ninth Circuit agrees to reconsider the case so that it can correctly interpret Section 230 and provide the legal immunity filtering providers need to give users tools to customize their Internet experiences and protect themselves online.