Yesterday an Australian court found Google liable for defaming someone simply by returning a search result about her. The plaintiff in that case, Janice Duffy, complained that a search for her name returned links to unfavorable articles on the Ripoff Report website, and that her name prompted unfavorable autocomplete suggestions when used as a search term. The court's finding of liability on Google's part, when it merely provided a service that generated these links and suggestions, is a blow to the Australian Internet and to innovation and access to knowledge.
The judgment [PDF] of the South Australian Supreme Court goes to amazing lengths to justify why a search engine should be found liable in such a case, but its reasoning defies common sense. For example, the court explains that Google was liable for "publishing" the defamatory articles because it provided a working hyperlink to them, whereas if it had published the bare URLs without hyperlinks, it would not have been so liable (it would also not have been much of a search engine, in that case):
If a search of Dr Duffy’s name had merely returned the URL of the first Ripoff Report webpage without functioning as a hyperlink and without accompanying text, it could not be said that Google was a publisher of the content of that material. To access the first Ripoff Report webpage, the user would need to enter the URL into the address box of the internet browser.
Although this seems like complete nonsense, regrettably there is precedent under Australian common law that even a mere disseminator of information, such as a news vendor or a library, can assume liability for "publishing" it. But while a news vendor or library chooses the publications that it sells or lends, a search engine's mission is to index the entire Internet, and it can have no such knowledge of the content it indexes. Therefore we believe that the court erred, and that if the case is appealed to Australia's High Court (as it could be), that court would have grounds to establish a narrower definition of "publication" and set Australian law back on a more sensible course.
Statutory law reform is another option to right this wrong in Australian law. Section 230 of the Communications Decency Act would have protected Google in the same circumstances if the case had been heard in the United States, but Australia's equivalent of that law is much narrower, with its protection vanishing as soon as the intermediary is notified of the allegedly offending content. The Australian Law Reform Commission last year recommended that the government introduce a safe harbor for Internet intermediaries that would protect them from liability for serious breaches of privacy, but even if such a law would have protected Google in this case, the recommendation has not yet been taken up.
Europe is also considering amendments to its law on the same topic—but unfortunately the amendments being considered are in precisely the wrong direction. The existing safe harbor in European law protects "passive" intermediaries from liability for hosting defamatory content on a similar basis to U.S. law, and some European countries have implemented this to extend to search engines. But a European public consultation on the regulatory environment for platforms, to which EFF is currently developing a response, foreshadows moves to narrow the availability of this safe harbor. The consultation, which is drafted in highly leading and partial terms, asks questions such as:
Do you think that the concept of a "mere technical, automatic and passive nature" of information transmission by information society service providers provided under recital 42 of the ECD is sufficiently clear to be interpreted and applied in a homogeneous way, having in mind the growing involvement in content distribution by some online intermediaries, e.g.: video sharing websites?
The intention here is to suggest that certain categories of intermediaries who can be characterized as engaged in "active" hosting, should be excluded from the existing safe harbor protections that they currently enjoy, and made to assume additional liabilities; much in the same way that the Australian court reasoned that Google should assume liability because of its automated systems for excerpting snippets from search results, and because the "Google website is programmed automatically to cause the browser to display the Ripoff Report webpage by clicking on the hyperlink".1
The Australian ruling follows a similar ruling earlier in the month from a Brazilian court, in which TV personality Daniela Cicarelli extracted a 500,000 reais ($127,263.82) judgment from Google for failing to remove a YouTube video that infringed her privacy. At least in that case, unlike in the Australian case, Google was actually hosting the impugned piece of content. But even so, that case was brought prior to the passage of Brazil's Marco Civil, a broad-ranging online civil liberties law that would now prevent an intermediary from being found liable in similar circumstances, unless it disobeyed a court order for removal of the material.
The change of direction in Brazilian law also points the right way for Australia and Europe: Internet intermediaries such as search engines should never be required to remove content from the Internet without a judicial order, and should be shielded from liability for third-party content that they have not been involved in modifying. These are some of the policy guidelines set out in the Manila Principles on Intermediary Liability, in which EFF and our partners from around the world have codified a set of principles that we believe policy makers around the world should follow when crafting rules on intermediary liability.
Until Australian law is made consistent with these principles, the position of search engines and other Internet intermediaries in Australia has just become untenably risky. If Internet companies start shutting off services to Australia to minimize their legal exposure, users will ultimately be the biggest losers from this decision.
- 1. That ingenious automatic "program" looks like this: `<a href="..."></a>`