The Cost of Censorship in Libraries: 10 Years Under the Children’s Internet Protection Act
This year marks the 10-year anniversary of the enforcement of the Children’s Internet Protection Act (CIPA), which brought new levels of Internet censorship to libraries across the country. CIPA was signed into law in 2000 and found constitutional by the Supreme Court in 2003. The law is supposed to encourage public libraries and schools to filter child pornography and obscene or “harmful to minors” images from the library’s Internet connection in exchange for continued federal funding. Unfortunately, as Deborah Caldwell-Stone explains in Filtering and the First Amendment, aggressive interpretations of this law have resulted in extensive and unnecessary censorship in libraries, often because libraries go beyond the legal requirements of CIPA when implementing content filters. As a result, students and library patrons across the country are routinely and unnecessarily blocked from accessing constitutionally protected websites.
First, libraries don’t actually have to comply with CIPA, which only applies to libraries that accept e-rate discounts or Library Services and Technology Act grants for Internet access; libraries that turn down this funding need not comply with the law. For example, Dr. Martin Luther King, Jr. Library in San Jose has successfully fought initiatives to install Internet filters, even at the cost of certain federal funds.
For institutions it does cover, CIPA has three requirements: that schools and public libraries adopt a written policy that includes an Internet filter, that they hold a public meeting before the policy is enacted, and that the Internet filtering is enforced when the computers are used. As Caldwell-Stone explains, the Internet policy must include a few things, specifically:
Schools and libraries subject to CIPA must certify that the institution has adopted an internet safety policy that includes use of a "technology protection measure"—filtering or blocking software—to keep adults from accessing images online that are obscene or child pornography. The filtering software must also block minors’ access to images that are "harmful to minors," that is, sexually explicit images that adults have a legal right to access but lacking any serious literary, artistic, political, or scientific value for minors.
According to CIPA, libraries must place filters on all the computers owned by the library, though the filter can be turned off upon request. Schools that are covered by CIPA have some additional requirements (see text of CIPA).
What should not be censored under CIPA? Even a casual reading of the law makes it clear that only images, not text or entire websites, are legally required to be blocked. Libraries are not required to filter content simply because it is sexual in nature (sexual content isn’t necessarily obscene; it may have serious literary, educational, or artistic value, for example). Libraries aren’t required to block social networking sites, political sites, sites advocating for LGBTQ issues, or sites that explore controversial issues like genocide or gun laws or WikiLeaks.
But unfortunately, that’s not what’s happening on the ground. Libraries across the country are routinely overblocking content, censoring far more than is necessary under the law. This means library patrons are cut off from whole swaths of the World Wide Web, hampering their access to knowledge.
Problems with CIPA
After 10 years of CIPA, we now know that the law is widely misunderstood and used as an excuse for censorship. Here are a few of the main problems:
Library filters block constitutionally protected content. Library filters often block many sites that aren’t pornographic or obscene in nature. This may happen because the filters aren’t very accurate at detecting certain types of content or it may happen because the libraries set the filters to block content that should be accessible (filters typically have a range of options that can be manually adjusted during setup). As a result, filters have been known to block LGBTQ-themed sites, websites for art museums, information on teen smoking, Second Amendment advocacy sites, and sites about role playing games.
Filters don’t actually effectively block obscene content. CIPA’s objective is to prevent certain harmful and obscene material from being accessed from libraries and schools. But filters aren’t perfect. In addition to blocking legitimate content, filters can fail to block certain content that is obscene. Testing and analysis (PDF) of several available filtering technologies conducted by the San Jose public library in 2008 found that filters don’t work:
In all four filters tested, image filtering had a low rate of accuracy. Many images of an adult sexual nature were displayed on web pages accessed by the testers, and additionally the image search results pages and most of those images’ full-size versions and/or parent sites could be accessed as well. Because of the ability of image search engines (like Google Images and Yahoo Image Search) to display thumbnails which often aren’t treated as “real” images by the filtering programs, image filtering is a problem for the filtering software’s AI. Images of an adult sexual nature from image search engines, pages with images of an adult sexual nature but "fake" innocent text, or images of an adult sexual nature posted to social sites like Craigslist were consistently displayed in all four filter tests.
The deficiency of filters was emphasized by the very public failure of Homesafe, a network-level filter that was offered by one of Britain’s largest Internet providers. The filter was designed to block adult content on the network level, but in late 2011 it was revealed that the filter failed to block Pornhub, which offers thousands of free explicit videos and is ranked as the third largest pornography provider on the web.
Kids are under-prepared for the open web. One of the harmful side effects of CIPA is that many kids who rely on schools and libraries for Internet access are prevented from experiencing the unfiltered web. While in the short term this supposedly protects children from accessing harmful content, it also robs kids of the chance to learn the skills necessary to navigate the web as a whole. When websites such as social networking sites, political advocacy sites, and LGBTQ-themed sites are censored from the Internet experience of young adults, we are failing to empower our children with the skills they need to use good judgment, common sense, and basic precautions when browsing the web. Rather than employing overly stringent filters to censor the web, libraries and schools should educate students to protect themselves online.
We don’t know exactly what’s being blocked. Among the many problematic issues with Internet filters in libraries is the lack of transparency around what’s filtered. There’s no solid documentation of which libraries are filtering what specific websites. Part of this stems from libraries not being transparent about their decision to voluntarily block more content than required by law. Additionally, most filtering technology companies closely guard their algorithms for blocking sites, claiming trade secrecy. Because we don’t have a comprehensive list of what’s getting blocked, it’s difficult to judge whether some filters are more speech-friendly than others or whether some libraries have set their filters to censor more content than they should.
Content blocking goes against the ethical obligations of librarians. Librarians play an important role in preserving free speech online — a role we recognized with our 2000 Pioneer Award honoring librarians everywhere. The American Library Association has codified this obligation in its code of ethics: "We uphold the principles of intellectual freedom and resist all efforts to censor library resources."
What can be done: fighting back against censorship in libraries
If you are concerned about the harmful ramifications of Internet censorship in libraries, you can help fight back:
Speak to your library. Find out if your library has a policy regarding censorship and ask to see it. Voice your concerns about the harmful ramifications of filters in libraries, and explain that filters are never 100% accurate.
Attend the public event. Every library that is seeking to institute a new censorship policy under CIPA is required to have an open meeting to solicit public feedback. Attend the meeting and let the library know that you think your community will benefit from an uncensored Internet.
Ask to have the filter removed. As a library patron, you may find online content blocked that obviously should not be – such as anatomy sites necessary for research. If this happens to you, don't just ignore it and hunt for an alternate source. Tell the librarians on staff what happened and ask to have the filter disabled so you can access the legitimate content. Every time you speak out for your right to access content, you make the librarians aware that the filters are blocking too much. This not only helps prompt libraries to revisit filtering policies, it helps ensure libraries are familiar with the process of removing filters upon request.
Celebrate 404 Day with EFF. Next April 4, EFF will join several partners around the country to raise awareness of library Internet censorship. Mark your calendar now and stay tuned for more information. Want to host an anti-censorship event in your school or community on that day? Email firstname.lastname@example.org and we can help.