In December, the FCC voted to end the 2015 Open Internet Order, which prevented Internet service providers (ISPs) like AT&T and Comcast from violating net neutrality principles. A simple majority vote in Congress can keep the FCC’s decision from going into effect. From now until the Senate votes, EFF, along with a coalition of organizations, companies, and websites, is on red alert and calling on you to tell Congress to vote to restore the Open Internet Order.
On May 3, in the U.S. Capitol Visitor Center, EFF convened a closed-door briefing for Senate staff about the realities of device encryption. While policymakers hear frequently from the FBI and the Department of Justice about the dangers of encryption and the so-called Going Dark problem, they very rarely hear from actual engineers, cryptographers, and computer scientists. EFF's panelists included Dr. Matt Blaze, professor of computer science at the University of Pennsylvania; Dr. Susan Landau, professor of cybersecurity and policy at Tufts University; Erik Neuenschwander, Apple's manager of user privacy; and EFF's tech policy director Dr. Jeremy Gillula.
The discussion focused on renewed calls by the FBI and DOJ to create mechanisms to enable “exceptional access” to encrypted devices. Our goal was to give a technical description of how device encryption actually works and answer staff questions about the risks that exceptional access mechanisms necessarily introduce into the ecosystem. EFF's Gillula went last and concluded that in the cat-and-mouse game that is computer security, mandating exceptional access would freeze the defenders’ state of the art, while allowing attackers to progress without limit.
Recently, the European Commission published two legislative proposals that could further cement an unfortunate trend towards privacy erosion in cross-border state investigations. Building on a foundation first established by the recently enacted U.S. CLOUD Act, these proposals compel tech companies and service providers to ignore critical privacy obligations in order to facilitate easy access when facing data requests from foreign governments. These initiatives collectively signal the increasing willingness of states to sacrifice privacy as a way of addressing pragmatic challenges in cross-border access that could be better solved with more training and streamlined processes.
Before rushing to employ algorithms to make decisions, companies should begin by asking five questions:
- Will this algorithm influence—or serve as the basis of—decisions with the potential to negatively impact people’s lives?
- Can the available data actually lead to a good outcome?
- Is the algorithm fair?
- How will the results (really) be used by humans?
- Will people affected by these decisions have any influence over the system?
Europe's General Data Protection Regulation (GDPR) comes into force on May 25th, and most companies that have users in Europe are scrambling to update their privacy policies and terms of service to avoid breaking this new EU law. It's still an open question whether the rules apply to users living outside the EU, but the changes include refined terminology, new requirements for how companies must get permission to use data, and expanded rights for users to see their data, correct it, and take it with them when they leave.
ISPs claim that the net neutrality principle banning paid prioritization—where an ISP charges websites and applications new fees and relegates those that do not pay to the slow lane—means that they cannot make enough money to upgrade and extend their service. We know this isn't true because the majority of costs for ISPs are in the initial building of their networks, which they have already recouped. And we've recently seen new ISPs build high-speed Internet networks and turn a profit relatively quickly while adhering to net neutrality.
Section 1201 of the Digital Millennium Copyright Act makes tampering with "Digital Rights Management" a legal no-go zone. This scares off inventors and tinkerers from building new tools that should be perfectly legal. EFF details examples of these non-existent technologies in the Catalogue of Missing Devices. EFF supporter Benjamin McLean offered up his "Mashup Maker" as an example. This program would have ripped legally acquired tracks and imported them into a personal library with a built-in editor, making it easier for people to make fair use of those tracks.
Government officials are once again insisting that they still need to compromise our security via a backdoor for law enforcement. Opponents of encryption imagine that there is a “middle ground” approach that allows for strong encryption but with “exceptional access” for law enforcement. Government officials claim that technology companies are creating a world where people can commit crimes without fear of detection.
Despite this renewed rhetoric, most experts continue to agree that exceptional access, no matter how you implement it, weakens security. The terminology might have changed, but the essential question has not: should technology companies be forced to develop a system that inherently harms their users? The answer hasn’t changed either: no.
Are you coming to PyCon? Join our development sprint to help improve Certbot, the easy-to-use client that fetches and deploys SSL/TLS certificates from Let's Encrypt.
We're looking for an energetic Member Outreach Assistant to support EFF's fundraising operations and help build relationships with our growing community.
Unrestrained, unmonitored sharing of data collected by automated license plate readers is a threat to privacy and public safety. (NBC San Diego)
Local law enforcement and other city agencies have been deploying spy technology that's "hurtling toward us so fast that privacy laws can't keep up." We need to fight back. (LA Times)
In this year's "excruciating DMCA section 1201 exemption process," the right to repair tractors, cars, and electronics is at stake. (Motherboard)
In a way, Representative Marsha Blackburn is right that paid prioritization is like TSA Precheck: everyone else is stuck in a slow lane while those with money get to breeze past them. (Ars Technica)
Self-driving car companies may not want to share accident data out of fear it will help competitors progress faster. But the trade-off is a higher level of safety—and it's a trade-off we should demand they make. (Los Angeles)
Excellent news: Canadian police have dropped computer hacking charges against a 19-year-old who downloaded openly available information from a public records website. (CBC)
We must ensure that there is transparency when cities allow police to acquire or use surveillance technology. On May 1, Oakland City Council voted in support of an important proposed Surveillance and Community Safety Ordinance to do just that. (East Bay Times)