Twelve years ago, hundreds of thousands of Serbians filled the streets of Belgrade, blocking the entire city in protest against Slobodan Milošević’s regime. At the time, such a widespread protest had seemed unimaginable. Before the uprising, the mood in the country was melancholic, cynical, and hopeless amid disillusionment with a government plagued by corruption, repression, and war. An unprecedented campaign of civil resistance against the Milošević regime paved the way for eventual democratic reform and Serbian independence in 2006. One influential aspect of this movement was the young students who inspired their country to leap into political and creative action through a 100-day plan: 100 days of debate, dancing, performances, and workshops.
These days, activism and political engagement continue to be prevalent around the world. People now rely largely on the Internet to raise political awareness and organize campaigns in their communities. Through the Arab Spring, the Occupy movement, Internet protests against SOPA, PIPA, and ACTA, as well as other popular movements, the world continues to see how the Internet unleashes the creative potential of the masses to transform political attitudes and policy debates. The SHARE Conference in Serbia builds on these leading movements, rekindling the passion of the anti-Milošević protests to tackle a new repressive threat: Internet censorship, surveillance, and locks on digital culture.
The SHARE conference, now in its second year, was founded by many of the young activists behind the instrumental student campaign that resisted the Milošević regime in 2000. That movement, now called the Exit Festival, has grown into one of the biggest annual music festivals in southeastern Europe. Staying true to its activist roots, however, the Exit Festival hosts various talks on the Internet and politics, in which EFF has participated.
On April 26-28, EFF will again participate in the conference in Belgrade, Serbia, speaking against surveillance regimes. SHARE will gather more than two thousand thinkers, innovators, and activists for three days of enlightening lectures, engaging workshops, contemporary music, and nights full of dancing at the Dom Omladine (Belgrade Youth Center).
SHARE by Day: The Internet as a Space for Resistance
This year’s SHARE conference comes at a watershed moment in the Internet freedom movement. SHARE’s speakers will include street-art groups, security researchers, dissidents, innovators, and freedom fighters — individuals who have used the Internet to inspire radical change and community action. This year’s SHARE conference focuses on both the benefits and challenges of the Internet. Participants will discuss how they use the Internet to create, learn, innovate, and stir political action for positive social change. At the same time, they will examine the Internet’s dangers and the methods of mitigating its capacity to trace, track, and secretly surveil individuals.
SHARE by Night: Art, Music and Activism
“SHARE by Night,” the conference’s music program, presents innovative international electronic and contemporary music blended together with the talents of the local clubbing scene. Last year, Improv Everywhere stormed the streets of Belgrade with a “mobile party.” Crowds of onlookers stood in awe and joined in the celebration as speakers and young attendees danced through the night. SHARE by Night is keeping its full performance lineup secret until the date approaches, and no one can foresee all the spontaneous events that will surely take place throughout the conference.
A few EFF picks from SHARE’s 2012 lineup ...
Voina — A collective of provocative Russian street artists known for politically charged performance art. Since the very beginning, Voina has staged radical actions against the former KGB headquarters, police repression measures, and the Russian political system at large. Because of this radical art, members of the collective were jailed until Banksy bailed them out in 2010.
George Hotz — A security researcher who developed code to jailbreak the iPhone and the Sony PlayStation 3. Last year, Sony sued Hotz and other security researchers who disclosed security vulnerabilities in the PS3 that had allowed users to install and run the Linux operating system on their consoles.
Slava Mogutin — Siberian-born artist and writer exiled from Russia at the age of 21 for his queer writings and activism. In the past decade, Mogutin’s photography and multimedia work have been exhibited internationally. At SHARE, he will present his work and his ongoing battle with censorship.
Vuk Ćosić — Active in politics, literature and art since 1994, Ćosić is well known for his ground-breaking work as a pioneer in the field of net.art. His evolving oeuvre is characterized by an interesting mix of philosophical, political, and conceptual network-related issues on the one hand, and an innovative feeling for contemporary urban and underground aesthetics on the other.
Rob Van Kranenbrug — Kranenbrug will examine the impact Radio Frequency Identification (RFID) has on cities and wider society. He will also reflect on possible alternative network technologies that could safeguard our privacy and empower citizens. His talk promises to be both a timely warning and a call to arms.
Peter Sunde — Berlin-based Swedish IT expert best known for co-founding The Pirate Bay. Sunde is currently working on the Flattr project, which is a microdonation system that enables viewers of websites to make small donations by clicking a "Flattr this" button.
Khannea Suntzu — Apart from being a conceptual artist, an independent blogger, a futurist, and a hobbyist-philosopher, Khannea supports radical democratization and advocates the extension of fundamental human rights. Her work sounds a warning about the dangers of "technological unemployment" in creating effectively irreversible societal divisions. She argues for proactive social activism against this growing disparity.
Sawor Mon — Of Hmong descent, Mon lives in Burma, a country that was, until recently, under military dictatorship and is currently led by a military-backed government. The most common term among activists for this type of government is the “hybrid regime.” Mon argues that the Internet is an essential tool for combating political brainwashing and propaganda.
Church of Kopimism / Isak Gerson — Isak Gerson, a philosophy student from Stockholm, had a couple of issues while attempting to get the Church of Kopimism recognized by the Swedish authorities. The main belief of this religion is that copying and sharing information are ethically and morally correct. One of their key dogmas is that CTRL+C and CTRL+V are sacred symbols.
SHARE partners with Bturn — an international online magazine covering music, film, and art in Balkan and Eastern European cultures. Bturn will continue to highlight picks in the days to come until the day of the Conference.
SHARE will also host discussions by Smari McCarthy on the crowdsourced reform of the Icelandic Constitution, Jeremie Zimmerman from La Quadrature du Net on arguments against ACTA, Elizabeth Stark on the Open Video Alliance, EFF’s Katitza Rodriguez on the reality of mass surveillance as seen in films, Desiree Miloshevic from Afilias, and Google. More speakers are still to come.
PayPal has instituted a new policy aimed at censoring what digital denizens can and can’t read, and they’re doing it in a way that leaves us with little recourse to challenge their policies in court. Indie publisher Smashwords has notified contributing authors, publishers, and literary agents that they would no longer be providing a platform for certain forms of sexually explicit fiction. This comes in response to an initiative by online payment processor PayPal to deny service to online merchants selling what they deem to be obscene written content. PayPal is demonstrating, again and to our great disappointment, the dire consequences to online speech when service providers start acting like content police.
Mark Coker, founder of Smashwords, described the new policy in a recent blog post. The policy would ban the selling of ebooks that contain “bestiality, rape-for-titillation, incest and underage erotica.” Trying to apply these definitions to all forms of literary expression raises questions that can only have subjective answers. Would Nabokov’s Lolita be removed from online stores, as it explores issues of pedophilia and consent in soaring, oft-romantic language? Will the Bible be banned for its description of incestuous relationships?
This isn’t the first time PayPal has tried its hand at censorship. In 2010, they cut off services to the whistleblower WikiLeaks, helping to create the financial blockade that has hamstrung the whistleblower organization. And as we explained when WikiLeaks was facing censorship from service providers: the First Amendment to the Constitution guarantees freedom of expression against government encroachment—but that doesn't help if the censorship doesn't come from the government. Free speech online is only as strong as private intermediaries are willing to let it be.
Frankly, we don’t think that PayPal should be using its influence to make moral judgments about what ebooks are appropriate for Smashwords readers. As Wendy Kaminer wrote in a foreword to Nadine Strossen’s Defending Pornography: “Speech shouldn’t have to justify itself as nice, socially constructive, or inoffensive in order to be protected. Civil liberty is shaped, in part, by the belief that free expression has normative or inherent value, which means that you have a right to speak regardless of the merits of what you say.”
But having a right to speak is not the same as having a right to be serviced by a popular online payment provider. Just as a bookseller can choose to carry or not carry particular books, PayPal can choose to cut off services to ebook publishers that don’t meet its “moral” (if arbitrary and misguided) standards.
Online payment providers like PayPal help many websites fund their very existence. As we explained in our interactive graphic Free Speech is Only as Strong as the Weakest Link, a payment provider can shut down controversial online speech by cutting off their means of financial support. And PayPal, the behemoth of online payment providers, has little incentive to compromise with small businesses that are punished through these arbitrary policies.
Unfortunately, Congress knows just how vulnerable online speech can be to the vagaries of payment providers. The Stop Online Piracy Act, defeated earlier this year after Internet-wide protests, contained language that would have allowed individuals and companies to cut off financial support for a website simply by sending an infringement notice to its payment providers or ad networks. No judge or jury would have been required.
The censorship of Smashwords is a blow to free speech and adds to the ever-growing list of examples of payment providers turned into content police.
Earlier this month, EFF called for the protection of Saudi blogger and journalist Hamza Kashgari, who had fled Saudi Arabia after tweets he wrote about the Prophet Mohammed provoked clerics to demand that he be tried for apostasy, and members of the public to call for his murder. Kashgari had been a columnist for the Jeddah-based newspaper Al Bilad until outrage over the tweets, when Saudi Minister of Culture and Information Abdul Aziz Khoja ordered Kashgari “not to write in any Saudi paper or magazine,” an order which Kashgari also posted to his Twitter account. As outrage mounted, Kashgari retracted his statements, deleted his Twitter account, apologized for the comments, and finally fled the country in response to mounting threats on his life.
Upon arriving at the airport in Kuala Lumpur, Malaysia, on his way to seek refuge in New Zealand, Kashgari was arrested by security officials at the request of the Saudi government. Malaysia and Saudi Arabia do not have an extradition treaty, but they do maintain good relations. EFF was among the many organizations that called on Malaysian Prime Minister Najib Tun Razak to release Kashgari from detention and stop extradition proceedings, reminding the Prime Minister that, as a member of the UN Human Rights Council, Malaysia is committed to upholding the highest human rights standards, a commitment inconsistent with allowing Kashgari to be extradited back to a country where he faces serious threats to his life.
Mohammed Noor, Kashgari’s lawyer in Malaysia, was able to obtain a court order to prevent the deportation, but he was not allowed to see his client before Kashgari was put on a plane and repatriated to Saudi Arabia. Noor told the Associated Press:
“We are concerned that he would not face a fair trial back home and that he could face the death penalty if he is charged with apostasy.”
Kashgari is now in detention in Saudi Arabia. Several sites and petitions have been set up to support him and call for his release. Kashgari is being represented by prominent human rights lawyer Abdul-Rahman al-Lahem, who has stated that he will push for this case to be argued before a committee in the information ministry instead of a Sharia court. Even if Kashgari is not charged with apostasy, a crime which carries the death penalty, the blogger and journalist continues to face threats to his life from Saudi militants. A Facebook page titled “The Saudi people want the execution of Hamza Kashgari” has over 26,000 members. It is not enough for the Saudi government to release Kashgari—they must allow him to leave the country for his own safety.
The Electronic Frontier Foundation will continue to keep a close eye on developments in Saudi Arabia. Freedom of expression is a fundamental human right. No one deserves to be killed, whether by his or her government or by fellow citizens, for something they write in a 140-character tweet.
The world’s attention has recently turned to the question of how to hold companies accountable for knowingly marketing, selling, and adapting the tools of surveillance to repressive regimes. U.S. and E.U. companies’ equipment has been linked to torture and other human rights violations in many Middle East and North African countries, along with longstanding cases involving similar allegations in China. Most recently, evidence suggests prominent American journalist Marie Colvin may have been tracked via her satellite phone before being killed by government forces in Syria. Public pressure on companies to “Know Your Customer” and take other actions to avoid having their tools used as part of human rights violations is intensifying. The European Parliament has begun the first steps toward banning sales of this technology to authoritarian governments, and U.S. Congressman Chris Smith (R-NJ) has introduced a bill, the Global Online Freedom Act, which is aimed in part at this problem.
But there is another avenue for justice: the U.S. courts.
Aiding and abetting, and conspiracy to commit crimes, have long been illegal under U.S. law, and it’s not difficult to see how surveillance tools used to commit human rights violations — especially ones specifically and knowingly modified or supported by a company — could qualify under these or other longstanding laws. In fact, there are two pending cases in the U.S. right now raising those claims against Cisco, based on evidence that the company knowingly marketed, sold, and specially adapted tools that the Chinese government uses to target Chinese democracy activists and members of the Falun Gong religious minority.
That’s right: two years after holding that corporations must be allowed to participate fully in funding candidates in U.S. elections, the Supreme Court will consider whether corporations are nonetheless completely immune from claims alleging that they helped commit gross human rights abuses.1
The two cases concern two different laws: the Alien Tort Statute (ATS) in Kiobel and the Torture Victim Protection Act (TVPA) in Mohamad. While the constitutional analysis under the First Amendment in Citizens United and the statutory interpretation of the TVPA and ATS in these cases are not exactly the same, the public’s concern that the Supreme Court may embrace a world in which corporations have all the rights, but none of the responsibilities, of ordinary people is very real.
How did we get here? In the United States, people have long been held liable for knowingly assisting in human rights abuses even when those abuses are committed overseas. Under case law going back to Filártiga v. Peña-Irala in 1979, people who helped foreign governments engage in torture, summary execution, or slavery have been held responsible in both civil and criminal courts. Recently these same claims, on the same standard, have been applied to companies, ranging from one that used slave labor to build a pipeline in Burma, to one that helped in the wrongful hanging of Nigerian human rights hero Ken Saro-Wiwa. The cases are not easy, and apply only to a set of extreme human rights violations like torture and execution, but they provide a measure of justice to those who have faced horrific human rights abuses and, hopefully, a strong disincentive for corporations to get involved in the dirty business of assisting in human rights abuses abroad in the first place.2
This is where mass surveillance companies selling technology to authoritarian regimes come in. For months now, we have seen increasing evidence that U.S. and E.U.-based companies have been selling spying technology that has led to the torture and summary execution of journalists, human rights advocates, and democratic activists.
In Bahrain, dozens of recent political prisoners have testified that government officials tortured them before reading back transcripts of text messages and emails likely obtained through these technologies. In Syria, just as the government was ramping up its deadly crackdown on democratic protests, the Italian company Area SpA rushed to complete a “monitoring center” that could not only read every email in the country, but track citizens’ locations via GPS in virtual real-time. Technology from U.S.-based companies Hewlett-Packard and NetApp has also been linked to Syria, according to Bloomberg. And in Libya, the Wall Street Journal reported that, “a surveillance center in Tripoli provides clear new evidence of foreign companies' cooperation in the repression of Libyans under Col. Gadhafi's rule.” Similar reports have emanated from Iran.
Despite these damning investigations from Bloomberg and the Wall Street Journal, dozens of companies still operate with little oversight or accountability, even when they knowingly sell products and services used to commit these human rights abuses. On the contrary, business appears to be booming: the market for these products has grown to $5 billion a year.
Those looking for tools to help hold companies accountable for selling the surveillance state to foreign despots should be watching the Supreme Court closely. Kiobel and Mohamad will be argued February 28 and should be decided by late June. More information about the cases is available at corporateaccountabilitynow.org. Some judicial avenues will still exist even if these cases fail. But if the Court holds corporations to the same responsibilities not to assist in torture that it already requires of individuals, it may help hold surveillance companies accountable in the courts when they are responsible for assisting in human rights atrocities around the world, and, more importantly, it may help dissuade companies from getting into bed with these repressive governments in the first place.
1. There’s nothing particularly novel about corporate liability for facilitating the bad acts of others. While a corporation cannot go to jail, corporations are regularly held civilly and even criminally liable for involvement in the offenses done by others. Thus, a company that facilitates money laundering can be held liable, and, as EFF members well know, a company can also be secondarily liable for the copyright infringements of others.
2. Note that EFF is counsel in one of the cases, Bowoto v. Chevron, involving Chevron’s helicoptering in, overseeing and payment of Nigerian forces who opened fire on protesters in Nigeria, and in that capacity we also signed on to an amicus brief in the Supreme Court urging the Supreme Court to find that corporations can be liable under the TVPA.
As the European Parliament considers passing a directive that would target hacking, EFF has submitted comments urging the legislators not to create legal woes for researchers who expose security flaws.
In the United States, laws such as the Digital Millennium Copyright Act and the Computer Fraud and Abuse Act have created a murky legal landscape for researchers who conduct independent analysis of technology for security threats. Throughout the world, the Convention on Cybercrime has caused similar problems. Now, new vague and sweeping computer crime legislation is back on the European Union's agenda threatening coders' rights: the European Commission’s proposal on a draft Directive on Attacks Against Information Systems [pdf].
All told, the European Commission needs to make a stronger case for why this directive is needed at all. We believe it is largely duplicative of the Convention on Cybercrime, which itself is riddled with problems. Should the proposed directive move forward, however, we urge the Parliament to improve several aspects.
No criminalization of tools
The main so-called “novelty” of the draft directive is the criminalization of the use, production, sale, or distribution of tools to commit attacks against information systems. In our submission to the European Parliament, we opposed the wholesale criminalization of these tools: while they can be used for malicious purposes, they are also crucial for research and testing, including for "defensive" security efforts to make systems stronger and to prevent and deter attacks.
We urge the Parliament to focus on the intent behind using the tool, rather than mere possession, use, production, or distribution of such tools per se. The latter approach threatens valuable security testing that makes technology more robust and benefits us all.
Protect coders’ rights to unauthorized access to computers for security testing
We asked the European Parliament to protect researchers who access a computer system without explicit permission when the perpetrator does not have a criminal intent, or mens rea. This protection is needed to safeguard security researchers’ rights to free expression and innovation. Examining computers without the explicit permission of the owner is necessary for a vast amount of useful research, which might never be done if obtaining prior permission was a legal requirement.
The language of the draft Directive resembles language in the Computer Fraud and Abuse Act (CFAA), which provides, among other things, that it is illegal to ‘intentionally access a computer without authorization or exceed authorized access, and thereby obtain . . . information from any protected computer.’
The US experience can serve as a warning to European legislators that vague, ill-defined terms can have deleterious effects on free expression, innovation, and competition, especially with respect to the meaning of "authorized" computer access.
Protect coders' rights to free expression and innovation
Finally, we asked the European Parliament to protect security researchers’ right to free expression. Their ability to freely report security flaws is crucial and highly beneficial for the global online community. Public disclosure of security information enables informed consumer choice and encourages vendors to be truthful about flaws, repair vulnerabilities, and improve upon products.
For example, in early February, two German security researchers reported a vulnerability in two encryption systems that could allow eavesdropping on hundreds of thousands of satellite phone calls. Public disclosure of this kind of research allows consumers to be better informed and aware that their communications are not actually protected, which in turn lets them make thoughtful choices about the technology they use. Hopefully it could even inspire the European Telecommunications Standards Institute to formulate a stronger security algorithm that protects users’ privacy.
In our submission, we asked the Parliament to protect the rights of those researchers and whistleblowers. In the course of fixing a problem, they could inadvertently violate laws—even if they never intend to steal information, invade people’s privacy, or otherwise cause harm. By reporting the vulnerability, researchers could risk exposing themselves to a lawsuit or criminal investigation. On the other hand, potentially serious security flaws will go unaddressed if security researchers are forced to withhold information to protect themselves from possible legal liability.
All told, the European Commission hasn’t demonstrated that this proposed directive is necessary, and we don’t think it is. If this proposal moves forward, though, the European Parliament needs to narrowly define and clarify it. The goal should be to leave breathing room for legitimate security research and testing, allowing security researchers to flourish and do what they do best.
The Pakistani government is looking for new ways to censor the Internet.
This week, the Pakistani Telecommunication Authority (PTA) released a Request for Proposals (RFP) for the development, deployment and operation of a “National Level URL Filtering and Blocking System,” calling on institutions to submit by March 2nd a feasible proposal that would allow the government to institute a large-scale filtering system. Shockingly, the RFP requires: “Each [filtering] box should be able to handle a block list of up to 50 million URLs (concurrent unidirectional filtering capacity) with processing delay of not more than 1 milliseconds.” While content filtering and blocking has existed in Pakistan for the past few years, it has been executed manually and has thus been inconsistent and intermittent.1 The state’s latest effort to subsidize a comprehensive, automated censorship regime is deeply troubling.
The RFP, posted on the National ICT R&D Fund website, details various requirements for the system, as well as details for applying for the grant. Its terms of reference describe how this system would address the supposed “problem” that Pakistan does not currently have a sufficient mechanism to filter and block content:
Many countries have deployed web filtering and blocking systems at the Internet backbones within their countries. However, Pakistani ISPs and backbone providers have expressed their inability to block millions of undesirable web sites using current manual blocking systems.
It goes on to describe how the blocking and filtering would be carried out:
This system would be indigenously developed within Pakistan and deployed at IP backbones in major cities, i.e., Karachi, Lahore and Islamabad. Any other city/POP could be added in future. The system is proposed to be centrally managed by a small and efficient team stationed at POPs of backbone providers.
The system would have a central database of undesirable URLs that would be loaded on the distributed hardware boxes at each POP and updated on daily basis. The database would be regularly updated through subscription to an international reputed company maintaining and updating such databases.
The RFP ends with 35 system requirements that detail all aspects of the project and what would be required in the system. Other specifications include the capability to block both individual IP addresses and ranges of addresses, support for multiple languages, and stand-alone hardware that can easily be integrated into any network.
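To put the RFP's headline requirement in perspective, a sub-millisecond lookup against a block list of 50 million URLs is computationally trivial: an in-memory hash set gives constant-time membership tests, and production systems typically add Bloom filters or specialized hardware on top. The sketch below is purely illustrative of that mechanism; the URLs and the `UrlBlocklist` class are invented for this example, not taken from the RFP or any real filtering product.

```python
def normalize(url: str) -> str:
    """Reduce a URL to a canonical form so trivial variations still match."""
    url = url.lower().strip()
    for prefix in ("http://", "https://"):
        if url.startswith(prefix):
            url = url[len(prefix):]
    return url.rstrip("/")

class UrlBlocklist:
    """Hypothetical in-memory block list using a hash set for O(1) lookups."""

    def __init__(self, urls):
        # Store canonical forms; a 50-million-entry set fits easily in RAM.
        self._blocked = {normalize(u) for u in urls}

    def is_blocked(self, url: str) -> bool:
        return normalize(url) in self._blocked

# Invented example entries, for illustration only.
blocklist = UrlBlocklist(["http://example.org/banned-page", "blocked.example.com"])
print(blocklist.is_blocked("HTTPS://example.org/banned-page/"))  # True
print(blocklist.is_blocked("http://example.org/other-page"))     # False
```

The point is not that such a system is hard to build, but the opposite: the technical barrier to nationwide automated censorship is low, which is precisely what makes an RFP like this one so troubling.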
The entity funding this initiative is an arm of the Pakistani Ministry of Information Technology called the National ICT R&D Fund. The Ministry created the fund in 2007 to take a certain percentage of revenue from telecommunications companies and allocate it to scholarships in IT education and to research and development of information and communication technologies. Therefore, all grant funding for this national censorship project comes from domestic ISPs, mobile carriers, and telephone companies. The decision-making process by which the fund chooses projects and grant beneficiaries, however, is not described anywhere on its website.
Censorship and content filtering are part of a broader trend toward moral policing in Pakistan. Ever since the Pakistan Telecommunication Act of 1996 prohibited transmitting messages that are “false‚ fabricated‚ indecent or obscene,” the PTA has steadily intensified its efforts to censor content online. The PTA blocked thousands of sites in 2007—not just those containing pornographic material or content offensive to Islam, but numerous vital websites and services—in response to a Supreme Court ruling that ordered the blocking of “blasphemous” websites. In 2008, it briefly blocked YouTube because the site hosted Geert Wilders’ film “Fitna.” It blocked YouTube again in 2010, over a hosted clip of Pakistani President Asif Ali Zardari telling an unruly audience member to “shut up.” In May 2010, the PTA blocked Facebook in response to a controversy over a competition to draw the Prophet Mohammed.
Most recently, in November of last year, the PTA sent a notice to Pakistani mobile carriers ordering them to ban 1,600 terms and phrases from SMS texts within seven days or face legal penalties. It was soon revealed that the list originated from an American National Football League list of “naughty words” banned from being printed on football jerseys.
This new proposal is fundamentally different from Pakistan’s prior censorship efforts. First, it aims to find a non-governmental third party to design and implement a censorship mechanism. Second, this new system would, for the first time, automate the blocking and filtering process to facilitate comprehensive censorship of webpages. Previously, authorities have had to censor and block content manually, and the process has therefore been less than consistent.
A range of local Pakistani digital civil liberty organizations have come out against the PTA’s initiative. Bytes for All, a human rights organization based in Pakistan focused on digital security, online safety and privacy, responded to the announcement with a press release, which strongly criticized the government:
Bytes for All, Pakistan (B4A), strongly condemns this move of the Government and holds it akin to infringing citizens’ fundamental constitutional rights. For a democratically elected civilian government, implementing such a system is highly dictatorial in nature and will directly affect the freedoms and socio-economic well-being of the citizens, reflecting the tyrannical actions of repeated oppression by past military governments.
The statement goes on to call the attention of the UN Expert Panel on Human Rights on the Internet to the current situation. Another organization, Bolo Bhi, has sent a letter to the Ministry of Information Technology demanding transparency into the proceedings of this alarming initiative:
We feel that for successful implementation of a policy at all levels, transparency is crucial. We are a functioning democracy and therefore it is important to have stakeholders on board that could guide and assist on a policy before such a decision is made.
Both organizations call on international companies and institutions to refrain from responding to this proposal in the name of upholding the right to free expression. The RFP itself does not even attempt to explain or justify the need for the censorship system. However, the terms of reference briefly mention that such a system is needed “in order to block the specific URLs containing undesirable content as notified by PTA from time to time.”
The website for the National ICT R&D Fund states that its mission is “To transform Pakistan’s economy into a knowledge based economy by promoting efficient, sustainable and effective ICT initiatives through synergic development of industrial and academic resources.” For the past five years, the fund has backed domestic IT projects in education, health, and technology development, including some dubious projects in biometrics and other supposed security measures.
It is deeply ironic that the National ICT R&D Fund’s purported purpose is “to transform Pakistan’s economy into a knowledge based economy,” yet it calls for proposals for a project that is itself inherently backward and draconian. A national blocking and filtering system would thrust the entire society into a tailspin of repression that would do immeasurable damage to the economy. More importantly, this automated censorship regime would violate the human right to free expression and access to knowledge.
It’s clear that the authorities behind these institutions simply do not comprehend the massive socio-economic costs this would have on Pakistan. As Bolo Bhi wrote in their press statement: “At a time when we as a country are struggling to counter a popular narrative about us, further limiting the sphere would portray us as a grim totalitarian state, which is simply untrue.” If the government of Pakistan ever hopes to catch up as a hub of innovation and re-emerge into the international realm as a modern democratic nation, a repressive censorship program restraining Pakistani expression would not be the place to begin.
Ahead of the Academy Awards this weekend, Chris Dodd, head of the Motion Picture Association of America, would like to assure you that "Hollywood is pro-technology and pro-Internet." But what does that mean? The comments filed at the Copyright Office this month by MPAA and RIAA, together with the Business Software Alliance, the Entertainment Software Association, and other copyright owners' groups, paint a clear picture of these groups' vision for the future of the Internet and digital technologies.
EFF is asking the Copyright Office for legal exemptions to the Digital Millennium Copyright Act to allow jailbreaking (or "rooting") of smartphones, tablets, and game consoles, so that people can run their software of choice on the devices they own. EFF is also asking for exemptions that will allow noncommercial video remixers to use video clips from DVDs and online video services. Other organizations are asking for exemptions for various forms of digital video, accessibility for the disabled, and other important projects. Under the DMCA, exemptions expire every three years, and have to be justified all over again. Many of you sent comments and signed petitions in support of EFF's exemption requests, and the Copyright Office received almost 700 comments.
MPAA and friends don't approve of a single one of the exemption requests. "The risk associated with encouraging people to circumvent and test the limits of fair use is too high," they say, and the makers of computing devices should be able to stop "unintended uses" of their products. In fact, say the entertainment lobbies, giving you the ability to modify your own devices for your own use will "wreak havoc" on "markets for consumer access to works."
Let's unpack this. Almost everything we do on the Internet or with digital media makes a copy—even viewing a webpage. In many cases, the fair use rule of copyright law is what keeps these everyday activities from being copyright violations. But proving definitively that a use is fair often requires a courageous artist or entrepreneur to go to court and risk massive penalties for the chance of having a judge say that what they're doing is legal. According to the entertainment lobbies, the U.S. government should not encourage people to do this.
Ironically, most of the devices that let us create and experience movies, music, software, and so on "test the limits of fair use"—and many have wound up in court. If this had been discouraged, we might never have had the VCR, the MP3 player, the digital video recorder, image-searching websites, or social networks—at least not without asking the entertainment industries' permission first.
And speaking of permission, MPAA regrets that "the Copyright Office missed an opportunity to endorse" the custom of "asking permission" before innovating.
So what should the Copyright Office be doing? MPAA et al. humbly suggest that the Office should be protecting the "ongoing viability of business models" that create "predictability with respect to how works will be accessed and how copyrighted software and technologies used to facilitate such access will be used and manipulated." You won't find that in any law, although it sounds a lot like the goals of the now-defunct SOPA and PIPA bills. Again, let's look behind the euphemisms: the entertainment lobbies want the U.S. government to protect their members' bottom lines by regulating how digital technologies can be used. Only uses that receive Hollywood's permission, and are "predictable," should pass muster.
Apparently this is what Mr. Dodd means when he says "Hollywood is pro-technology and pro-Internet": technology that blocks "unintended uses" and an Internet subject to Hollywood's veto power. SOPA and PIPA may be dead, but the agenda behind them seems alive and well.
Note that disabling Viewing and Search History in your YouTube account will not prevent Google from gathering and storing this information and using it for internal purposes. It also does not change the fact that any information gathered and stored by Google could be sought by law enforcement.
With Viewing and Search History enabled, Google will keep these records indefinitely; with it disabled, they will be partially anonymized after 18 months, and certain kinds of uses, including sending you customized search results, will be prevented. An individual concerned about privacy may also want to set up a secondary Google account for browsing and sharing YouTube videos. She could then download all of her existing YouTube videos to her computer, delete them from her primary Google profile, and then use a separate browser to upload them to a new secondary Google account. If you want to do more to reduce the records Google keeps, the advice in EFF's Six Tips to Protect Your Search Privacy white paper remains relevant.
The following steps will delete your viewing and search history on YouTube. If you have multiple YouTube accounts, you will have to complete these steps for each account.
1. Log in to your Google account.
2. Go to https://www.youtube.com
3. Click on your icon.
4. Click “Video Manager.”
5. Click “History.”
6. Click “Clear all viewing history.”
7. Click “Pause viewing history.”
8. Click “Search History.”
9. Click “Clear all search history.”
10. Click “Pause search history.”