Location tracking apps. Spyware to enforce quarantine. Immunity passports. Throughout 2020, governments around the world deployed invasive surveillance technologies to contain the COVID-19 outbreak.
But heavy-handed tactics like these undercut public trust in government, precisely when trust is needed most. They also invade our privacy and chill our free speech. And all too often, surveillance technologies disparately burden people of color.
In the United States, EFF and other digital rights advocates turned back some of the worst proposals. But they’ll be back in 2021. Until the pandemic ends, we must hold the line against ill-considered surveillance technologies.
Automated contact tracing apps
Contact tracing is a common public health response to contagious disease. In its traditional form, officials interview an infected person to determine who they had contact with, and then interview those people, too. Many have sought to automate this process with new technologies. But an app will not save us.
Some proposals would be both privacy-invasive and ineffective. For example, tracking our location with GPS or cell-site location information (CSLI) would expose whether we attended a union meeting or a BLM rally; that's why police need a warrant to seize such data. Yet location data is not granular enough to show whether two people were close enough to transmit the virus: the CDC recommends six feet of social distance, but CSLI is accurate only to about half a mile, and GPS only to about 16 feet. EFF therefore opposes location tracking, even as some countries are using it.
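A back-of-the-envelope check makes the accuracy gap concrete (a simple illustration, assuming worst-case error of 16 feet per GPS reading):

```python
# Can GPS (accurate to ~16 ft per reading) tell whether two people
# were within the CDC's 6-foot distancing threshold?
GPS_ERROR_FT = 16   # assumed worst-case error per reading
THRESHOLD_FT = 6    # CDC social-distance recommendation

# Each of the two positions can be off by up to GPS_ERROR_FT in any
# direction, so the measured separation between two phones can differ
# from the true separation by up to twice that.
worst_case_uncertainty = 2 * GPS_ERROR_FT   # 32 ft

# Even a measured distance of zero is consistent with a true distance
# anywhere up to the uncertainty bound -- far beyond 6 ft.
print(worst_case_uncertainty > THRESHOLD_FT)  # True: GPS cannot resolve 6 ft
```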
Another approach is tracking our proximity to others by measuring Bluetooth signal strength. If two people install compatible proximity apps, and come close enough together to transmit the virus, then their apps will exchange digital tokens. Later, if one becomes ill, the other can be notified.
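The token exchange described above can be sketched as follows. This is a minimal illustration of the general idea, not any particular app's protocol; the `ProximityApp` class and its methods are invented for the example:

```python
import secrets

class ProximityApp:
    """Minimal sketch of Bluetooth-style proximity token exchange.

    Each phone broadcasts short-lived random tokens and remembers the
    tokens it hears from nearby phones. No names or locations are involved.
    """
    def __init__(self):
        self.sent = set()    # tokens this phone has broadcast
        self.heard = set()   # tokens received from nearby phones

    def broadcast_token(self):
        token = secrets.token_hex(16)   # random; meaningless on its own
        self.sent.add(token)
        return token

    def receive_token(self, token):
        self.heard.add(token)

def encounter(a, b):
    """Two phones close enough to exchange tokens over Bluetooth."""
    b.receive_token(a.broadcast_token())
    a.receive_token(b.broadcast_token())

# Later, if one user tests positive, they can share the tokens they sent;
# other users check whether any shared token appears in their own records.
alice, bob = ProximityApp(), ProximityApp()
encounter(alice, bob)
exposed = bool(alice.sent & bob.heard)   # Bob heard one of Alice's tokens
print(exposed)  # True
```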
Proximity tracking might or might not help at the margins. It will be over-inclusive: two people standing a few feet apart might be separated by a wall. It also will be under-inclusive: many people don’t have smartphones, and many more won’t use a proximity app. Moreover, no app can fill the as-yet unmet need for traditional public health measures, such as testing, contact tracing, support for patients, PPE for health workers, social distancing, and wearing a mask.
Proximity apps must be engineered for privacy. Unfortunately, many are not. In a “centralized” model, the government has access to all the proximity data and can match it to particular people. This excessively threatens digital rights.
A better approach is Google Apple Exposure Notification (GAEN). It collects only ephemeral, random identifiers that are harder to correlate to particular individuals. GAEN also stores these identifiers on users' own phones, unless a user tests positive, in which case they can upload the identifiers to a publicly accessible database. Public health authorities in many U.S. states and foreign nations sponsor GAEN-compliant apps.
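The decentralized matching step at the heart of this design can be sketched in a few lines. This is a deliberately simplified illustration, not the actual GAEN protocol (which derives rotating identifiers from daily keys using HKDF and AES); the function names are invented for the example:

```python
import hashlib
import secrets

def daily_key():
    """A random per-device key that never leaves the phone unless
    the user chooses to report a positive test."""
    return secrets.token_bytes(16)

def rolling_ids(key, intervals=144):
    """Derive ephemeral identifiers from a daily key (simplified:
    a plain hash stands in for GAEN's HKDF + AES derivation)."""
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

# Phones broadcast rolling identifiers and record the ones they hear.
key_a = daily_key()
ids_a = rolling_ids(key_a)
heard_by_b = {ids_a[42]}          # B was near A during one interval

# If A tests positive, A uploads only daily keys to a public database.
published_keys = [key_a]

# B re-derives identifiers locally and checks for a match; the server
# never learns who B is or whom B met.
exposed = any(set(rolling_ids(k)) & heard_by_b for k in published_keys)
print(exposed)  # True
```

The design choice worth noting is that matching happens on the user's own phone: the public database holds only keys from users who chose to report, never a social graph.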
Participation must be voluntary. Higher education, for example, must not require students, faculty, and staff to submit to automated contact tracing. We need laws that prohibit schools, workplaces, and restaurants from discriminating against people who do not use proximity tracking.
Surveillance to enforce quarantine
Some countries have used surveillance technologies to enforce home quarantine. These include compelling people to wear GPS-linked shackles, to download government spyware onto their personal phones, and to send the government selfies stamped with time and place.
EFF opposes such tactics. Compelled spyware unduly invades the right of individuals to autonomously control their smartphones. GPS shackles invade location privacy, cause pain, and trigger false alarms. Home selfies expose sensitive information, including grooming in private, presence of other people, and expressive effects such as books and posters.
Fortunately, governments in the United States largely have not used these tactics. The exception is a small number of cases involving people who tested positive and then allegedly broke stay-at-home instructions.
Immunity passports
Some have proposed “immunity passports” to screen people for entry to public places. The premise is that a person is not fit to enter a school, workplace, or restaurant until they can prove they have tested negative for infection or supposedly obtained immunity through past infection. Such systems may require a person to use their phone to display a digital credential at a doorway.
EFF opposes such systems. They would aggravate existing social inequities in access to smartphones, medical tests, and health treatment. Moreover, the display or transmission of credentials at doorways would create new information security vulnerabilities. These systems also would be a significant step toward national digital identification that could be used to collect and store our personal information and track our movements. And inevitable system errors would needlessly block people from going to school or work.
Further, such systems would not advance public health. Tests of infectiousness have high rates of false negatives, and do not account for new infection after testing. Likewise, it remains unclear how much protection a past infection provides against a future infection.
Fortunately, California’s governor this fall vetoed a bill (A.B. 2004) that would have laid the groundwork for immunity passports. Specifically, it would have created a blockchain-based system of “verifiable health credentials” to report COVID-19 and other medical test results. EFF opposed it.
Processing our COVID-related data
While some of the worst ideas did not gain traction in 2020, the news is not all good. Governments and corporations are processing all manner of our COVID-related data, and existing laws do not adequately secure it.
States are conducting manual contact tracing, often contracting with businesses to build new data management systems. States also are partnering with businesses to create websites where we provide our health and other information to obtain screening for COVID-19 testing and treatment. Meanwhile, the U.S. Department of Health and Human Services has expanded its processing of data about people who took COVID-19 tests, and the federal government has announced plans to share COVID-related data with its own corporate contractors, including TeleTracking Technologies and Palantir.
Businesses are also expanding their surveillance of workers. This occurs at job sites, in the name of tracking infection, and in socially distant home offices, in the name of tracking productivity.
There are many ways to misuse our COVID-related data. Companies might divert it to advertising. It might be stolen by identity thieves, stalkers, and foreign nations. In New Zealand, a restaurant employee even used COVID data to send harassing messages to a customer.
Moreover, public health officials and their corporate contractors might share our COVID-related data with police and immigration officials. This would frustrate containment of the outbreak, because many people will share less of their personal information if they fear the government will use it against them. Yet in some communities, police are conducting contact tracing or obtaining public health data about the home addresses of patients. The outgoing administration even proposed deploying the National Guard to hospitals to process COVID-related personal data.
Existing data privacy laws do not adequately secure our COVID-related data. For example, HIPAA’s protections of health data apply only to narrowly defined healthcare providers and their business associates. This is one more illustration of why we need a comprehensive federal consumer data privacy law.
In the short run, we need COVID-specific data privacy legislation. But efforts to enact it have stalled in Congress and state legislatures.
As pandemic fatigue sets in, the temptation will grow to try something—anything—even if it is unlikely to contain the virus and highly likely to invade our digital rights. So, we probably haven’t heard the last of location tracking apps, immunity passports, and spyware for patients. Other bad ideas may gain momentum, like dragnet COVID-19 surveillance with face recognition, thermal imaging, or drones. And we still need new privacy laws to lock down all of our COVID-related personal data.
Looking to 2021, we must remain vigilant.
This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2020.