The U.S. Department of Health and Human Services (HHS) has proposed a sweeping update to the federal regulations that govern scientific experiments involving human subjects, whether that means studying behavior, testing biological specimens, or analyzing DNA. While the proposed policy [.pdf] generally moves in the right direction, EFF has filed formal comments outlining several serious concerns about how these rules would impact privacy.

The “Federal Policy for the Protection of Human Subjects”—often referred to as the “Common Rule”—is the ethical framework for biomedical and behavioral research established in the wake of medical scandals that shook the nation, including the now infamous Tuskegee Syphilis Study, in which the U.S. government withheld treatment and medical information from rural African-American men suffering from the disease. Much of the Common Rule revolves around two concepts: informed consent and independent review. These principles reflect the need for people to know the risks and benefits and what will happen to their specimens before agreeing to participate in an experiment, and the idea that researchers will make better ethical decisions with the guidance of oversight bodies.

The Common Rule is up for its first full update since it was published in 1991. In reviewing the policy, HHS is not only supposed to address ethical issues that have arisen over the last two decades but also to take a forward-looking approach to protect human subjects from potential abuses that may emerge in the future.

One of the most controversial elements in the proposed regulations is how researchers conduct secondary experiments on biospecimens. Under the current rule, researchers are not required to obtain informed consent for secondary research performed on “non-identified” biospecimens. In lay terms, researchers don’t need your permission to run tests on specimens taken for other reasons, such as the leftover blood or tissue from a routine doctor’s visit, as long as the specimens can’t be linked to you. However, as TechInsider explains:

But that "stripped of your personal information" bit is tricky—and easily bungled. Researchers have shown that today, they can use genetic testing and information on the internet, for example, to re-identify samples that were supposed to be anonymous. Nobody imagined that could happen back in 1991.

While HHS’s proposal makes progress on this issue, EFF thinks it should go further. Unfortunately, this is far from the only controversial aspect of the Common Rule update. The proposal also includes numerous loopholes and exemptions that we believe are unnecessary and potentially dangerous.

The following are among the issues EFF raises in our comments, authored by EFF Senior Staff Attorney Lee Tien and EFF Legal Intern Yonatan Moskowitz:

HIPAA Exception 

The proposed rule would exempt research covered under the Health Insurance Portability and Accountability Act (HIPAA). We believe this exception should be removed or narrowed significantly, because it mistakenly assumes that the existing statutory regime is up to date ethically and constitutionally. As we write in the comments:

The Common Rule cannot merely assume that existing law adequately considers the modern importance of respecting patients’ autonomy and privacy. An inherent weakness of HIPAA is that its Privacy and Security Rule only applies to “covered entities” and their business associates. When treatment providers and other entities subject to HIPAA disclose data to, for example, state government entities for public health purposes, such data is no longer protected by HIPAA unless the government entity is already subject to HIPAA. 

Consent in the Age of Big Data

We disagree with HHS’s assertion that it may not be as important to require informed consent in research that involves algorithmic analysis of “big data.” Instead, we argue that while “big data” does make research easier, these technological advancements can also enhance the rights of human subjects. HHS also should be wary that “big data” can reproduce and exacerbate existing inequalities and injustices.

As we write:

First, computational and data-storage advances have increased the ease with which researchers can receive, track, send, and enforce fine-grained consent on the same databases they will be manipulating to perform their research. 

Second, this ease of data transfer has had a paradigm-shifting impact on the ability of entities to aggregate deep databases on individuals—the disclosure of which have much more of an impact on patient privacy than the disclosure of databases did in the past.

Broad vs. Granular Consent

While the proposed rule would require informed consent for secondary research on “non-identified” biospecimens, the type of consent HHS envisions is not enough to protect subjects: 

Unfortunately, the updated Rule would implement this requirement via a broad consent for future research, under which any such research would be “exempt” research that would not require annual continuing review by an IRB. We seriously question this approach. First, broad consent to future research is arguably the least meaningful form of individual consent. The human subject will not know what the future biospecimen research entails, how it will affect him or her, how the biospecimen or research data will be shared, or which biospecimens they can expect to provide in the time period that this consent is presumed valid for.  

Second, while genomic-related research and technology is of great potential benefit, its rapid evolution also presents significant risk and uncertainty to privacy and social control, especially given the increasing use by law enforcement and government of genetic identification. And quite apart from the concerns about government access, use or disclosure of genetic data raises ethical and privacy issues for individuals in the employment and other private-sector contexts. 

The proposed rule is based on the assumption that “individual tracking” of test subjects is too much of a burden for researchers. We obviously disagree; individual tracking can be easily accomplished with modern technology, such as APIs. As we explain:

For example, if the specimens may only be used if the researcher reports which specimens they are using and which information they intend to extract, then the researcher can query the database for fields that record a more fine-grained consent for secondary research. An individual could therefore offer consent for only certain kinds of experiments, and could require that information to be received by researchers before they undertake any experiments. And before a person undertakes research on the specimen, they could be required to confirm that their study fits into the permission granted from the human subject, and to check with the original specimen-collecting researcher to confirm that the cumulative information gathered from the tests will not surpass the human subject’s individual de-anonymization threshold.
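The consent-checking scheme described in the excerpt above can be illustrated with a short sketch. This is purely hypothetical code; the record fields, function names, and the idea of a numeric “information budget” standing in for a subject’s de-anonymization threshold are our own illustrative assumptions, not part of the proposed rule or any real system.

```python
# Hypothetical sketch of fine-grained consent for secondary research.
# All names and the numeric "information budget" model are illustrative
# assumptions, not drawn from the proposed rule or any existing system.
from dataclasses import dataclass


@dataclass
class ConsentRecord:
    specimen_id: str
    allowed_categories: set  # research categories the donor consented to
    info_budget: int         # donor's de-anonymization threshold
    info_used: int = 0       # identifying information already extracted


def query_consented_specimens(registry, category, info_cost):
    """Return specimens whose donors consented to this research category
    and whose remaining information budget can absorb this test."""
    return [
        rec.specimen_id
        for rec in registry
        if category in rec.allowed_categories
        and rec.info_used + info_cost <= rec.info_budget
    ]


def record_test(registry, specimen_id, info_cost):
    """Charge a completed test against the donor's information budget."""
    for rec in registry:
        if rec.specimen_id == specimen_id:
            rec.info_used += info_cost


registry = [
    ConsentRecord("S-001", {"cancer"}, info_budget=10),
    ConsentRecord("S-002", {"cancer", "cardiology"}, info_budget=3),
]

# Both donors consented to cancer research, but a test extracting 5 units
# of identifying information exceeds S-002's budget of 3.
print(query_consented_specimens(registry, "cancer", info_cost=5))  # ['S-001']
```

The point of the sketch is that the bookkeeping is trivial for modern databases: each query simultaneously enforces the category of consent granted and the cumulative de-anonymization risk the subject agreed to bear.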


We’re concerned that the proposal allows the heads of the departments or agencies conducting the research to make the final decision about whether the research is covered by the Common Rule. We believe HHS should consider including a sanction or private cause of action for cases in which researchers or department heads erroneously grant exemptions.

Our primary concern here is to preserve meaningful oversight, which we believe requires after-the-fact accountability. The NPRM must lay out a more substantive process mandating more than one level of appeal/review by a human in all but the most clear-cut examples. 

We also warned HHS of the dangers of research institutions allowing researchers to determine on their own, or with an automated tool, whether their research is exempt.

Intelligence Activity Exemption 

The proposal includes exemptions for intelligence surveillance activities. Since the intelligence community has a long history of creatively interpreting laws and regulations, we believe any such exemption should be subject to heightened review. As we write:

[The policy] offers practically no limitation to an intelligence community with a history of expansively interpreting limited exemptions. There should be a discussion, a representative list, or at a minimum a modifier added here to give future courts or administrative law judges some sort of applicable standards to apply if a dispute arises.

Public Behavior Exemption

The proposed rule would also exempt studies of public behavior from independent review if the information is properly anonymized. However, the rule fails to define what behavior counts as public, or how public behavior must be before it no longer deserves protection. As we write, this exception could allow for inappropriate research into Internet browsing:

For example, information disclosed to Internet Service Providers including requests for pages and documents to be sent to a particular IP address should not be considered public merely because they are not occurring in a single physical place in one person’s home. This is true even if tools can be developed to observe this—and only this—information from a “public” place that the information happens to pass through on its way to a destination.

What’s Next 

These items are only a selection of our concerns. However, as critical as we are of the proposed regulation, there’s serious risk that it could get even worse. Biotech and pharmaceutical companies and well-funded research institutions have pushed back hard in the other direction, since they would prefer to conduct experiments with as little oversight and restriction as possible. We’ll be watching closely for developments, including further opportunities to lodge feedback and opposition.

For more information, read our filing, as well as the World Privacy Forum's comments [.pdf].
