In November, Californians voted to reject a 2018 law that would have ended cash bail and replaced it with a digital pretrial risk assessment tool that dictates whether a person can be released while awaiting trial. By voting No on Proposition 25, Californians chose to keep the cash bail system rather than replace it with automated pretrial risk assessments.

EFF did not take a position on the ballot measure. Much like the ACLU of Northern California, EFF believed that the proposition, whatever its outcome, would not create a fair pretrial system. However, EFF has done extensive research on pretrial risk assessment algorithms and has worked through the legislature to prevent the deployment of unfair tools.

Pretrial risk assessment tools come with their own risks and potential pitfalls, and it is vital that Californians consider what is required to address their potential harms. Though the proposition failed, these tools are already in use in 49 of California's 58 counties as part of their bail systems, according to a December 2019 report from the Public Policy Institute of California.

There are many reasons to be concerned about replacing cash bail with an algorithm that categorizes people as low-, medium-, or high-risk, releasing some while leaving others in jail.
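
To make the mechanics concrete, here is a minimal sketch of how such a tool typically operates: a numeric score is binned into a category, and the category drives the recommendation a judge sees. The score scale, thresholds, and category labels below are our own illustrative assumptions, not any real vendor's implementation.

```python
# Hypothetical sketch of a pretrial risk assessment pipeline.
# The 0-10 scale, thresholds, and recommendations are illustrative
# assumptions, not drawn from any actual tool.

def categorize(score: float) -> str:
    """Bin a risk score (assumed 0-10 scale) into a category."""
    if score < 4:
        return "low"
    elif score < 7:
        return "medium"
    return "high"

def recommendation(score: float) -> str:
    """Map the category to the recommendation a judge would see."""
    return {
        "low": "release on own recognizance",
        "medium": "release with supervision",
        "high": "detain pending trial",
    }[categorize(score)]

print(recommendation(3.2))  # release on own recognizance
print(recommendation(8.5))  # detain pending trial
```

Note how much turns on where the thresholds sit: a person scoring 6.9 and a person scoring 7.0 may behave identically, but only one of them goes home.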

Digital pretrial risk assessment raises concerns similar to those around predictive policing. Both rely on data generated by a racially biased criminal justice system to determine who is, and who is not, a threat to society. This can have a devastating impact on people's lives, their well-being, and that of their families. In the case of predictive policing, the algorithm can flag someone as a risk and subject them to near-constant police harassment. In the case of risk assessment, a person flagged as unreleasable may sit in jail for months or years awaiting trial for no reason other than the algorithm's output.

Some see risk assessment tools as more impartial than judges because they make determinations using algorithms. That assumption ignores the fact that algorithms fed biased data, or developed carelessly, can produce the same discriminatory outcomes as existing systems that rely on human judgment. They can even make new, unexpected errors, or entrench systemic bias behind a veneer of supposed impartiality.
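
One toy simulation, with entirely hypothetical numbers, shows how this happens: if two groups behave identically but one group's conduct is recorded by police more often, a tool trained on those records will score the groups differently even though their true risk is the same.

```python
import random

random.seed(0)

# Hypothetical illustration: groups A and B have the IDENTICAL
# underlying rate of reoffense (20%), but suppose group A's conduct
# is policed and recorded twice as often as group B's. The rates
# below are made-up assumptions for the sake of the demonstration.
TRUE_RATE = 0.20
RECORD_PROB = {"A": 0.9, "B": 0.45}  # chance an offense enters the record

def apparent_risk(group: str, n: int = 100_000) -> float:
    """Fraction of the group the historical record labels as reoffenders."""
    recorded = 0
    for _ in range(n):
        offended = random.random() < TRUE_RATE
        if offended and random.random() < RECORD_PROB[group]:
            recorded += 1
    return recorded / n

# A tool "trained" on this record inherits the recording disparity.
for group in ("A", "B"):
    print(f"group {group}: apparent risk {apparent_risk(group):.1%} "
          f"(true risk {TRUE_RATE:.0%})")
# group A: apparent risk ~18% (true risk 20%)
# group B: apparent risk ~9%  (true risk 20%)
```

The model never sees race or group membership in this sketch; the disparity enters entirely through whose past conduct got written down.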

This system creates a scenario in which some people, arbitrarily or discriminatorily flagged by a computer as high risk, are left to sit in jail without ever knowing what data dictated the outcome. This is not a merely theoretical concern. Researchers at Dartmouth College found in January 2018 that one widely used tool, COMPAS, incorrectly classified black defendants as being at risk of committing a misdemeanor or felony within two years at a rate of 40%, versus 25.4% for white defendants. Computers, especially ones operating on flawed or biased data, should not dictate who gets to go home and who does not, nor should they be given undeserved weight in a judge's decision about release.
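
Those percentages are false positive rates: among defendants who did not go on to reoffend, the share the tool nonetheless flagged as risky. Here is a minimal sketch of how that metric is computed per group, using made-up records rather than the actual COMPAS data:

```python
# Made-up example records, NOT COMPAS data, chosen so the group
# false positive rates roughly echo the figures cited above.
records = [
    # (group, predicted_high_risk, actually_reoffended)
    ("black", True,  False),
    ("black", True,  False),
    ("black", False, False),
    ("black", False, False),
    ("black", False, False),
    ("black", True,  True),
    ("white", True,  False),
    ("white", False, False),
    ("white", False, False),
    ("white", False, False),
    ("white", False, True),
]

def false_positive_rate(group: str) -> float:
    """Among people in the group who did NOT reoffend, the share
    the tool nonetheless labeled high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    wrongly_flagged = [r for r in non_reoffenders if r[1]]
    return len(wrongly_flagged) / len(non_reoffenders)

for group in ("black", "white"):
    print(f"{group}: false positive rate {false_positive_rate(group):.0%}")
# black: false positive rate 40%
# white: false positive rate 25%
```

A tool can look accurate in the aggregate while distributing its mistakes this unevenly, which is exactly why overall accuracy alone says little about fairness.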

Any digital tool hoping to replace cash bail should provide the public with straightforward answers about what data factors into its decisions before it is used in a courtroom. EFF has previously submitted comments to the California Judicial Council outlining our recommended guardrails for such tools, if court systems must use them. These include questions such as: How much transparency will there be about how the algorithm functions and what data went into its development? Will there be an appeals process for people who feel their case has not been fairly adjudicated? In what instances could or would a judge overturn the advice given by the assessment tool?

The pandemic is disproportionately hurting incarcerated individuals across the country. Pretrial risk assessment tools should not be used unless they are equitable, transparent, and fair—and there are deep challenges to overcome before any tool can make those claims. People need to know if the assessment tool deciding who gets to leave jail will also condemn individuals to imprisonment without reprieve in an arbitrary or racialized way.

This article is part of our Year in Review series. Read other articles about the fight for digital rights in 2020.