Technology Adoption around the Criminal Justice System Is a Tightrope

By Ben Winters, 2019 Equal Justice Works Fellow hosted by the Electronic Privacy Information Center. Ben’s Fellowship is supported by the Philip M. Stern Fellowship.

The American criminal justice system holds an estimated 2.3 million people in state and federal prisons, juvenile correctional facilities, local jails, and immigration detention facilities. As public outrage continues to grow over the incarceration of people in the U.S., some jurisdictions have turned to pretrial risk assessment technology with the stated goal of minimizing that population. At the same time, predictive policing tools, face surveillance tools, government and corporate surveillance, and gang databases continue to proliferate.

Some resources, developed and delivered through basic technology, do help clients clear burdensome administrative hurdles, understand rules and regulations, and find the right forms in an underfunded and overly complex legal system. Most tools sold to or adopted by governments, however, cause harm and carry biases that often go unrecognized and unregulated.

In 2018, more than 100 civil rights organizations signed a statement warning against the use of pretrial risk assessment tools. In February 2020, two high-profile proponents of the tools qualified their support. Since 2019, two major predictive policing programs have been shut down after oversight bodies published damning findings, and several U.S. cities have banned facial recognition. Still, nearly every state uses some form of algorithmic risk assessment in its criminal justice system, and adoption continues.

In the criminal justice system, where life and liberty are at stake, automated technology must be held to a higher standard. Yet few regulations exist anywhere in the country to protect individuals against technological harm. Regardless of the intentions behind a tool, if you feed it crime data that already contains racial bias, it will produce outputs that are biased in the same way. This means that your zip code, social network, income, race, or education level could be the reason you're not eligible for bail, rather than the specific details of your case.
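
To make the bias-in, bias-out dynamic concrete, here is a minimal sketch, assuming a toy logistic-regression "risk score" trained on entirely synthetic data; the feature names, rates, and model are illustrative assumptions, not any vendor's actual system. In the sketch, two groups re-offend at exactly the same rate, but one is policed more heavily, so its re-offenses are recorded more often, and a model trained on those records assigns that group a much higher risk score for identical behavior:

```python
# Hypothetical illustration with synthetic data, not any real
# risk-assessment tool: biased labels produce biased risk scores.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Two neighborhoods; group 1 is policed more heavily (an assumption).
zip_group = rng.integers(0, 2, n)

# True re-offense behavior is identical: 20% in both groups.
true_reoffense = rng.random(n) < 0.20

# Recorded rearrests reflect enforcement intensity, not behavior:
# re-offenses in the heavily policed group are caught far more often.
catch_rate = np.where(zip_group == 1, 0.9, 0.3)
recorded_rearrest = true_reoffense & (rng.random(n) < catch_rate)

# Train a "risk score" on the biased labels, with zip code as a feature.
model = LogisticRegression().fit(zip_group.reshape(-1, 1), recorded_rearrest)

for g in (0, 1):
    p = model.predict_proba([[g]])[0, 1]
    print(f"zip_group={g}: predicted risk = {p:.2f}")
# Typical output: roughly 0.06 for group 0 vs. 0.18 for group 1, a
# threefold disparity that comes entirely from where enforcement happened.
```

The model is not "wrong" in a narrow statistical sense; it faithfully reproduces the enforcement patterns in its training data. That is precisely the problem when those patterns, rather than individual conduct, determine who is labeled high risk.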

Automated technology in the criminal justice system has become an industry where gaps in access to justice are encoded and exacerbated. Still, widespread adoption moves forward.

Intersectional consideration and transparency need to be priorities before this technology is adopted. Access-to-justice issues shouldn't be treated as an industry problem for tech firms to expand their business.

To learn more about Ben’s Fellowship project, visit his profile.

