Predictive AI is a biased tool in a flawed criminal justice system: Brookings Institution panel

February 20, 2020


Predictive AI is a biased tool in an already flawed criminal justice system, one that uses risk assessment tools to predict the likelihood that someone will be re-arrested, regardless of innocence or guilt, according to a Brookings Institution panel discussion.

“The tools, really, are just an instrument in our whole, flawed justice system” — Dr. Faye Taxman

The US criminal justice system currently uses predictive analytics to determine the probability of someone being re-arrested, but what purpose does it serve and for whom? How accurate are the tools across different racial groups?

On Wednesday, the Brookings Institution held a panel on “AI, Predictive Analytics, and Criminal Justice,” where professors and civil rights leaders argued that the criminal justice system is flawed and that predictive analytics tools should not be used until they are improved.

So, what’s wrong with predictive analytics tools? For starters, they don’t do well at predicting re-arrests of black Americans.

According to retired Texas A&M professor Dr. Edwina Dorch, COMPAS, a risk assessment tool currently used in the criminal justice system, correctly predicts whether a black American will recidivate only 63 percent of the time.

“It [COMPAS] can only predict if a black American will be re-arrested 63 percent of the time.

“We should want to know what factors are being left out. What factors should go in there to make it predict a hundred percent of the time rather than 63 percent of the time.”

Dr. Edwina Dorch

“Forty percent of the people recidivate within the first year, but part of that is being caused by the fact that we don’t have the correct variables in the risk assessment instrument”
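To make that framing concrete, here is a minimal sketch of the kind of per-group accuracy check the panel is describing. The records, group labels, and numbers below are hypothetical stand-ins, not COMPAS data; the point is only that a single overall accuracy figure can mask large differences between groups.

```python
from collections import defaultdict

# Hypothetical records: (group, predicted_rearrest, actually_rearrested).
records = [
    ("A", True, True), ("A", False, False), ("A", True, True), ("A", True, False),
    ("B", True, False), ("B", False, True), ("B", True, True), ("B", False, False),
]

hits, totals = defaultdict(int), defaultdict(int)
for group, predicted, actual in records:
    totals[group] += 1
    hits[group] += int(predicted == actual)  # count correct predictions

for group in sorted(totals):
    print(f"group {group}: accuracy {hits[group] / totals[group]:.0%}")
# group A: accuracy 75%
# group B: accuracy 50%
```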

What are those factors, you may ask?

The factors, or variables, that are not taken into account by current risk assessment tools include those outside the individual’s control. In other words, not all of the relevant variables are based on human behavior; some are “environmental variables.”

Environmental variables can include things like drug laws or bail reform, which have nothing to do with someone’s personal circumstances.

For example, when stop-and-frisk was reversed in New York, arrests went down “precipitously.”

When arrests go down due to a change in the law (an environmental variable), the probability of getting arrested and re-arrested goes down as well, and it has nothing to do with the individual’s behavior.

“We don’t have the services that you need in order to get you to reduce your risk” — Dr. Edwina Dorch

Environmental variables are beyond the scope of current risk assessment instruments, and this makes it harder to predict recidivism.

After all, if a person were to be arrested for marijuana possession, but then the state decriminalized personal possession, there would be no recidivism for a crime that doesn’t exist anymore.
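Dorch’s argument lends itself to a small thought experiment, sketched below. Everything in it is invented for illustration (the re-arrest rates, the size of the policy effect, and the two-level personal risk factor): a tool fit on arrest data gathered under one policy regime keeps quoting those old rates after the policy changes, even though no individual’s behavior has changed.

```python
import random

random.seed(0)

# Toy re-arrest process: it depends on an environmental variable
# (policing intensity) as well as a personal one, but the "risk tool"
# below only ever sees the personal factor. All rates are made up.
def rearrested(personal_risk, strict_policing):
    base = 0.40 if strict_policing else 0.15
    return random.random() < base + 0.30 * personal_risk

levels = (0.0, 1.0)  # low-risk and high-risk individuals
n = 5000

# "Train" the tool while strict policing is in effect.
learned = {r: sum(rearrested(r, True) for _ in range(n)) / n for r in levels}
print("tool's learned re-arrest rates:", learned)

# Apply the same tool after the policy changes (e.g., stop-and-frisk ends).
actual = {r: sum(rearrested(r, False) for _ in range(n)) / n for r in levels}
print("actual re-arrest rates after reform:", actual)
# The tool keeps quoting the old, higher rates even though the individuals
# did not change; only the environment did.
```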

“Forty percent of the people recidivate within the first year, but part of that is being caused by the fact that we don’t have the correct variables in the risk assessment instrument, and the other part of that being caused by the fact that we don’t have the services that you need in order to get you to reduce your risk,” Dorch added.

Sakira Cook, director of the Justice Reform program at The Leadership Conference on Civil & Human Rights, made another interesting point to the Brookings Institution panel: crime data and arrest data are different, and the two data sets serve different purposes.

Sakira Cook

“It’s very difficult to create a tool that’s fair along racial lines, and that is a huge problem for us”

“What we know about the criminal legal system is that it is inherently biased,” she said, adding, “and there are grave disparities in how African Americans and Latinos and other people of color experience the criminal legal system and then how white people experience it.”

“Those disparities are borne out in the data that we see with respect to ‘crime data,’ but that’s not actually crime data; that’s arrest data.

“It actually just tells us what police are doing in various communities.

“And that’s what these tools are based on. Foundationally, they are based on arrest data.”

Cook makes a good point: arrest data does not equal crime data. You can be arrested for anything, but “innocent until proven guilty” is the law of the land.

“These tools are not transparent” — Sakira Cook

So, what purpose do predictive analytics actually serve? Are they about preventing crime and supporting rehabilitation, or do they only predict re-arrests regardless of innocence or guilt?

It all depends on the type of data gathered and how it’s used. According to the panelists, the data sets are severely flawed in their variables and are biased along racial lines.

“Designers are using historical data sets to make predictions about what people will do in the future,” Cook continued.

“Think about that. All of us have made some mistake in the past. We’ve all done something that maybe we’re not proud of, and if that is going to be the cornerstone of what determines our liberty […], should we be using historical data to determine those things?

“AI could be used for good, but only if the goals are aligned with what we’re ultimately trying to have happen” — Sakira Cook

“Should that be the fundamental basis upon which we should make those types of decisions? And we believe no.”

“These tools are not transparent. In many places the designers of the tools will enter into agreements with the governments who are buying them, which say you can’t even interrogate the data,” continued Cook.

“Is the data from 10 years ago? Is the data from two years ago? We won’t know that.

“You won’t be able to independently validate that because outside researchers won’t have access to the data without signing an NDA.”

“AI could be used for good, but only if the goals are aligned with what we’re ultimately trying to have happen — only if the tool is designed in such a way to meet that goal. If it isn’t, you’re going to have an outcome that might favor one group over another.

“It’s very difficult to create a tool that’s fair along racial lines, and that is a huge problem for us,” Cook concluded.

Dr. Faye Taxman

“The criminal justice system was not designed to really think about rehabilitation”

Faye Taxman, director of the Center for Advancing Correctional Excellence at George Mason University, rhetorically asked the Brookings Institution panel, “Do we need instruments at all?”

She answered her own question, stating, “As a scientist I would say we need instruments. We need better instruments than the ones we currently have.”

“This dilemma that we have I think is really a dilemma now about how do we create instruments that are fair, transparent, and actually work to help people address the factors of how they got involved in criminal behavior.

“We have no national standards for how these tools should be developed” — Dr. Faye Taxman

“Right now we have a system that basically anyone can design an instrument and say it’s a good instrument, and we have no checks and balances in our process.

“We have no national standards for how these tools should be developed.

“We need a commission, I believe, that looks at the existing tools.”

“We don’t have well-designed instruments, and we as a public should demand better.

“The tools, really, are just an instrument in our whole, flawed justice system,” Taxman added.

“I think we as a society should be asking better of our state and federal and local governments in terms of better policies for how we deal fairly with these sorts of issues” — Dr. Faye Taxman

“The criminal justice system was not designed to really think about rehabilitation. We’ve been struggling with this for a long time, and so we make decisions based upon the potential risk that a person has to be re-involved in the justice system,” Taxman added.

If the risk assessment tool’s purpose is to predict whether or not someone will be arrested more than once, are there mechanisms in place to get that person help, so they don’t get re-arrested?

In other words, does the technology better serve society through rehabilitation, or does it better serve the prosecutors whose job is to incarcerate?

