
Philadelphia is grappling with the prospect of a racist computer algorithm

Should it be fair? Should it be accurate? Can it be both? A hearing at City Council this week tried to make sense of the tradeoffs of a new tool that will help judges decide who to keep in jail.

The Special Committee on Criminal Justice Reform’s hearing on reducing the pre-trial jail population. (Photo by Juliana Reyes)
After talk of algorithms, “uncomfortable predictors” and training data, William Cobb steered away from the details and asked the question that he and his fellow committee members had danced around for the last half hour.

The committee, a task force focused on criminal justice reform, was discussing “risk assessment tools” — computer programs designed to predict the likelihood that someone who’s been arrested will get arrested again. Advocates have argued that these tools could unfairly target certain groups, namely Black and brown men who live in high-crime areas.

That’s why Cobb, who founded a re-entry organization called Redeemed PA, began by describing himself: I’m a Black man, he said. I live in a poor neighborhood. I’ve been arrested before.

“As a Black male,” Cobb asked Penn statistician and resident expert Richard Berk, “should I be afraid of risk assessment tools?”

“No,” Berk said, without skipping a beat. “You gotta tell me a lot more about yourself. … At what age were you first arrested? What is the date of your most recent crime? What are you charged with?”

Cobb interjected: “And what’s the culture of policing in the neighborhood in which I grew up in?”

It was that point — the point that a legacy of racist policing and policies would inform the data behind the risk assessment model — that the committee seemed to grapple with the most.


The forthcoming risk assessment model is one way the city is trying to lower its jail population, as part of a $6 million project backed by a $3.5 million MacArthur Foundation grant. The gist is that Philly needs to do a better, smarter job of releasing those who are arrested, instead of keeping them in jail for months before their trial (the average Philly inmate spends three months in jail awaiting trial, four times the national average, according to a NewsWorks report). A risk assessment model aims to give judges a data-driven way to inform their decisions on how high to set bail and who to release.
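To make the idea concrete, here is a minimal sketch of what such a score can look like under the hood: a few facts from an arrest record, of the kind Berk listed at the hearing, combined into a single probability. Everything in it — the feature names, the weights, the logistic form — is invented for illustration and is not how Philadelphia’s tool is or will be built.

```python
import math
from dataclasses import dataclass

@dataclass
class ArrestRecord:
    age_at_first_arrest: int
    prior_arrests: int
    current_charge_is_violent: bool

# Hypothetical weights; a real tool would estimate these from historical court data.
WEIGHTS = {
    "intercept": -1.5,
    "age_at_first_arrest": -0.05,        # older at first arrest -> lower score
    "prior_arrests": 0.30,               # more prior arrests -> higher score
    "current_charge_is_violent": 0.80,   # charge severity bumps the score
}

def risk_score(record: ArrestRecord) -> float:
    """Estimated probability of re-arrest under a toy logistic model."""
    z = (WEIGHTS["intercept"]
         + WEIGHTS["age_at_first_arrest"] * record.age_at_first_arrest
         + WEIGHTS["prior_arrests"] * record.prior_arrests
         + WEIGHTS["current_charge_is_violent"] * record.current_charge_is_violent)
    return 1.0 / (1.0 + math.exp(-z))

# A judge-facing tool would map the probability onto a recommendation,
# e.g. release, release with conditions, or a bail range.
print(round(risk_score(ArrestRecord(19, 2, False)), 2))
```

The committee’s worry lives inside those features and weights: if the historical data reflects who got policed rather than who reoffended, the model will learn that pattern and hand it back as a number.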

The city has budgeted $100,000 for the tool, to be built by Berk, who developed a similar tool for the city’s Adult Probation and Parole Department. It’s still early in the process: the courts aim to have the tool up and running within the next two years, said courts spokesman Gabriel Roberts.

Read these pieces in City & State PA and NewsWorks for a more in-depth look at the topic.


“If we strive for fairness, unfortunately, we’re going to lose some accuracy,” said Richard Berk. (Photo by Juliana Reyes)

Everyone seems to agree that the city’s current system is broken and that giving judges more data to consider is better than giving them none. The question now is how to implement this solution.

Berk, whom committee co-chair Councilman Curtis Jones, Jr. repeatedly called a “rock star,” has been frank about the realities of the system.

“If we strive for fairness, unfortunately, we’re going to lose some accuracy,” he said at the Council hearing. “That’s a price. It’s a price you may choose to pay.”

At the hearing, he played the role of the academic removed from questions of ethics. When asked whether certain data points should be included, he repeatedly told the committee that those weren’t his decisions to make.

“It’s not something I can decide for you,” he said.

This was an interesting role for him to play, given that some of the committee members, like Cobb and committee co-chair Keir Bradford-Gray, yearned for Berk to give them answers about the morality of the tool.

“In criminal justice, we have to weigh fairness against other things. Isn’t that the tenet of the criminal justice system, to be fundamentally fair?” said Bradford-Gray, chief defender of the Defender Association of Philadelphia. “Maybe I’m an idealist … but I think that’s what a criminal justice system should have at its core or else we’re going to continue to get these same systemic, oppressive types of policies that we have for certain people that we don’t have for others. That’s where we are right now.”

To which Berk responded: “I have no disagreement with that whatsoever. All I’m saying is, in the real world, we can’t have it all.”
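For readers who want to see the shape of that tradeoff, here is a small, self-contained sketch on synthetic data. It uses one deliberately simple notion of fairness — flag both groups at the same rate — and invented numbers; it is not drawn from Berk’s work or the city’s data. The point is only that once the training labels differ by group, equalizing outcomes costs some accuracy, which is the price Berk is describing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: two groups where the recorded "re-arrest" label is more
# common in group B -- a stand-in for how uneven policing can skew the data,
# not a claim about any real population.
n = 10_000
group = rng.integers(0, 2, n)                 # 0 = group A, 1 = group B
base_rate = np.where(group == 1, 0.45, 0.25)
y = rng.random(n) < base_rate                 # label the model is trained on

# A toy risk score that tracks the label but also leaks group information,
# mimicking a model that has absorbed group-correlated features.
score = 0.6 * y + 0.2 * group + 0.2 * rng.random(n)

def accuracy(pred):
    return (pred == y).mean()

def flag_rate(pred, g):
    return pred[group == g].mean()

# Accuracy-first: one global threshold for everyone.
global_pred = score > 0.5
print(f"global threshold  acc={accuracy(global_pred):.3f}  "
      f"flagged A={flag_rate(global_pred, 0):.2f}  B={flag_rate(global_pred, 1):.2f}")

# Fairness-first (one simple notion of it): per-group thresholds chosen so
# both groups are flagged at the same overall rate.
target = global_pred.mean()
fair_pred = np.zeros(n, dtype=bool)
for g in (0, 1):
    mask = group == g
    cutoff = np.quantile(score[mask], 1 - target)
    fair_pred[mask] = score[mask] > cutoff

print(f"equal flag rates  acc={accuracy(fair_pred):.3f}  "
      f"flagged A={flag_rate(fair_pred, 0):.2f}  B={flag_rate(fair_pred, 1):.2f}")
```

In this contrived setup, the single threshold matches the skewed labels almost perfectly but flags group B far more often; forcing equal flag rates trades roughly ten points of accuracy for parity. Whether that price is worth paying is exactly the question Berk keeps handing back to the committee.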

The eventual decision on which factors to include and how heavily to weight them will be made by a committee of “justice partners,” said Roberts, the courts spokesman.

As the committee spoke about how they hoped for a more individualized risk assessment model, one that took into account the uniqueness of each person’s situation, it made us think of the limitations of scale. How do you build a product that can process thousands of cases and yet still account for the individual? It’s a question most tech companies deal with, but here, the stakes feel higher.
