
Civic

Sep. 16, 2016 12:59 pm

Philadelphia is grappling with the prospect of a racist computer algorithm

Should it be fair? Should it be accurate? Or can it be either? A hearing at City Council this week tried to make sense of the tradeoffs of a new tool that will help judges decide who to keep in jail.

The Special Committee on Criminal Justice Reform’s hearing on reducing the pre-trial jail population.

(Photo by Juliana Reyes)

After talk of algorithms, “uncomfortable predictors” and training data, William Cobb steered away from the details and asked the question that he and his fellow committee members had danced around for the last half hour.

The committee, a task force focused on criminal justice reform, was discussing “risk assessment tools” — computer programs designed to predict the likelihood that someone who’s been arrested will get arrested again. Advocates have argued that these tools could unfairly target certain groups, namely Black and brown men who live in high-crime areas.

That’s why Cobb, who founded a re-entry organization called Redeemed PA, began by describing himself: I’m a Black man, he said. I live in a poor neighborhood. I’ve been arrested before.

“As a Black male,” Cobb asked Penn statistician and resident expert Richard Berk, “should I be afraid of risk assessment tools?”

“No,” Berk said, without skipping a beat. “You gotta tell me a lot more about yourself. … At what age were you first arrested? What is the date of your most recent crime? What are you charged with?”

Cobb interjected: “And what’s the culture of policing in the neighborhood in which I grew up in?”

It was that point — the point that a legacy of racist policing and policies would inform the data behind the risk assessment model — that the committee seemed to grapple with the most.
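Cobb’s point has a concrete statistical shape. A model like this may never see race directly, but it trains on arrest records, and arrests reflect where police concentrate as well as what people do. The toy simulation below (synthetic data, invented numbers, in no way the city’s actual model or data) sketches how that plays out: underlying behavior is identical everywhere by construction, yet a model trained on arrest-based labels learns to score the heavily policed neighborhood as higher risk.

```python
# Toy simulation: identical underlying behavior in both neighborhoods, but
# arrests -- the label the model trains on -- are recorded more often where
# policing is concentrated. All numbers are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

heavily_policed = rng.integers(0, 2, n)  # 1 = heavily policed neighborhood
prior_arrests = rng.poisson(1.0, n)      # hypothetical prior-record feature

offended = rng.random(n) < 0.2                       # same rate everywhere
p_arrest = np.where(heavily_policed == 1, 0.9, 0.4)  # unequal policing
rearrested = offended & (rng.random(n) < p_arrest)   # the training label

model = LogisticRegression().fit(
    np.column_stack([heavily_policed, prior_arrests]), rearrested)

# The neighborhood coefficient comes out strongly positive even though conduct
# was identical by construction: the model has learned the policing pattern.
print(dict(zip(["heavily_policed", "prior_arrests"], model.coef_[0].round(2))))
```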

"If we strive for fairness, unfortunately, we're going to lose some accuracy."
Richard Berk, University of Pennsylvania

The forthcoming risk assessment model is one way the city is trying to lower its jail population, as part of a $6 million project backed by a $3.5 million MacArthur Foundation grant. The gist is that Philly needs to do a better, smarter job of releasing those who are arrested, instead of keeping them in jail for months before their trial (the average Philly inmate spends three months in jail awaiting trial, four times the national average, according to a NewsWorks report). A risk assessment model aims to give judges a data-driven way to inform their decisions on how high to set bail and who to release.
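To make the mechanics concrete, here is a minimal sketch of the kind of calculation such a tool might surface to a judge. The inputs echo the questions Berk posed at the hearing; the logistic-regression weights are invented for illustration, since the real model’s factors and weights have yet to be decided.

```python
import math

def risk_score(age_at_first_arrest: int, years_since_last_arrest: float,
               prior_arrests: int, violent_charge: bool) -> float:
    """Return an illustrative probability of re-arrest between 0 and 1."""
    # Hypothetical weights -- not the model Berk is building for the courts.
    z = (-1.5
         - 0.05 * (age_at_first_arrest - 18)  # later first arrest -> lower risk
         - 0.10 * years_since_last_arrest     # longer gap -> lower risk
         + 0.30 * prior_arrests               # more priors -> higher risk
         + 0.80 * (1 if violent_charge else 0))
    return 1 / (1 + math.exp(-z))

# First arrested at 16, last arrested six months ago, three priors, current
# charge nonviolent: the sketch outputs a re-arrest probability of about 0.37.
print(round(risk_score(16, 0.5, 3, False), 2))
```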


The city has budgeted $100,000 for the tool, to be built by Berk, who developed a similar tool for the city’s Adult Probation and Parole Department. It’s still early in the process: the courts aim to have the tool up and running within the next two years, said courts spokesman Gabriel Roberts.

Read these pieces in City & State PA and NewsWorks for a more in-depth look at the topic.


“If we strive for fairness, unfortunately, we’re going to lose some accuracy,” said Richard Berk. (Photo by Juliana Reyes)

Everyone seems to agree that the city’s current system is broken and that giving judges more data to consider is better than none. The question now is how to implement this solution.

Berk, whom committee co-chair Councilman Curtis Jones, Jr. repeatedly called a “rock star,” has been frank about the realities of the system.

“If we strive for fairness, unfortunately, we’re going to lose some accuracy,” he said at the Council hearing. “That’s a price. It’s a price you may choose to pay.”

At the hearing, he played the role of the academic removed from ethics. On questions of whether to include certain data points, he repeatedly told the committee that those weren’t his decisions to make.

“It’s not something I can decide for you,” he said.

This was an interesting role for him to play, given that some of the committee members, like Cobb and committee co-chair Keir Bradford-Gray, yearned for Berk to give them answers about the morality of the tool.

“In criminal justice, we have to weigh fairness against other things. Isn’t that the tenet of the criminal justice system, to be fundamentally fair?” said Bradford-Gray, chief defender of the Defender Association of Philadelphia. “Maybe I’m an idealist … but I think that’s what a criminal justice system should have at its core or else we’re going to continue to get these same systemic, oppressive types of policies that we have for certain people that we don’t have for others. That’s where we are right now.”

To which Berk responded: “I have no disagreement with that whatsoever. All I’m saying is, in the real world, we can’t have it all.”
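Berk’s “we can’t have it all” is not rhetoric; it falls out of the arithmetic. In the toy example below (all data synthetic and invented, not Philadelphia’s), each person carries an individual re-arrest probability, and one group’s risk distribution sits higher, as it would if its records were shaped by heavier policing. Flagging everyone above 50 percent risk maximizes accuracy; forcing the tool to flag both groups at the same rate means detaining lower-risk people in one group and releasing higher-risk people in the other.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
group = rng.integers(0, 2, n)

# Individual re-arrest probabilities; group 1's distribution sits higher.
risk = np.clip(rng.normal(np.where(group == 1, 0.55, 0.35), 0.15, n), 0, 1)
rearrested = rng.random(n) < risk

# Accuracy-first rule: flag anyone whose estimated risk exceeds 50%.
accurate = risk > 0.5

# Fairness-first rule: per-group thresholds chosen so both groups are
# flagged at the same overall rate.
flag_rate = accurate.mean()
t0 = np.quantile(risk[group == 0], 1 - flag_rate)
t1 = np.quantile(risk[group == 1], 1 - flag_rate)
fair = np.where(group == 1, risk > t1, risk > t0)

for name, pred in [("accuracy-first", accurate), ("equal flag rates", fair)]:
    rates = [round(pred[group == g].mean(), 2) for g in (0, 1)]
    print(f"{name}: accuracy {(pred == rearrested).mean():.3f}, "
          f"flag rate by group {rates}")
```

On this synthetic data, the equal-rates rule treats the two groups identically but misclassifies more people overall, which is exactly the price Berk says the city may choose to pay.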

The eventual decision on which factors to include and how heavily to weight them will be made by a committee of “justice partners,” said Roberts, the courts spokesman.

As the committee spoke about how they hoped for a more individualized risk assessment model, one that took into account the uniqueness of each person’s situation, it made us think of the limitations of scale. How do you build a product that can process thousands of cases and yet still account for the individual? It’s a question most tech companies deal with, but here, the stakes feel higher.

-30-
Juliana Reyes

Juliana Reyes became Technical.ly's associate editor after reporting on the Philadelphia tech scene for four years. She's co-president of the Asian American Journalists Association Philadelphia chapter and a two-time Philadelphia News Award winner for "Community Reporting of the Year." The Bryn Mawr College grad lives in West Philly, likes her food spicy and wears jumpsuits often.

  • L. Kendrick

    Discussion during Philly Tech Week 2016: Bias, data, and modelling
    https://www.youtube.com/watch?v=Ubqz8CXX7vY

    Weapons of Math Destruction by Cathy O’Neil
    https://www.amazon.co.uk/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815
    And, O’Neil on YouTube
    https://www.youtube.com/watch?v=cK87rN4xpqA

  • L. Kendrick

    Also, “The Intersection of Poverty and Data, Part 4”
    https://www.youtube.com/watch?v=AuhAoyNKyKU

  • Someone22

When it’s the truth, even though it discriminates against or for one group based on previous experience or averages, it’s not racist. Numbers don’t lie. 11% commits 40% of the crime. Fact.

    • L. Kendrick

      The people who input the data that produce the numbers often lie (and are, often, incompetent and dishonest). https://www.amazon.com/Damned-Lies-Statistics-Untangling-Politicians/dp/0520219783

      • Someone22

An Amazon link about refuting statistics doesn’t refute fact. And, surely doesn’t refute the daily news and violence committed by this minority. Once again, it’s facts not feelings that are what reality is based upon. When a culture or group finds violence and crime acceptable there’s only one way to fix it. Get them away from the public.

  • Zoe Wyse

Basing decisions about how to treat people on things like their race or their gender would be very concerning. These are factors that people don’t control. They do not define a person’s character. If we based our decisions on how to treat people on things like whether someone had brown hair or blond hair, or on what type of car they chose to drive, or on what types of movies they preferred to watch, we would find this very concerning, regardless of whether a predictive tool might be able to come up with some correlation one way or the other. It just isn’t right.

Statistics and data are very helpful in allowing us to evaluate whether programs are helpful and constructive. For example, if a community works hard to design a program to support people dealing with addiction, but the data shows that the vast majority of people in the program hate it, they don’t tend to get better, and the majority prefer other ways of working on their addiction issues, that would be useful information that could lead to better programs. However, when statistics and data are used to make decisions about people based on attributes that they have absolutely no control over and which in no way describe their hearts, their compassion, their integrity, or their character, that is concerning.

    I am not familiar with this statistical tool, and so I don’t want to pass judgment without understanding the facts of the tool. Perhaps items like race and gender are not incorporated as factors at all in making predictions with this tool. In any case, the tool is not the fault of whoever designed it. The importance of acting responsibly lies with people who decide whether to use a tool for decision-making based on whether it is ethical to do so.

    I think we should be using statistics and data more and more to evaluate programs and come up with constructive ways to be of service to people and help make communities safer and more pleasant for everyone. But we should all be very careful not to use statistics to perpetuate negative stereotypes.
