
How can municipal gov better use public algorithms? Pitt Cyber has some ideas

"We should want governments to use data, that's a good thing," Executive Director Beth Schwanke said of the institute's new report with recommendations for ethical use. "We just want to be doing it the right way."

The Cathedral of Learning at the University of Pittsburgh. (Photo by Flickr user Tony Webster, used via a Creative Commons license)
Pittsburgh’s municipalities have used public algorithmic systems to inform decisions on matters ranging from children’s safety to policing. A task force from a local cybersecurity institute wants to make sure that use is ethical.

The University of Pittsburgh’s Institute for Cyber Law, Policy and Security just published the first report from its Pittsburgh Task Force on Public Algorithms. Formed in 2020, the task force spent the last two years collecting data and other relevant information on civic use of algorithms. The goal of the research and culminating report is to better understand the oversight processes for these algorithms, and how to craft that oversight in ways that don’t perpetuate bias in government policy.

Members of the task force included several faculty from Pitt Cyber and the wider university, as well as representatives from local institutions like the Urban League of Greater Pittsburgh, ACLU of Pennsylvania and A Second Chance. The report, which was financially supported by The Heinz Endowments and the Hillman Foundation, also featured input from other universities across the country, including the University of California San Diego, the UCLA School of Law, the Rutgers University School of Law and Carnegie Mellon University.

The report included input from a government advisory panel designated by former mayor Bill Peduto and County Executive Rich Fitzgerald, which had representatives from the City of Pittsburgh’s Department of Innovation and Performance, the Pittsburgh Bureau of Police, the Allegheny County Department of Human Services and more. An appendix at the end of the report summarizes community feedback and engagement.

“There’s been so much attention on algorithmic bias writ large, and no one was really looking at it at the time in the context of what municipal governments are doing,” Pitt Cyber Executive Director Beth Schwanke told Technical.ly. “There was not as much attention to what was going on, on the local side of things, and that has really direct impacts and consequences on people’s everyday lives.”


The report notes that its authors approached the research with a few “core understandings.” Per the report’s language:

  • Government use of algorithms can bring many concrete benefits.
  • Those same algorithms can carry risks that can and should be guarded against.
  • Government acceptance and use of algorithms will likely increase in the coming years.

Spanning 44 pages, the report first outlines the general definition of a public algorithmic system and specifically how such systems are used in the Pittsburgh region, before diving into the problems that can arise with them and recommendations for preventing those problems.

According to the task force, a public algorithmic system has a definition broader than algorithms or the applied use of artificial intelligence, automated decision systems and the like. Instead, it is “any system, software or process that uses computation” such as AI techniques, machine learning or data processing “to aid or replace government decisions, judgements and/or policy implementations,” with a particular focus on those that involve access to opportunities, safety or other rights.

That broad definition covers a wide range of systems, even some that might be low-risk, the report pointed out. So instead of detailing the potential problems with each, it highlights two specific systems used by the county and the city: the Allegheny Family Screening Tool (AFST) and the City of Pittsburgh’s suspended predictive policing system. The former is a predictive risk modeling tool meant to increase efficiency in screening allegations of child maltreatment, while the latter used past crime reports and 911 calls to identify potential “hot spots” for future crime.
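Neither system’s full internals are public, but the general shape of a hot-spot model like the suspended policing tool is easy to sketch. Below is a minimal, hypothetical Python illustration, not the city’s actual system: it bins historical incident locations into a grid and flags the cells with the most past reports. The function names, grid size and top-N cutoff are all assumptions made purely for illustration.

```python
# Hypothetical sketch of a grid-based "hot spot" predictor.
# This is NOT Pittsburgh's actual system; the grid size and
# cutoff below are illustrative assumptions only.
from collections import Counter

def bin_to_cell(lat, lon, cell_size=0.01):
    # Snap a coordinate to a grid cell roughly 1 km square.
    return (round(lat / cell_size), round(lon / cell_size))

def predict_hot_spots(past_incidents, top_n=5):
    # past_incidents: (lat, lon) pairs from prior crime
    # reports and 911 calls. Returns the top_n busiest cells.
    counts = Counter(bin_to_cell(lat, lon) for lat, lon in past_incidents)
    return counts.most_common(top_n)

# Three reports clustered in one cell, one report elsewhere.
reports = [(40.441, -79.996), (40.442, -79.995),
           (40.440, -79.997), (40.460, -79.920)]
print(predict_hot_spots(reports, top_n=2))
```

Even this toy version makes the report’s concern concrete: the model simply reflects wherever past reports were concentrated, so biased or non-representative historical data flows straight into future “predictions.”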

While neither system is or was perfect, the report illustrated the vast difference in evaluation and transparency between them: the AFST underwent public input and external reviews, while the predictive policing tool was developed with almost no transparency. That opacity, combined with problems like error rates, biased data, non-representative data, lack of public ownership and overweighted outcomes, is what can make public algorithmic systems a potential threat to civil rights.

“I think what happens is a lot of government agencies are just procuring algorithms off the shelf, and people aren’t being trained in the right ways,” Schwanke said. “Instead of being a tool to our government, they’re becoming a tool that is relied on in an unsafe way.” Nonetheless, there is still a lot of value in public algorithmic systems, she argued: “We should want governments to use data, that’s a good thing. We just want to be doing it the right way.”

To address the risks of public algorithms in both current and future systems, the report outlined seven recommendations for regional governments. They include encouraging public participation, involving the public in system development plans, using third-party reviews for high-risk systems, integrating reviews into system procurement processes, openly publishing system information and avoiding biometric algorithmic systems like facial recognition. The seventh recommendation is to continually evaluate the effectiveness of the other six on a case-by-case basis.

As for whether Mayor Ed Gainey’s administration will continue the previous administration’s involvement with and support for open data and tech transparency, Schwanke said she couldn’t speak to that. But she noted that Gainey attended the task force’s community meetings while he was a state representative, and that the city’s Data Governance Committee is looking at ways to implement the report’s recommendations.

Concern over the use of public algorithmic systems isn’t new; similar questions about algorithmic bias have been raised in Philadelphia. The topic has also been the foundation of several books and a growing body of research on the increasing use of data and AI-driven systems by public and private entities, most notably Cathy O’Neil’s “Weapons of Math Destruction.”

But the report concludes on a note of public involvement, placing the onus for improvement not only on governing bodies but on the people they govern, too. Mitigation alone, the report argued, won’t be enough to fully resolve the systemic issues these tools can carry.

“Success will depend on the depth of public participation,” the report said of its recommendations. “You do not need to be a technical expert to participate. Public deliberation and engagement are crucial in determining whether an algorithm is an appropriate tool in a particular context and whether appropriate conditions for its use have been met. You are an expert on the needs of you, your family, and your community.”

Sophie Burkholder is a 2021-2022 corps member for Report for America, an initiative of The GroundTruth Project that pairs young journalists with local newsrooms. This position is supported by the Heinz Endowments.