
Allegheny County blocks generative AI on its computers as it shapes up its approach to the tech

In Pittsburgh, an "agency-by-agency" approach to tools like ChatGPT has created a patchwork system of guidelines and regulations.

Allegheny County is still deciding how AI fits in with governing. (Photo illustration by Natasha Vicens/PublicSource)

In June of 2020, then-Mayor Bill Peduto sent a letter to the Pittsburgh Task Force on Public Algorithms thanking the group for cautioning against an algorithmic system used to predict areas likely to have high levels of ongoing crime. He wrote that the system could “exacerbate implicit bias and racial injustices in our communities,” and noted that the program had been indefinitely suspended. 

Such algorithms fall into the bucket of artificial intelligence, a type of technology that’s gripped the world’s attention since Peduto left office at the end of 2021. Now, in addition to algorithms, which for years have been used in both the public and private sectors to review large data sets, the advent of generative AI poses new risks.

In recent years, tech companies have released programs such as ChatGPT, DALL-E and Sora, which use large-scale training based on existing content to generate text, images and video in response to user prompts. These programs have prompted ethical debate regarding plagiarism, copyright, factual accuracy, privacy and more.

Government use of generative AI comes with its own risks, including the possibility of convincing fake images that could erode public trust. Experts, meanwhile, worry officials haven't properly regulated the algorithmic tools that have been around for years.

Public-facing guidelines could reduce algorithmic bias in government AI use

Both Allegheny County and the City of Pittsburgh have taken action to regulate their use of AI technology.

For the county, it's a work in progress starting with a pause on ChatGPT and similar programs; for the city, it involves creating internal guidelines informed by both Pitt Cyber's research and a national coalition of municipal governments.

Some cities across the country have made their guidelines public. Many of these guidelines focus solely on generative AI technologies.

Ethical discussion of AI shouldn’t ignore the risks posed by algorithms as the public focus shifts toward generative tools, said Beth Schwanke, executive director of the University of Pittsburgh’s Institute for Cyber Law, Policy and Security. 

Beth Schwanke, executive director of the University of Pittsburgh’s Institute for Cyber Law, Policy and Security, works in a computing lab in the Barco Law Building in Oakland on July 12. (Photo by Jess Daninhirsch/PublicSource)

In addition to the city's past use of algorithms in policing, the county's Department of Human Services has used an algorithmic system, the Allegheny Family Screening Tool, to inform responses to phone calls reporting alleged child abuse. The tool has received heavy public scrutiny.

Schwanke stressed that local government should be public about how it uses AI. 

“Having some sort of public, published guidance also helps residents and journalists and others look at how the city or the county, our local governments, are approaching AI and reflect on that and ask questions,” Schwanke said. 

The county’s IT department and the CountyStat division of the county manager’s office recently created an “AI Governance Working Group” to develop generative AI policy, according to county communications director Abigail Gardner. This policy remains in development and will precede additional efforts to issue guidelines on other forms of AI technology. In the interim, the county has blocked generative AI programs on its computers. 

The city, however, is further along with its approach to guidance for AI usage. 

Pittsburgh bans gen AI from image creation, sourcing facts and more

The city has joined the Government AI Coalition, formed by San Jose, California, a city known for its association with Silicon Valley.

This national coalition of more than 250 municipal agencies collaboratively creates and administers policies for AI use in government. The coalition offers a template, which agencies can adopt and edit as they see fit. The City of Pittsburgh has created its own policy document informed by the coalition and distributes it internally to employees. 

The city also looks to the Pitt-based Pittsburgh Task Force on Public Algorithms for guidance. 

Heidi Norman, the city's performance and innovation director, and Chris Belasco, the department's manager of data services, said the city enforces a few clear prohibitions on AI technology: It's never used to create images or video, it's never used to source factual information and it never takes the place of human decision-making.

City employees sometimes use generative AI programs such as ChatGPT to summarize or clarify information, and it’s disclosed internally when a generative AI program assisted in any writing, according to Belasco. To Norman and Belasco’s knowledge, generative AI has not yet assisted in the writing of any public-facing documents, but disclosure would be expected. 

“We don’t want people using it as a source of knowledge. It can help us [by] summarizing information or clarifying information but … we don’t want users asking it for facts, given the risks of AI hallucinations,” Belasco said, referring to generative AI’s propensity for sometimes offering incorrect information. 

The city found the Pitt Task Force's recommendations on algorithms, outlined in a 2022 report, helpful, according to Norman. The recommendations called for more information and interaction with the public about algorithms, avoiding facial recognition (something Pittsburgh police officers have illegally used in the past) and reviewing third-party programs that use algorithms.

The city’s internal policy regarding algorithms focuses most on probing the companies that provide the algorithm software to ensure it’s clear what the algorithm does and how it works, according to Norman. Some city departments, including Innovation and Performance, use algorithms that review and recommend certain applicants in a job search, Norman said. 

“Those kinds of algorithms can be very detrimental if the organization does not understand what inherent biases might be embedded within the algorithm, and so it’s imperative for us to really understand, are we using those algorithms or not?” Norman said. “And if we’re going to use them, what exactly do they do so that we can determine what the risk level would be for an inappropriate type of recommendation or suggestion resulting from the algorithm.” 

So far, Norman doesn't believe the past few years of progress in AI technology have manifested in Pittsburgh as anything truly exciting or revelatory. Still, the technology can help in some ways now, and perhaps much more in the future.

“The types of things that I think we do get excited about for the future are repetitive tasks that result in fatigue and errors by people — replacing that with computer systems, sort of like we did from calculators to spreadsheets,” Norman said. “Those types of things I think are going to potentially make a lot more of our human talent available for higher-order thinking and processing.”

The region still lacks transparency and consistency across departments

The city arranged an interview between PublicSource and representatives of its Innovation and Performance department to describe its policies regarding generative and algorithmic AI, but it does not make those policies publicly available and declined to provide them to PublicSource. The county has not yet decided if future policies will be made public, according to its spokesperson. 

Schwanke commends the city and county for developing guidance while urging greater transparency from both.

Beth Schwanke works in a computing lab in the Barco Law Building on July 12. (Photo by Jess Daninhirsch/PublicSource)

As local governments develop policies, Schwanke said, they should consider establishing consistency across departments. The task force's most recent report warned of wide-ranging differences emerging between local municipal departments, she said.

“There was kind of an agency-by-agency approach,” Schwanke said. “And some agencies were really approaching things transparently, thinking about accountability, understood risks in bias, etcetera. And I think there were others that needed, at the time, and perhaps still, to do some work and thinking about that.”

Pitt Cyber has been researching and raising awareness of AI technology since well before the current generative AI boom, and Schwanke maintains much of the focus ought to stay there.

“There’s been a huge focus on gen-AI by a lot of municipalities,” Schwanke said, “and we also need to be thinking about the other types of AI that are already impacting residents’ lives in real ways.” 
