Civic News

Baltimore and AI: The city’s CDO talks about AI’s everyday applications in government

Justin Elszasz reflects on AI in local government after an Institute for Education-organized chat with international civic and tech leaders.

Attendees at an Institute for Education-organized event on AI. Justin Elszasz, Baltimore's CDO, is in the top right. (Courtesy photo)
Baltimore’s chief data officer, Justin Elszasz, tweeted last Wednesday that he had just attended a fireside chat in DC with New York City’s former CTO, the European Union’s ambassador to the US and other civic and tech leaders. The topic of conversation: the future of artificial intelligence in cities like Baltimore.

The Institute for Education, an organization that routinely holds exclusive events with political and cultural leaders, hosted “Artificial Intelligence in Smartening Up Cities” in the nation’s capital last Tuesday night. Attendees included the aforementioned former New York City CTO, John Paul Farmer, as well as Robin Heilig, head of innovation for the City of Vienna, Austria, and EU ambassador Stavros Lambrinidis.

These days, “AI” is an umbrella term that covers machine learning, which relies on algorithms to make software better at predicting outcomes without being explicitly programmed to do so. The umbrella includes applications ranging from Baltimore police’s controversial ShotSpotter technology, which uses machine learning to detect gunfire throughout the city, to Netflix suggesting what to put in your queue. This isn’t the Skynet-style doomsday tech of the “Terminator” films, but the more mundane, commonplace programs we already use every day. The question instead revolves around the implications of local government using AI as Baltimore tries to be more efficient with its resources, provide more equitable services and protect residents’ privacy.

“We need our residents’ voices to understand where we should be using these types of [AI] tools,” Elszasz told Technical.ly. “ShotSpotter might be the most prominent live example currently, but more opportunities are going to come up.”

A prime use case Elszasz points to is an algorithm to find vacant houses at the point they’re abandoned, before they reach the city-defined vacancy status where they have deteriorated to the point of being unsafe to live in. At that stage, a house can cost thousands of dollars to repair. The algorithm could monitor homes’ water usage and flag a residence that has sat empty long before it falls into that level of disrepair. Given Baltimore’s approximately 15,600 vacant properties, such a tool could have a real impact.
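For illustration only, here is a rough sketch of how such a flag might work. This is not Baltimore’s system; the threshold, lookback window, data format and function names below are all assumptions, and a real tool would need far more nuance.

```python
from datetime import date, timedelta

# Hypothetical parameters for illustration only; the city has not published
# any thresholds for a tool like this.
MIN_GALLONS_PER_DAY = 5    # daily usage below this looks like an unoccupied home
LOOKBACK_DAYS = 90         # how long usage must stay low before flagging

def flag_possible_vacancies(daily_usage_by_address):
    """Return addresses whose daily water use stayed under the threshold
    for the entire lookback window.

    daily_usage_by_address: dict mapping address -> list of (date, gallons) pairs.
    """
    cutoff = date.today() - timedelta(days=LOOKBACK_DAYS)
    flagged = []
    for address, readings in daily_usage_by_address.items():
        # Keep only readings inside the lookback window.
        recent = [gallons for day, gallons in readings if day >= cutoff]
        # Flag only if there is data and even the highest reading stayed low.
        if recent and max(recent) < MIN_GALLONS_PER_DAY:
            flagged.append(address)
    return flagged
```

Even a toy version like this makes the judgment calls visible: whoever sets the threshold and the lookback window decides whose homes get flagged, which is exactly the kind of choice Elszasz argues needs resident input.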

But the privacy concerns that come with that kind of data collection and monitoring are also real. That’s where Elszasz calls for community input and dialogue to find the middle ground between a solution and Big Brother-esque overreach.

And there are already less theoretical applications of AI in Baltimore: the Department of Public Works’ Bureau of Solid Waste uses RUBICONSmartCity routing software to optimize truck routes.

For all the good AI can do, it still learns from us, so it picks up our flaws. There have been papers on the racial bias of AI, including one that led to the controversial departure of a prominent Black researcher at Google, and the American Civil Liberties Union warns of the ways AI can increase inequity. Given these trickle-down effects of tech’s diversity problem, it’s no mere coincidence that the best facial recognition systems misidentify Black people at rates five to 10 times higher than they do white people.

The issues with facial recognition software point to potential problems with a vacancy algorithm built on water usage in a predominantly Black city: skewed data could, for instance, flag Black residents’ homes as vacant when the occupants are simply on vacation. That’s a hypothetical problem, but it’s one Elszasz wants to talk through.

“You can’t talk about AI without talking about the full data life cycle,” Elszasz said. “Where the data came from, what’s been done to it, what bias it has. They’re all entangled, and that’s why we need to have these conversations because city government can’t/won’t get it right every time. It’s just too complex. I’m one person in a position of power with my privileges. The idea of me making decisions about what algorithms we should or shouldn’t use is wrong. Especially for Baltimore.”

The fireside chat put the conversation about AI and city government in perspective for Elszasz. It made him realize that the historic redlining and disinvestment in Baltimore neighborhoods, captured in the concept of the Black Butterfly and the White L, cannot be separated from anything the city does with data and algorithms. That systemic racism always shows up in Baltimore’s data, and it can’t be avoided.

“There are opportunities to be had [with AI] but we need to do it responsibly and equitably,” Elszasz said.

Donte Kirby is a 2020-2022 corps member for Report for America, an initiative of The Groundtruth Project that pairs young journalists with local newsrooms. This position is supported by the Robert W. Deutsch Foundation.