
This Penn professor’s geospatial tool predicts where child maltreatment will happen

Ken Steif, the director of the University of Pennsylvania’s Master of Urban Spatial Analytics program, has been working with Texas nonprofit Predict Align Prevent on a framework that considers privacy and proximity to community resources.

The difference between proactive and reactive resource allocation may be a life.

For about half of the children who die from maltreatment, child welfare agencies aren’t aware there’s a problem until a fatality is reported, estimated Dyann Daley, the founder of the Texas nonprofit Predict Align Prevent (PAP). In Richmond, Va., for instance, 34 children died of maltreatment between 2013 and 2017.

To prevent those deaths rather than react to them, the Richmond Department of Social Services contracted Daley and Ken Steif of Philly-based civic tech consulting org Urban Spatial to create a geospatial data tool that predicts where child maltreatment is most likely to take place. The code is public on GitHub so others can use it as a blueprint.

“The goal of putting it on open source is that cities, civic technologists, nonprofits, even businesses, can come and take this data, can make it their own and can add to it,” said Steif, who is the director of the University of Pennsylvania’s Master of Urban Spatial Analytics program. “We believe it’s approachable and sponsors a community of best practices around a really important problem that cities have faced and will continue to face in the future.”

The project analyzes 6,500 confirmed cases of child maltreatment in Richmond over four years and, based on the factors that led to those cases, predicts where else it might happen, Steif said.

Past reports of child maltreatment, domestic violence, aggravated assaults, teenage runaways and narcotics are the most likely indicators of future abuse, said Daley, who has worked as a pediatric anesthesiologist for 11 years.
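For readers curious about the mechanics, here is a minimal Python sketch of that kind of grid-based count model. It is illustrative only: the file name, column names, cell size and the Poisson regression are assumptions made for the example, not the project’s actual code or schema.

```python
import pandas as pd
from sklearn.linear_model import PoissonRegressor

# Assumed cell size, in degrees: "bigger than a house but smaller than a neighborhood"
CELL = 0.005

# events.csv (assumed): one row per reported incident, with lat, lon and an
# incident "type" column covering both the risk factors and confirmed cases.
events = pd.read_csv("events.csv")
events["cell_x"] = (events["lon"] // CELL).astype(int)
events["cell_y"] = (events["lat"] // CELL).astype(int)

# Count each incident type per grid cell: risk-factor counts become the
# predictors, confirmed maltreatment cases become the outcome.
counts = (events.groupby(["cell_x", "cell_y", "type"])
                .size()
                .unstack(fill_value=0))

features = ["domestic_violence", "aggravated_assault", "teen_runaway", "narcotics"]
X = counts.reindex(columns=features, fill_value=0)
y = counts["maltreatment"]  # assumes confirmed cases appear as this type

# A Poisson regression is a natural first choice for modeling event counts
# on a grid; the real project may well use a different model.
model = PoissonRegressor(alpha=1.0).fit(X, y)
counts["predicted_risk"] = model.predict(X)
```

On real data, cross-validation and richer spatial features (distances to each risk factor rather than raw counts, for instance) would matter; the point here is only the shape of the pipeline.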

Once potential cases were mapped, Steif and Daley looked at how close they were to community resources that could prevent childhood maltreatment, like churches, child care facilities and schools. Then, they pointed to gaps where more proactive efforts could be set up.
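The gap analysis can be sketched the same way, continuing from the example above. Again, resources.csv, the distance threshold and the top-decile cutoff are illustrative assumptions, not the project’s actual parameters.

```python
from scipy.spatial import cKDTree

# resources.csv (assumed): churches, child care facilities, schools, etc.,
# each with a lat/lon location.
resources = pd.read_csv("resources.csv")
tree = cKDTree(resources[["lon", "lat"]].to_numpy())

cells = counts.reset_index()
cells["lon"] = (cells["cell_x"] + 0.5) * CELL  # cell centroid
cells["lat"] = (cells["cell_y"] + 0.5) * CELL

# Distance from each cell centroid to its nearest resource (in degrees here;
# a real analysis would project coordinates to meters first).
dist, _ = tree.query(cells[["lon", "lat"]].to_numpy())
cells["nearest_resource"] = dist

# "Gaps": cells in the top risk decile with no resource close by.
top_decile = cells["predicted_risk"] >= cells["predicted_risk"].quantile(0.9)
gaps = cells[top_decile & (cells["nearest_resource"] > 0.01)]
```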

“Knowing very specifically in a city where these things will occur allows for the prevention efforts that people have trying to reduce fatalities or trying to reduce kids going into [foster care to] get to those places and provide the services before there’s a serious, more fatal event,” Daley said.

Pinpointing where maltreatment will likely happen allows for improved services, but it raises the question of privacy. Steif said the scale of the project’s data is “bigger than a house but smaller than a neighborhood,” so no individual is identified.

(Other predictive models currently in use, such as one in Pittsburgh, rely on private data to predict risk at the household level. And Philly spent eight years developing a risk-assessment tool deemed potentially racist; as of December, it had not yet been deployed, and lawmakers and activists were calling for a discontinuation of the project.)

PAP and Urban Spatial also had to consider how the data they drew from could be skewed by selection bias, which is “something about a place that breeds maltreatment or something about the kind of person that has a higher propensity to commit maltreatment sorting into a particular place,” Steif said.

In tests of the model, though, Steif said it produced comparable results in poor versus rich neighborhoods. It did find more risk of maltreatment in majority-minority areas than in white ones, and Steif said his team believes that’s likely not because police or agencies are targeting the neighborhoods where maltreatment is reported at the highest rates.
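A rough version of that kind of check compares the model’s accuracy across neighborhood groups, continuing the sketch above. The demographics file and its columns are assumptions for illustration, not the team’s actual validation code.

```python
# cell_demographics.csv (assumed): cell_x, cell_y, neighborhood_group
demo = pd.read_csv("cell_demographics.csv")
cells = cells.merge(demo, on=["cell_x", "cell_y"], how="left")

# If mean absolute error is similar across groups, the model is roughly as
# accurate in one kind of neighborhood as in another.
cells["abs_error"] = (cells["predicted_risk"] - cells["maltreatment"]).abs()
print(cells.groupby("neighborhood_group")["abs_error"].mean())
```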

“Ultimately, what this means is that more … human resources should be deployed in those minority neighborhoods,” he said.

An independent ethical evaluation of the project by the University of Auckland reminded Daley and Steif of possible consequences of their work, such as prompting heightened surveillance in certain communities, privacy breaches and stigmatization. But the project’s benefits outweigh its risks, according to the evaluation.

The team is not sure if the tool will be replicated beyond Richmond, but its open source code makes that the hope. For Steif, this project’s benefit is the crux of all of the work he does at Urban Spatial: providing cities with data analysis so they can make better decisions about how to serve people.

“It’s not just the development of an algorithm,” Steif said. “I mean, yes, it’s an algorithm that we’re very proud of. But more importantly … we can hand this over to someone who doesn’t have the technical machine learning or programming skills, and they can actually use the predictions to make better day-to-day operational decisions.”
