

Sep. 30, 2016 11:57 am

Technologists must do better: Drexel prof on the ethics of algorithms

As Philadelphia tries to make sense of a potentially racist criminal justice algorithm, we asked one expert about who should be responsible.

Kristene Unsworth in her office at Drexel.

(Photo by Juliana Reyes)

Technologists don’t have the luxury of ignoring the ethical considerations behind their work.

Or, at least, they shouldn’t, says Drexel professor Kris Unsworth. And there’s a growing constituency of academics and engineers who agree.

“Science isn’t just this pure thing away from ethics,” she said. “People need to know they go together. We can’t, as technologists, just step back and say, ‘Not my problem.’ We can do better.”

We reached out to Unsworth, whose research focuses on technology and ethics, after she tweeted out our story on a City Council hearing where city leaders struggled to make sense of a new data tool that they worry could reinforce decades of racist criminal justice policies.

At the hearing, the statistician charged with building the tool, Penn’s Richard Berk, told a city committee that the algorithm would involve tradeoffs: If it were the pinnacle of fairness, it would be less accurate.

Meaning: if the tool were blind to factors correlated with race, like ZIP codes, it would make more mistakes.

“In the real world,” Berk said, “we can’t have it all.”
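
To make that tradeoff concrete, here is a toy sketch — not Berk's model, and built on entirely synthetic data with made-up feature names — of how a risk score that ignores a proxy like ZIP code can lose accuracy when the outcome really does track that proxy. It assumes NumPy and scikit-learn are available.

```python
# Toy illustration of the fairness/accuracy tradeoff Berk describes.
# All data is synthetic; "zip_code_risk" and "prior_arrests" are
# hypothetical stand-ins, not features from the city's actual tool.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
zip_code_risk = rng.integers(0, 2, n)            # proxy correlated with race
prior_arrests = rng.poisson(1 + zip_code_risk)   # history also tracks the proxy
noise = rng.normal(0, 1, n)
# Simulated outcome: reoffending depends on both history and the proxy.
reoffend = ((0.8 * zip_code_risk + 0.5 * prior_arrests + noise) > 1.5).astype(int)

X_full = np.column_stack([prior_arrests, zip_code_risk])
X_blind = prior_arrests.reshape(-1, 1)           # same model, proxy removed

for name, X in [("with ZIP proxy", X_full), ("blind to ZIP proxy", X_blind)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, reoffend, random_state=0)
    acc = LogisticRegression().fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name}: accuracy {acc:.3f}")
```

In runs like this, the "blind" model typically scores a bit lower, which is the "we can't have it all" point — though it says nothing about whether the accuracy given up is worth the fairness gained.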

But what struck us the most was the position he took. The decision is in your hands, he told the committee. I’m just making the tool. My work here is done.

Later, when we asked him about this stance in general, he said it was a matter of playing to his expertise. “I have no legal training or any training as an ethicist. And I have no experience in the arm wrestling of politics,” Berk wrote in an email. “All I can say with confidence is that in criminal justice, like in most things, you can’t have it all.”


We wondered about this and the implications for the algorithms that are increasingly ruling our lives. Can technologists, for whatever reason, function outside the realm of ethics? Do they? And what does that mean, then, for these tools? Where does the responsibility lie?

As technology moves toward self-awareness, what moral role should coders play?

Unsworth believes it’s crucial for technologists to take on this burden, to be accountable to their creations and the people whom they affect.

“People who design algorithms have a responsibility to that fairness piece,” she said.

She’s not the only one who’s thinking this way. There’s a growing movement, she said, among technologists who are grappling with what it means to make responsible algorithms. It’s a shift from a more traditional, status-quo industry view. She likened it to how engineers moved to make technology more accessible in the ’90s. Technology, she said, is moving into a more self-aware phase.

In July, Unsworth attended a conference in Germany called Data, Responsibly, co-organized by Drexel computer science professor Julia Stoyanovich. She's part of a working group of academics and industry practitioners that is developing a set of principles for “accountable algorithms.” She's also in the middle of a four-year, $300,000 National Science Foundation grant to study the ethics of algorithms with Drexel professor Kelly Joyce.

Researchers at Haverford College, like computer science professor Sorelle Friedler, and at Penn, whose Fels Policy Research Initiative is hosting a series of talks on fairness and algorithms this semester, are also focused on these issues. Community Legal Services and Philadelphia Legal Assistance also held a panel on the same topic during Philly Tech Week 2016 presented by Comcast.

Meanwhile, in Brooklyn, our sister site reported on former Kickstarter data chief Fred Benenson’s use of the phrase “mathwashing” to describe how mathematical models can “paper over a more subjective reality.”


In D.C., a Georgetown professor named Pablo Molina said technologists must develop stronger codes of conduct.

It’s a topic that’s also rising to national prominence, in part due to a new book by New York City-based author Cathy O’Neil called Weapons of Math Destruction. (She wrote a blog post about what bothered her about the hearing at City Council that we covered.)

OK, so people are thinking about this.

What of solutions to the fairness/accuracy tradeoff?

Because Unsworth, for her part, thinks we can “have it all.”

Unsworth advised creating an algorithm to test the fairness of the city’s forthcoming risk assessment model, which is still early in the production phase — the courts hope to get it running within the next two years. She said it’s important to self-critique these kinds of models. Transparency is crucial, too, she said, adding that judges who use this information should be able to explain how this tool is affecting their decision-making process.
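
As a rough illustration of what that kind of self-critique could look like — this is a hypothetical sketch, not the city's tool; the records, field names and groups below are invented — one common check compares false positive rates across demographic groups:

```python
# Minimal sketch of a fairness audit: compare how often each group is
# flagged high-risk among people who did not go on to reoffend.
# All records and field names are hypothetical illustrations.

def false_positive_rate(records):
    """Share of people who did not reoffend but were still flagged high-risk."""
    non_reoffenders = [r for r in records if not r["reoffended"]]
    if not non_reoffenders:
        return float("nan")
    flagged = sum(1 for r in non_reoffenders if r["flagged_high_risk"])
    return flagged / len(non_reoffenders)

def audit_by_group(records, group_key="group"):
    """False positive rate per group; a large gap means one group is flagged
    far more often for crimes its members did not go on to commit."""
    groups = sorted({r[group_key] for r in records})
    return {g: false_positive_rate([r for r in records if r[group_key] == g])
            for g in groups}

# Hypothetical predictions: both groups reoffend at the same rate,
# but group "B" is flagged high-risk far more often.
predictions = [
    {"group": "A", "reoffended": False, "flagged_high_risk": False},
    {"group": "A", "reoffended": False, "flagged_high_risk": False},
    {"group": "A", "reoffended": True,  "flagged_high_risk": True},
    {"group": "B", "reoffended": False, "flagged_high_risk": True},
    {"group": "B", "reoffended": False, "flagged_high_risk": True},
    {"group": "B", "reoffended": True,  "flagged_high_risk": True},
]
print(audit_by_group(predictions))  # {'A': 0.0, 'B': 1.0}
```

On the toy records above, the audit surfaces exactly the kind of gap a self-check is meant to catch: people in group "B" who never reoffended are flagged every time, while their counterparts in group "A" never are.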

In a broader view, she said that education is important. At Drexel, cybersecurity students are required to take her courses on ethics and information and policy. Might computer science students be next?

Lastly, the public should gain a better understanding of the subjectivity of algorithms. Too often, she said, people think that because it’s science, it’s objective. And, the thinking goes, since it’s objective, it’s morally neutral.

“What many of us are saying is, ‘That’s just not the case,'” she said.

-30-