
Why diversity in artificial intelligence development matters

At the HUE Tech Summit, technologists of color discussed the dangers of non-inclusive AI — and what we can do about it.

The AI panel at HUE Tech Summit. (Photo by Holly Quinn)
There is a flawed but common notion when it comes to artificial intelligence: Machines are neutral — they have no bias.

In reality, machines are a lot more like people: If they're taught implicit bias by being fed non-inclusive data, they'll behave with bias.

This was the topic of the panel “Artificial Intelligence: Calculating Our Culture” at the HUE Tech Summit on day one of Philly Tech Week presented by Comcast. (Here’s why it matters that Philly has a conference for women of color in tech.)

“In Silicon Valley, Black people make up only about 2% [of technologists],” said Asia Rawls, director of software and education for the Chicago-based intelligent software company Reveal. “When Google [Photos] labeled Black people as apes, that’s not the algorithm. It’s you. The root is people. Tech is ‘neutral,’ but we define it.”

“Machine learning learns not by itself, but by our data,” said moderator Annalisa Nash Fernandez, intercultural strategist for Because Culture. “We feed it data. We’re feeding it flawed data.”

Often, the flaw is that the data isn’t inclusive. When developers assume their technology will respond to dark skin the same way it responds to light skin, they’re assuming a neutrality that doesn’t actually exist. The result: an automated soap dispenser that doesn’t sense dark skin.

“Implicit bias in computer vision technology means that cameras that don’t see dark skin are in Teslas, telling them whether to stop or not,” said Ayodele Odubela, founder of fullyConnected, an education platform for underrepresented people.

If there’s a positive note, panelists said, it’s that companies are learning to expand their data sets when a lack of diversity in their product development becomes apparent.

AI can expose bias, too. Odubela works with Astral AR, a Texas-based company that’s part of FEMA’s Emerging Technology Initiative. The company builds drones designed to intervene when someone, including a police officer, pulls a gun on an unarmed person, and to stop the bullet if one is fired.

“It can identify a weapon versus a non-weapon and will deescalate a situation regardless of who is escalating,” Odubela said.

What can be done now to make AI and machine learning less biased? More people from underrepresented groups are needed in tech. But even if you don’t work in AI, or in tech at all, there’s one ridiculously simple thing you can do to help diversify the datasets: Take those surveys when they pop up on your screen asking for feedback about a company or digital product.

“Take a survey, hit the chatbot,” said Amanda McIntyre Chavis, New York ambassador of Women of Wearables. “They need the data analytics.”

“People don’t respond to those surveys, then they complain,” said Rawls. “I always respond, and I’ll go off in the comments.”

Ultimately, if our machines are going to be truly unbiased anytime soon, there needs to be an understanding that humans are biased, even when they don’t mean to be.

“We need to get to a place where we can talk about racism,” said Rawls.

If we don’t, eventually the machines will probably be the ones to bring it up.
