Design can suffer because equity is not always a priority when our brains make decisions.
In his forthcoming book “Design for Cognitive Bias,” Think Company content strategy advocate David Dylan Thomas examines how a tech-savvy world of tomorrow cannot be truly progressive if new technology carries the biases of its creators.
Thomas points to an Amazon hiring bot that only hired men as an example of sexist bias affecting hiring practices. According to Thomas, if a hiring bot is trained on 10 years of resumes that are mostly men's, its artificial intelligence will keep selecting men for jobs, inadvertently downplaying women's resumes in the process.
“On the surface, it seems a fine way to design, but it ignores facts,” Thomas said. “A thoughtful way to design could say, ‘What if we could design the system in a way to make the bot look at women’s colleges?’ It’s very easy to design something like that to point the AI at the world that we’ve got.”
Thomas says that having checks and balances in your design process can help curb biases and believes that eliminating bias altogether is not as practical as adding balance. By adding what he calls complementary biases to processes, people’s biases can be balanced out by the thoughts of people from other groups, creating a comprehensive exchange of ideas.
Designating a red team and a blue team in an exercise, for example, shows how complementary biases can work: Each team learns the same material separately, and for one day the teams “go to war” with one another. While the red team may have its own bias, the blue team’s bias shows the red team a different perspective. As a result, each team’s biases balance the other’s.
Between frequent public speaking engagements and leading inclusive design workshops for clients, Thomas found time to work on his book thanks to support from his team at Think Company, where the book and his day job draw on similar ideas.
An experience while traveling for Think Company became the impetus for Thomas writing a book about design bias: While attending South By Southwest in 2016, Thomas saw Iris Bohnet give a captivating speech about gender equality by design.
“She laid bare this notion that bias comes down to pattern recognition,” Thomas said. “If a pattern in your brain says a developer has to be a skinny white dude, that’s what you will expect. It’s not that you actually believe the skinny white dude is a better developer, but you will treat him differently than if someone who isn’t a skinny white dude were to apply for the same job.”
For Thomas, steps toward curbing bias are only as good as the system they are in.
“Sadly, there are studies where people of color who ‘whiten’ their resumes get called back twice as often as Black and Asian applicants who don’t,” he said. “If Black people whiten their resumes, they get 2.5 times more callbacks for jobs.” Making resumes anonymous, he said, is a step in the right direction.
Still, Thomas notes that his book, which offers short-term steps for changing how bias affects systems, is a single response to an overall system that has looked down on women and Black people for hundreds of years. Long-term advances in civil rights and systemic norms will take more time and work, he said, before those groups get the same job opportunities as white men.
Michael Butler is a 2020-2022 corps member for Report for America, an initiative of The Groundtruth Project that pairs young journalists with local newsrooms. This position is supported by the Lenfest Institute for Journalism.

-30-