
These Brooklyn artists are exploring the dark side of Big Data

Tega Brain and Surya Mattu, the creators of Unfit Bits, reveal how algorithms can reinforce social inequalities in an upcoming exhibit tour.

Inside The Glass Room, the site of Tega Brain and Surya Mattu's upcoming tour. (Courtesy photo)

For all the inspiring, useful and simply cool things tech has enabled, it’s also brought up some serious social and ethical concerns. At Technical.ly, we’ve covered a few of them: whether the tech boom contributes to gentrification, for instance, and how much of a threat artificial intelligence may pose in the future.
Later this week, Brooklyn-based artist-engineers Tega Brain and Surya Mattu will be leading a free tour that dives into several of these issues. They’ll be at The Glass Room, an exhibit in Nolita sponsored by Mozilla and the Tactical Technology Collective that explores the privacy implications of technology. Its displays include a book of leaked passwords and facial-recognition software that can track churchgoers’ attendance (no, seriously). Brain and Mattu will hold their tour, themed “algorithmic disobedience,” on Friday, Dec. 9 from 3 p.m. to 4:30 p.m.


What exactly is “algorithmic disobedience”? One satirical demonstration of it is Brain and Mattu’s project Unfit Bits, which is on view at The Glass Room. Unfit Bits takes aim at an increasingly popular practice among health insurance providers: offering customers discounts if a fitness tracker shows they exercise regularly. It spoofs the idea by illustrating farcical ways of faking that activity data. No need to head to the gym: just swing a Fitbit on a pendulum and collect your discount.
Jokes aside, Unfit Bits is meant to expose the downside of these types of incentive programs. On the surface, they seem wholly beneficial: Who would argue against a financial incentive to work out more? But that perk comes at the cost of giving up privacy. People are opening up their daily routines to their insurance providers, which could potentially use that information against them — say, if it turns out a customer is too sedentary to qualify for a discount. (For a deeper dive into the topic, check out this 2014 feature from Backchannel.)
There’s another issue at stake, Brain told Technical.ly by email. Whether it’s worth exchanging privacy for potential financial gain depends a lot upon one’s economic status, she pointed out. For those struggling to meet their insurance premiums, the incentive is much more enticing.

“This was such an obvious example of how privacy is linked to privilege and how surveillance or data collection typically happens to the most vulnerable in society,” she wrote.

There are other examples of how Big Data can work against disadvantaged groups. Brain named a couple: racial bias in algorithms that predict the likelihood of criminal activity, and LinkedIn’s propensity to show high-level management jobs more often to men than to women. Both instances, she said, show how susceptible algorithms are to human bias and historical inequities, even though that bias often isn’t apparent. (The notion that these algorithms are neutral is something others in the industry, including Fred Benenson, Kickstarter’s former vice president of data, have spoken out against.)

“This becomes problematic when algorithms are assumed to be neutral or are given more weight than human decision-making processes,” Brain wrote.

For better or worse, though, these algorithms underpin much of our daily activity — so much so that most of us might not even be aware of it. Through their tour, Brain and Mattu seek to expose how issues of inequity pop up in common online activity. It’s also a topic Mattu examines in his writing for ProPublica, where he is a contributing researcher. Mattu was one of the authors of ProPublica’s feature from earlier this fall on what Facebook knows about its users.

(Mattu’s co-authors on that feature, Julia Angwin and Terry Parris Jr., broke another disturbing story regarding Facebook: Advertisers could target campaigns by race and potentially flout federal anti-discrimination regulations. In response, Facebook eliminated the feature.)

That our daily internet and smartphone activity could have such implications is a lot to process, and Brain and Mattu’s tour cheekily acknowledges as much. The event description reads: “For the confused or overwhelmed, this tour offers an hour of group tech therapy.”
