How artists are hacking bias in algorithms

Missing Data Sets | © Mimi Onuoha

Algorithms and artificial intelligence are used to hire and fire staff, admit students to university and even decide on prison sentences. While these algorithms save time, they can be unfair or discriminatory. Artists are taking up the fight against AI’s coded bias and exposing some of its pitfalls.

Barbara Gruber

Whether it’s Google Translate arbitrarily assigning genders to certain professions, Amazon’s recruiting software sifting through resumes and favouring male candidates, or facial recognition software misidentifying Black faces: there are many examples of bias generated by artificial intelligence.
 
Most AI systems reflect the characteristics of the dominant voice in their code and in the data they learn from, and that voice is overwhelmingly male and white.
 
Curator and founder of AIArtists.org Marnie Benney says that while AI might be designed with good intentions, a lack of diversity in the data sets and in the people creating the technology systematises discrimination entrenched in our societies.
 
Benney launched the platform in 2019 to create a global community of artists exploring the creative possibilities and challenges of AI. She argues that a wide variety of experiences and perspectives is essential for understanding how humans are entering the age of intelligent machines.
 
“We need artists, poets, musicians and philosophers around the world to channel their creativity and help investigate these new tools,” she says. “We need queer, gay, trans, straight, fluid people thinking about it.”
 
Only about 10% of professionals working in the field of AI identify as women, and when it comes to AI and art there are similar disparities. This, says Benney, is why promoting diversity and inclusion as a curator and as a technologist is so important.
 

Data-driven discrimination

One of the women featured in the AIArtists.org project is poet and academic Joy Buolamwini. In her powerful performance piece AI, Ain’t I a Woman?, Buolamwini shows how modern AI systems fail to recognise Black women.

Reciting her poem as images show how facial recognition algorithms have identified Shirley Chisholm, Oprah Winfrey and Serena Williams as men, labelled Michelle Obama’s hair a toupee and, in the case of Google Photos, tagged Black people as gorillas, Buolamwini shines a light on what she describes as the “coded gaze.”

Buolamwini uses art, the poetry of code, and research to explore the social implications of AI, and is a leading activist for what she calls “algorithmic justice.” She founded the Algorithmic Justice League, which collects people’s experiences of bias in AI, audits software and creates more inclusive data sets.

“Who codes matters,” according to Buolamwini, particularly “creating full spectrum teams with diverse individuals who can check each other’s blind spots”.

Buolamwini emphasises that it’s also important “how we code,” factoring in fairness, and “why we code,” making sure greater equality and social change are a priority and not an afterthought. She says she wants a “world where technology works for all of us, not just some of us”.

Caroline Sinders at a workshop in Berlin in 2019 | © Z2X Festival

Why feminist data matters

Algorithms are only as good as the data they work with, of course. However, collecting diverse data is often challenging and extremely time-consuming.

Frustrated by the many documented cases of data-driven bias against women, machine learning design researcher and artist Caroline Sinders began a social justice art project in 2017 to build a feminist data set.

Sinders’ approach is to dissect and interrogate every step of the AI process, from data collection, data labelling and model training to selecting an algorithm and designing how the model is placed into a chatbot, all through the lens of intersectional feminism. This form of feminism acknowledges that a person’s different identities and forms of marginalisation overlap and should be viewed together, not as separate issues.
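
To make those steps concrete, here is a minimal, hypothetical Python sketch of the kind of pipeline Sinders interrogates. None of the names or data below come from her project; they simply mark where an audit question can be asked at each stage.

    # Hypothetical sketch: the stages of a machine-learning pipeline,
    # each annotated with the kind of audit question this approach raises.
    def collect_data(sources):
        # Audit question: whose voices are represented in the sources?
        return [text for text in sources if text.strip()]

    def label_data(texts, annotate):
        # Audit question: who wrote the labelling guidelines, and who applies them?
        return [(text, annotate(text)) for text in texts]

    def train_model(labelled):
        # Audit question: what does the model optimise for, and who chose that?
        # A real project would fit a classifier here; this stub just counts labels.
        counts = {}
        for _, label in labelled:
            counts[label] = counts.get(label, 0) + 1
        return counts

    def deploy_chatbot(model):
        # Audit question: who can contest the system's answers once it is deployed?
        return lambda question: f"Trained on label counts: {model}"

    corpus = ["a blog post on intersectional feminism", "a podcast transcript"]
    labelled = label_data(collect_data(corpus), annotate=lambda t: "feminist")
    chatbot = deploy_chatbot(train_model(labelled))
    print(chatbot("What did you learn from?"))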

Sinders gathers feminist data through a communal process and workshops in libraries, conferences or art spaces. Feminist data can be blog posts, podcast transcripts, books or articles that workshop participants identify and discuss.

“Ethical, communal, hackable design and technology is a start towards an equitable future,” she explains. “It allows for community input and for a community to drive or change a decision about a product, its technical capability and its infrastructure.”

Sinders says her workshops show how hard it is to actually find feminist data because “these writings are under-cited, under-published and if you’re using Google to search, it’s also really hard because there’s the biases of the search tools.”
 

Not the Only One

One artist whose work directly challenges the biases of technology as it intersects with race, gender, ageing and our future histories is Stephanie Dinkins.
 
Her trans-media project Not the Only One (N’TOO) is a voice-interactive installation that tells the multigenerational story of a Black American family. Dinkins designed the deep-learning system and trained it on interviews with three generations of women from her own family.
 
By using oral history and machine learning to co-create a living repository of the memories, myths, values and dreams of this specific community, Dinkins makes the AI the fourth generation in her family’s lineage. Visitors can walk up to the installation and interact with the artificial intelligence by asking it questions.

Library of Missing Datasets

Missing data sets can also contribute to machine learning bias because relevant data isn’t being tracked at all.
 
Mimi Onuoha is a Nigerian-American artist whose work Library of Missing Datasets shows how non-recorded information also reflects priorities in society. She says that her artistic practice aims to disrupt and challenge assumptions baked into the technologies shaping our experiences by focusing on patterns of absence. Onuoha believes art can change the AI narrative and shift the way we think about these technologies.

Mimi Onuoha’s “Library of Missing Datasets” highlights how machine learning bias happens | © Mimi Onuoha

“It allows us to imagine different possibilities and different futures for how we can be using these technologies, instead of getting caught up in this trap of constantly responding to the harms that we see,” she explains.
 
For AIArtists.org curator Marnie Benney, one of the benefits of AI, especially for artists, is its ability to reflect back who we are as humans.
 
“AI is helpful in identifying some of the fundamental and systematic problems of inequality that have plagued our species,” she says. The structure of the technology highlights the inherent biases of its creators, biases we all have.
 
She also believes it’s the people creating art about the inequalities they personally face, both offline and online, who will help end biased technologies. Her hope is that their work will ultimately change how artificial intelligence is designed and applied in the future.

