“It’s about human bias, amplified by technology”: In conversation with Shirley Ogolla

AI Oracle art installation © Vincenzo Werner

Are you a junk data collector, a clone educator or a 3D food printer technician? Shirley Ogolla and the artist collective no:topia explore dystopian job prospects and raise fundamental questions around machine bias in their interactive art installation.

Barbara Gruber

When she graduated from high school, Shirley Ogolla wanted to study computer science. She recalls thinking it would be cool to build robots for old people. But when the then 17-year-old looked into possible university courses she was told, "But you’re a girl. You realise you will be the only girl amongst 120 students. You should seriously think about whether you really want to do that.”
That totally put her off, and in hindsight she regrets not pursuing a computer science degree, even though she still ended up working in artificial intelligence. She now does it from a different perspective, as an artist and social science researcher. Last year, she was voted one of the German Informatics Society’s top 10 AI Newcomers of the Year.
Ogolla has always been fascinated by technology’s impact on humans: what it does to society and the deep changes it is bringing to the future of work. While spending a summer at Harvard’s Berkman Klein Center, she dived deep into AI and also founded the art collective no:topia, which explores machine learning and discrimination.

Shirley Ogolla | © Jakob Weber

No:topia’s five artists originally wanted to create art in line with the collective’s name: neither too dystopian, nor too utopian. With AI Oracle they ended up choosing dystopia, because it triggers more reflection and, as Ogolla says, “the time is now, we have to think about the future we want.”

Let’s talk about AI and discrimination

The self-professed internet geek says art has always been her second passion, though it’s not been easy to be both an artist and an academic.
“In Germany I have the feeling you’re either an artist, or a scientist, but both is not really compatible,” she says. “But when I lived in the US, I realised that there are a lot of people who are both: artists and scientists. It's totally compatible and feeds into each other.”
“When I started working as a researcher in Germany, I kept it secret at first, because I thought my scientific credibility might be jeopardised if I’m approaching the topic of AI from a playful angle.”
And yet, as she emphasises in her academic research there’s an urgent need to demystify the field of machine learning, not only for social scientists, legal experts and policy makers, but also for the general public. For Ogolla, art is a great way to break things down, to add more voices and to talk about discrimination.
“I felt that no one outside our academic bubble really understands the links between AI and discrimination. I thought it’s a shame because AI gives us a great chance - I’d even say somewhat of an excuse - to talk about discrimination. Because ultimately, it’s really about human bias, amplified by technology.”
Ogolla says the idea behind her interactive art installation AI Oracle was to make the subject of AI, work and discrimination physically and emotionally tangible.

“You don’t need a high level of education, nor do you have to speak academic English,” she says.

The AI Oracle invites visitors to engage with ethical questions on machine bias | © Vincenzo Werner

Welcome to the future, you will now be scanned

The artwork invites visitors to step into a dark box, where they experience being scanned for the available data of their digital lives: age, gender, sex, educational records, social networking profiles, social class, financial records, health records, intelligence quotient, credit score, location data, dating accounts, and even extended data about family, friends, colleagues and pets.

“It’s an incredible number of data points; many people don’t realise just how extensive it is. And the more extensive the data, the more accurate the forecast,” says Ogolla.

The AI Oracle’s creator enjoys sitting next to her installation, watching the hundreds of visitors entering the dark cube, getting scanned and asking themselves questions.

“A lot of people thought that the data was pulled in via their mobile phones, which is funny because it was just a light installation. It was not actual artificial intelligence, it was just randomised. But people believed it was AI, and that’s extremely dangerous, because it shows the projection of power into technology, and there is still so much that technology can’t do.”

Your future job?

Work is already changing rapidly under the influence of AI, introducing lots of new job descriptions. ‘Data janitor’, a role devoted to cleaning data, is Ogolla’s favourite dystopian future job because, as she says, “everyone talks about fancy jobs like data scientists.”
“We will need robot cleaners and data janitors,” she says. “There will be a lot of new jobs that have to do with maintaining, processing, and feeding data. And the job profiles will change and we have to think about whether we want that or not.”
Ogolla says that by translating her academic research into artistic practice, the installation raises a lot of ethical questions around privacy, transparency and discrimination. However, she is still doubtful whether people realise how much data actually contributes to discrimination.
“People who are not privileged or who are not white men, they know what it’s like to be discriminated against, based on their surname, their skin colour, their gender and their abilities - whether that is at the workplace, at school or even earlier, in kindergarten.”
Collective no:topia includes the following artists: Vincenzo Werner (construction design), Moody Kablawi (interaction programmer), Louis Killisch (visual documentation), Piera Riccio (artist & interaction advisor) and Shirley Ogolla (artist, researcher & producer).

Learn more about Shirley Ogolla's views on the future of creative AI here.
