Discussion Couch Lessons: AI + Bias

Illustration: Marcia Mihotich

Wed, 08.07.2020

6:00 PM - 7:00 PM

Online

AI + Bias

Generation A = Algorithm

Artificial intelligence is not neutral. It’s created by humans, and how it’s built, trained and applied greatly influences the outcomes. How an algorithm interacts with people of different cultures, genders, sexualities, races, etc. depends on the team of AI experts who built the system and the training data they used as inputs. AI systems do not learn bad habits on their own; they pick them up from the people who build them and the data they are trained on. At a time when many companies and governments are looking to deploy AI systems across their operations, being acutely aware of those risks and working to reduce them is an urgent priority. In this lesson we’ll look at discrimination that has already been observed in AI-managed systems and discuss some suggested tactics to combat it in future developments. What does it mean to carefully consider every angle of making, iterating on and designing AI? Every step of this process needs to be thoroughly re-examined through different lenses.

What will you learn

  • What is bias in technology?
  • How can AI systems be developed to serve a diverse set of users?
  • What does it mean to design a data set as a form of protest?

Presenters

Caroline Sinders, machine-learning-design researcher and artist / USA
Lorena Jaume-Palasí, Executive Director, The Ethical Tech Society / Germany
Gunay Kazimzade, Weizenbaum Institute for the Networked Society / Germany

The series is curated and moderated by Martin Thörnkvist (curator and organizer of Deep Cuts and Hours Beirut, Sweden) and Jeannette Neustadt (project manager of Generation A = Algorithm, Goethe-Institut, Germany).
