The Project

How is bias coded into our technology? How can we infiltrate the opaque mechanisms upon which technological systems are built and counter the perpetuation of bias? The Open Call, the festival Revisions. Decoding Technological Bias, and the film series Filmic Explorations are part of the project IMAGE + BIAS, which critically engages with cultural realities that are increasingly determined by imperceptible technologies.


Articles


Online film series on Goethe on Demand

Image & Bias Filmic Explorations © Goethe-Institut San Francisco

Online Film Series | Thursday, November 18 to Sunday, November 28, 2021
IMAGE + BIAS. Filmic Explorations

Technology is never neutral but a reflection of the biases in our society. How is bias coded into our technology? The film series IMAGE + BIAS. Filmic Explorations brings together feature films, documentaries, and short films that address such questions from different perspectives and with varying emphases.


The Festival – Video recordings

The festival Revisions. Decoding Technological Bias took place from June 10 through 18, 2021, and brought together a network of luminaries to share new perspectives and articulate visions advocating for justice and reclaiming power.

Image+Bias: Mushon Zer-Aviv © Goethe-Institut

Panel Discussion
Encountering Technological Bias

The panel discussion brings together experts from different fields to identify various forms of embedded bias within our networked image cultures.

Image+Bias: Jillian York © Goethe-Institut

Against Technosolutionism!
Why We Can't Regulate Our Way Out Of This Mess

The same radical technologies that helped give rise to the social and political movements of 2010–12 later enabled a rise in disinformation, propaganda, and the promotion of other harms.

Image+Bias: Maureen Webb © Goethe-Institut

Coding Democracy
How Hackers Are Disrupting Power, Surveillance, and Authoritarianism

Hackers have a bad reputation as shady deployers of bots and destroyers of infrastructure. Maureen Webb would like to offer another view.

Image+Bias: Kalindi Vora © Goethe-Institut

Surrogate Futures
Technology, Race, and the Human

In this talk, Kalindi Vora and Neda Atanasoski consider how the surrogate effect of technology within technoliberalism, as they describe it in their book Surrogate Humanity: Race, Robots, and the Politics of Technological Futures (2019), bears on recent discussions of technological bias.

Image+Bias: Jonathan Beller © Goethe-Institut

Economic Media
For the Decolonization of Money

This talk shows how monetary media, by inscribing code on bodies, is a force of ongoing colonization under racial capitalism.

Image+Bias: Jer Thorp © Goethe-Institut

Living in Data
A Citizen’s Guide to a Better Information Future

To live in data in the twenty-first century is to be incessantly extracted from, classified and categorized, statisti-fied, sold, and surveilled.

Image+Bias: Ryan Milner © Goethe-Institut

You Are Here
A Field Guide for Navigating Polarized Speech, Conspiracy Theories, and Our Polluted Media Landscape

Our media environment is in crisis. Whitney Phillips and Ryan Milner offer strategies for navigating increasingly treacherous information flows.


The Open Call

From March 1 through May 23, 2021, we invited artists, designers, and the general public to submit creative representations on the subject of bias and technology’s growing ability to alter people’s visual perception of reality.

An international jury of artists, curators, and researchers had the difficult task of selecting the ten best works* from over 150 submissions.

Read more about the project and the open call
* These AR art pieces contain sensitive or violent content which some people may find offensive or disturbing.


Finalists

Golden fantasy animal on black background © Caitlin Foley, Misha Rabinovich

Ecology of Worries

Artists: Caitlin Foley and Misha Rabinovich

“Ecology of Worries” asks whether we should teach a machine to worry for us. It draws on an archive of real recorded worries we’ve been collecting from people since 2016.

The words White and BIPOC © Arman Paxad

White Shade

Artist: Arman Paxad

The term BIPOC has found a safe haven in the media and in the minds of people who consider themselves concerned about the long-standing presence of racial and social injustice in the Western world.

An undefined face in black and grey © Adam Szklenar

Filtered Views

Artist: Adam Szklenar aka “Skwodam”

The piece pairs AI-generated pictures of non-existent people with social media targeting keywords from real people. The AI sometimes goes wrong, making smaller or bigger mistakes both in generating images and in targeting your interests. AR with original music.


Presentation of the finalists

Image+Bias: Presentation and Talk © Goethe-Institut

Presentation and talk
Visualizing Bias with Augmented Reality

On July 2, the Goethe-Institut, Gray Area, Fotomuseum Winterthur, and Artivive hosted a presentation, panel discussion, and virtual gallery launch with the finalists of the global open call Visualizing Bias with Augmented Reality.


Selected Projects

  • Abstract graphic in black, white, grey, and green with the lettering Decoding Babel and photos of well-known personalities © Craig Tilley
    Decoding Babel by Craig Tilley: Because technologies evolve at extremely high speed, we as a society risk falling behind these developments and thereby helping to create a future far removed from what we envision.
    Details Decoding Babel
  • Portrait of a man with an internet search bar over his eyes © Aurora Micale, Chiara Palmucci
    Incel by Aurora Micale and Chiara Palmucci: Algorithms are able to classify every person through a more or less accurate analysis of their web searches.
    Details Incel
  • Collage of nine different facial expressions of a woman © Avital Meshi
    Deconstructing Whiteness by Avital Meshi: “Deconstructing Whiteness” documents an interactive AI performance. The piece examines the visibility of race in general, and of “whiteness” in particular, through the lens of AI technology.
    Details Deconstructing Whiteness
  • Collage: A dragon holding a coronavirus in its front paws stands on a globe covered in tanks. © Alireza Vaziri Rahimi
    The Covid-19 Infodemic by Alireza Vaziri Rahimi: In this project, I want to re-examine concepts such as propaganda, misinformation, and social media in relation to design and our contemporary digital culture.
    Details The Covid-19 Infodemic
  • Collage of various cartoon characters © Ilaria Trapani, Marco Manco
    8 WAVE by Ilaria Trapani and Marco Manco: Censorship algorithms cannot understand what is offensive and what is not. The share of circulated memes that are hateful is alarmingly high.
    Details 8 WAVE
  • Edited detail from Alexandre Cabanel’s “Fallen Angel” © Nirav Ben
    Gradation Descent by Nirav Ben: This work is a short audiovisual video presenting a conceptual imitation, or visual reinterpretation, of machine learning and image processing algorithms acting on Alexandre Cabanel’s painting “Fallen Angel”.
    Details Gradation Descent
  • Cropped close-up of a man’s face on a black background with the lettering Bias © Daniele Silvestri
    Fronzoni by Daniele Silvestri: In the process of algorithmic profiling, we are surrounded by data points that seek to determine our personalities with the highest possible certainty in order to anticipate our behavior.
    Details Fronzoni


Partners
