Artists force us to confront the rise in citizen surveillance

Valia Fetisov's "User Flow" on display at Chronus Art Center, Shanghai, 2019 | Courtesy: Valia Fetisov

In an age of social distancing and mass protest, it's not only governments that are stepping up their use of technology to monitor citizens. Let's face it, we are all keeping a closer eye on each other.

Christy Lange

As I write this, twin crises are unfolding in the United States: the COVID-19 pandemic and the protests against police brutality. The US government is turning to technological solutions to manage both, in the name of protecting citizens’ health and safety. Law enforcement is using everything from drones to social media listening to monitor protestors. Meanwhile, to stem the spread of the virus, governments worldwide have teamed up with tech companies to create contact tracing apps that track citizens.
 
Around the world, tech-based solutions to crises are being marketed to private and public interests alike. Once-proprietary surveillance tools are becoming popular consumer products, as tech companies seek to exploit our fears. Consequently, the act of citizens surveilling their fellow citizens is becoming just another smartphone-enabled habit. But what happens when we are increasingly watching each other, and willing to be watched? Artists are helping us imagine the consequences of that future.

Would you report a jaywalker?


Before many of these issues were even on our radar, Belgian artist Dries Depoorter was exploring the consequences of citizen surveillance. His installation Jaywalking (2015) allows viewers to watch live footage from traffic cameras set up at crosswalks. If they see a pedestrian crossing against the light, viewers are given the option of reporting the offender by pressing a button that sends a screenshot, anonymously, to a nearby police department. Viewers of Jaywalking are presented with a dilemma that feels almost visceral: do we use our new powers in the name of civic duty? Where is the line between snitching, safety and surveillance? Depoorter's works play on the fact that today we are becoming more and more accustomed to being the ones behind the camera, not just in front of it. He asks us to weigh the consequences and responsibilities of exercising that power.
 
While spying on unsuspecting jaywalkers feels almost illicit, today almost any citizen can easily buy their own surveillance equipment, courtesy of Amazon. The Ring home security system, acquired by Amazon in 2018, allows homeowners to install their own motion-sensing 'video doorbells' in order to remotely monitor their doorstep. While the product is marketed as a convenience, its tagline carries undertones of the potential danger our neighbours might pose: “A lot happens at your front door.” Ring's website shows off actual Ring video footage capturing (alleged) thieves and vandals in the act. And this footage isn't just useful for homeowners: Ring retains the right to access all the data and videos, and the company reportedly also allows law enforcement to obtain footage in order to pursue suspected criminals.

Dries Depoorter's "Jaywalking" installation at Mirage Festival, 2019 | Photo credit: Marion Bornaz

Shadow stalkers


American artist Lynn Hershman Leeson has been pursuing themes of policing and surveillance since her early work in the 1970s. Her most recent installation, Shadow Stalker (2018–21), explores what happens when we rely on technology to monitor and police our society. Upon entering the installation, viewers input their email address and an algorithm creates their ‘digital shadow’, projected on the wall, publicly displaying just how much personal data can easily be mined from that single data point. Another screen shows maps with certain neighbourhoods marked red, signifying their likelihood of being high-crime areas. The maps are made using predictive policing technology – proprietary algorithms that law enforcement uses to predict where crimes might occur. Shadow Stalker gives the average citizen a glimpse of what the tech companies and government agencies behind this otherwise invisible technology can see as it tracks, monitors and polices us.
 
Hershman Leeson has been studying the mechanisms of surveillance since before such high-tech solutions even existed. In her early interactive piece Lorna (1979–84), the viewer is shown video footage of the fictional character Lorna in her apartment. Lorna, the piece informs us, is too afraid to leave home because of the fearful images she sees on TV. Via LaserDisc technology, viewers are given the option to eavesdrop on her conversations and to control certain devices in her apartment, including her TV, mirror, and phone, in a work that seems to presage today’s 'smart' home devices.
Lynn Hershman Leeson's interactive piece "Lorna" from 1979 | © Lynn Hershman Leeson / Hotwire Productions LLC
American artist Lauren Lee McCarthy’s Someone (2019) turns Lorna into reality, albeit a less sinister one. In this work, viewers are given access to laptops that stream live audio and video from inside the apartments of four individuals. These four volunteers have allowed McCarthy to replace their virtual home assistants with microphones and cameras that can be accessed by gallery-goers they can talk to but cannot see. When the volunteers need something, they simply call for 'Someone', and the viewer can interact with them or offer 'help' by controlling their lights, music and other home devices.
 
The ostensible premise of McCarthy's piece is for the viewer to become ‘a human version of Amazon Alexa’ – taking up the omnipotent position of seeing and hearing through the lens of a smart home security device. But watching and waiting for volunteers to request help from ‘Someone’ quickly becomes an excuse to surveil these strangers in their own homes, creating the same uncanny sensation of voyeurism as Depoorter's Jaywalking. Someone is also a reflection of how willingly we invite this kind of surveillance into our homes when we feel it is being performed by a neutral, disembodied machine. The question is, who truly holds the power in this dynamic between watcher and watched, between machine and human?

Networked neighbourhood watch

 
When it comes to the normalisation of passively – or actively – watching our fellow citizens, there is perhaps no better modern example than the Nextdoor app. In the months since the pandemic began, Nextdoor has experienced a boom in popularity in the US. While the app is marketed as a community-building tool "to greet newcomers, exchange recommendations, and read the latest local news," it is also widely used to 'snitch on' neighbours and has led to racial profiling. The misuses of the app have spawned a Twitter account that documents everything from the offensive (people labelling their black neighbours as 'suspicious') to the absurd (a user asking her neighbours not to lock their cars after 8pm because the "beep beep noise" bothers her). Until recently, the app featured a "forward to police" function that allowed users to send their reports of suspicious activity directly to law enforcement. The popularity of Nextdoor is a lesson in how technology can – wittingly or unwittingly – turn vigilance into something more dangerous.
Lauren Lee McCarthy's "Someone" project allows viewers a sneak peek into four private apartments | Photo credit: Stan Narten

In the past, networks and infrastructures of surveillance were largely opaque or hidden – now we are visibly taking part in them. Valia Fetisov’s User Flow attempts to immerse the viewer in an experience that might resemble our future society if we continue to adopt systems of social control to manage crises. Upon entering the installation, the user is asked to submit first to a physical search, then to a psychological evaluation; finally, they are confronted with a monitor equipped with a 3D camera and a face-alignment neural network that bids them come closer so it can spray liquid into their mouths.
 
User Flow illustrates the way surveillance and control systems that we take for granted – like submitting to an airport security inspection or facial recognition ID – can become more and more oppressive, especially when technology is involved. “The line and the inspection process are visible to others in order to normalise behaviour of being searched, but also to present care and to dramatise the idea of 'being safe',” writes Fetisov in an article about User Flow. “The question is, do we trust systems that want to quantify our relationships with each other and establish a new social order? Or, will the design of these systems, their performativity and their social normalisation become so dominant that they will be able to take over our decisions?”
 
Fetisov plumbs the fundamental question at play today: what are we willing to accept in the name of our own health and safety – especially in a crisis – and what could happen if these new measures stay with us long after the crisis is over?

Learn more about Christy Lange's views on the future of creative AI here.
