
“Couch Lessons”
“What are machines allowed to do independently?”

Every “Couch Lesson” has a different focus; this time, it’s “AI + Peace.” | Illustration: © Marcia Mihotich

The “Couch Lessons” highlight the opportunities and risks of algorithmic decision-making systems. In a new episode, the disarmament expert Angela Kane and the political scientist P. W. Singer discussed the topic of “AI + Peace.”

By Annette Walter

Technological progress has often been driven by military conflicts. For decades, nations around the world have been using Artificial Intelligence (AI) as a means of warfare, for example with armed drones in Afghanistan. But how can AI be used responsibly for military purposes, and how can it help promote peace and counter armed conflicts?

Angela Kane from the Vienna Centre for Disarmament and Non-Proliferation (VCDNP). | Photo: © Angela Kane

These are questions that the scholar Angela Kane from the Vienna Centre for Disarmament and Non-Proliferation (VCDNP) has been dealing with intensively for a long time. She emphasised how important it is in this context to discuss ethical issues, noting, “We need more regulation of the use of armed drones and other military AI technologies.” She cited the United States as a positive example, where concrete political measures for dealing with drones in the military were initiated under the presidency of Barack Obama. “That was a start, I found that positive.” However, this project no longer exists under his successor Donald Trump.

Most important change in human history

P. W. Singer, an American political scientist and military analyst, examined the use of drones and military robots in his 2009 book “Wired for War: The Robotics Revolution and Conflict in the 21st Century”. For him, AI is the most important change in human history, a technology that is causing disruptions in many areas. And it is not just American defence strategy that involves AI: IT giants such as Google and Facebook are investing billions in this field, as are corporations one wouldn’t expect to at first glance, such as the fast-food chain McDonald’s and the agricultural machinery manufacturer John Deere.
 
For Singer, it’s crucial that “every new technology poses new questions about what is right and what is wrong. This raises unknown legal and ethical questions that we have never had to answer: What are machines allowed to do independently, who should own them and who should reap the fruits of their labour?”

Privacy issues need to be renegotiated

P. W. Singer, American political scientist and military analyst. | Photo (detail): © P. W. Singer

In this context, he also mentioned automatic facial recognition as a common method with which the US military can identify targets in the dark. But police forces in cities like Moscow, Beijing and New York also use such programs, for example during street protests. This method may offer citizens more security, but its use raises complex questions about privacy.
 
Singer believes that the coronavirus pandemic will accelerate the development of AI, for example in the field of telemedicine. Examples include robot-assisted Covid-19 tests, contact tracing, which is already being practised, and the automated temperature screening of people in public spaces to check for possible infections.

Regulations for AI-capable weapons are imperative

One of the listeners wanted to know whether war needs to be redefined due to the use of AI technology. In Kane’s view, AI has already given warfare a new definition. People used to fight against people; today, a person can control a drone 3,000 kilometres away from its area of operation and thus injure or even kill countless civilians. Therefore, the most important thing for her, as she emphasised several times in the “Couch Lesson”, is this: “We need regulations for AI-capable weapons, for example when it comes to their use by peacekeeping forces. We don’t yet have such regulations.”
 
