Frankly … posthuman
How robots manipulate us
Have you read the terms and conditions? No? In our “Frankly … posthuman” column, Liwen Qin reveals how our behaviour adapts to that of a machine.
By Liwen Qin
I was alone in an elevator of the JW Marriott Hotel in Shenzhen, one of the Silicon Valleys of China. The door opened, and in rolled a white vertical cylinder that resembled an R2-D2 from Star Wars without arms or legs. In a child-like voice, it confidently announced: “Hello, I am going to the 38th floor.” I stood there amazed, filled with joy and curiosity, wanting to follow it and befriend it, perhaps even own it. And I was fully aware that I had just been manipulated.
The development of Artificial Intelligence has been exaggerated. In an article listing the “seven deadly sins” of past AI predictions, Rodney Brooks, former director of the Computer Science and Artificial Intelligence Laboratory at MIT, explained that there is still an immense gap between reality and common beliefs about the state of AI.
Without the chance to say “no”
Deep Blue may have beaten the world chess champion, and AlphaGo the world’s best Go players, but machine learning is still very brittle and far less versatile than the sponge-like learning of the human brain. The real reason to worry today is not AI taking over the world, but humans adapting to the lower intelligence of the machines, some as cute as that little R2-D2.
The first adaptation is a pattern of behaviour that resembles the machines themselves. We click “accept” on electronic contracts we never read, or “like” videos online, feeding the machine what it needs to learn our preferences, whether for cuteness or for jokes. In many cases, we are trained to be AI’s data-feeding machines without the chance to say “no”.
Sentiments of a slave-master relationship
The second adaptation is a sense of entitlement that affects relations between humans. When humans develop a sense of ownership towards another somewhat intelligent creature and abuse that power, for example by yelling at Alexa, it starts to affect their attitude towards each other, because it revives the behaviours and sentiments of a slave-master relationship.
The third adaptation is that spending too much time with subservient AI could weaken our ability to build satisfying relationships with other humans. Today, over half a million Japanese men are dating a digital character called Rinko. Such “digisexuals” choose an all-agreeable, fully controllable relationship with AI over complicated human relationships, which also involve tension.
Sure, we train AI too. But not everyone gets to decide how AI is trained, only those who define what the AI should “hoax” us into doing: companies and governments are leading this game.
With a moral, social and historical lens
So far, daily digital surveillance and manipulation are much less powerful in Germany than in some other parts of the world, thanks to society’s strong resistance to invasions of citizens’ data privacy. Ironically, part of the reason is also Germany’s relatively poor IT infrastructure compared with countries like the US and China. But this is not going to last very long.
I don’t think we have lost the fight against the machines, provided that ordinary users become more aware of their interactions with AI, and that AI designers bear in mind that applying a primitive intelligence to humans should always be scrutinized through a moral, social and historical lens. After all, how we perceive ourselves and what we expect from ourselves will define how AI develops in the future.
My name is Liwen Qin. I work with IT companies from time to time and will share my newest discoveries with you here.
On an alternating basis each week, our “Frankly …” column series is written by Liwen Qin, Maximilian Buddenbohm, Dominic Otiang’a and Gerasimos Bekas. In “Frankly … posthuman”, Liwen Qin takes a look at technical advances and how they affect our lives and our society: in the car, in the office, and at the supermarket checkout.