Algorithms and the Freedom of Choice
The Computerised Self

Computer data | Photo: © Dmitry Nikolaev – Fotolia.com

Computer programs create user profiles and shape the role we play on the Internet as well as our patterns of online behaviour. Are algorithms causing us to lose our freedom of choice? This debate is also being conducted in Germany, where users nevertheless tend above all to see the advantages.

Algorithms, programs based on defined sequences of instructions, use the data trails we leave behind on the Internet to calculate our consumption habits and our communication behaviour. They influence the results of our Internet searches, and on the basis of our address they determine which payment methods an online shop offers us, since the address allows them to infer, with a certain probability, what our financial situation is.
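To make this concrete, here is a minimal sketch in Python of how such an address-based decision could look. The postcodes, risk figures, threshold and function name are entirely invented for illustration; they are not taken from any real shop system.

```python
# Minimal sketch (invented data): how an online shop might restrict payment
# options based on an estimated non-payment risk for the customer's postcode.
# The postcodes, risk figures and threshold are purely illustrative.

RISK_BY_POSTCODE = {
    "10115": 0.04,   # assumed low estimated default rate
    "47051": 0.18,   # assumed high estimated default rate
}

def payment_options(postcode: str) -> list[str]:
    """Return the payment methods offered, based only on the address."""
    risk = RISK_BY_POSTCODE.get(postcode, 0.10)  # fallback for unknown areas
    if risk < 0.10:
        return ["invoice", "direct debit", "credit card", "prepayment"]
    # Customers from "risky" postcodes are only offered safer options.
    return ["prepayment", "credit card"]

print(payment_options("10115"))  # full range of payment methods
print(payment_options("47051"))  # only prepayment and credit card
```

The customer never sees the risk figure itself, only the narrowed range of options, which is precisely why such decisions feel opaque.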

In Germany, too, the effects of algorithms are being widely discussed. Many people wonder whether they are in fact being remotely controlled by computers. The idea of being influenced by something intangible that a layperson cannot understand is a source of unease, sometimes even fear.

Machines involved in the decision-making process

Algorithms are effective: with the help of mathematical and statistical methods, probability forecasts for all kinds of situations can be derived from a user’s previous behaviour. What we see on Facebook is in fact what the algorithm has computed for us, depending on how we have interacted with the platform’s content.
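A small hypothetical example illustrates the principle. The interaction counts, posts and scoring rule below are invented; real platforms weigh far more signals, but the basic logic of ranking content by past behaviour is the same.

```python
# Minimal sketch (hypothetical data): ranking a news feed by how often the
# user has interacted with each source before. The point is only that past
# behaviour determines what is shown first.

past_interactions = {"friend_a": 42, "news_site_b": 3, "friend_c": 0}  # clicks, likes

posts = [
    {"author": "friend_c", "text": "Local election report"},
    {"author": "news_site_b", "text": "Opposing opinion piece"},
    {"author": "friend_a", "text": "Holiday photos"},
]

def score(post: dict) -> float:
    # The more the user engaged with an author in the past,
    # the higher that author's new posts are ranked.
    return past_interactions.get(post["author"], 0)

feed = sorted(posts, key=score, reverse=True)
for post in feed:
    print(post["author"], "->", post["text"])
```

Sources the user has never engaged with sink to the bottom of the feed, which is how a personalised stream can gradually narrow what a person gets to see.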

It would therefore even be theoretically possible to influence the way people form their political opinions. This phenomenon has been discussed for some time under the term coined by the American net activist Eli Pariser: the filter bubble. In his book of the same name, published in 2011, Pariser observed that his personalised news stream on Facebook presented him above all with items that matched his political views. In this way the user becomes isolated in a “bubble” that blocks out information contradicting the way he thinks. The influence of algorithms goes a step further still: today, machine-generated forecasts already decide whether a person is creditworthy or how high his insurance premium should be. His place of residence becomes a risk of non-payment, his state of health a risk of illness.

Do we really have to worry, though, that people have lost control over their own decisions? The calculations produced by algorithms are not reliable forecasts but statements of probability. Nor do algorithms embody the complex system of values that enables us to decide according to the situation we face. They remain at the level of simple actions, such as clicking on a text or buying a product.
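The distinction can be shown with a toy example: a simple scoring model of the kind sketched below, with entirely invented weights and inputs, returns a probability between 0 and 1, a statement about likelihood across many similar cases rather than a verdict on what a particular individual will actually do.

```python
# Minimal sketch (invented weights): a logistic model turns a few attributes
# into a probability of non-payment. The output is a likelihood between 0 and 1,
# not a statement about what this particular person will actually do.

import math

def default_probability(postcode_risk: float, past_late_payments: int) -> float:
    # Hypothetical weights; a real scoring model would be fitted to data.
    z = -2.0 + 3.0 * postcode_risk + 0.8 * past_late_payments
    return 1 / (1 + math.exp(-z))  # logistic function, result strictly between 0 and 1

p = default_probability(postcode_risk=0.18, past_late_payments=1)
print(f"Estimated risk of non-payment: {p:.0%}")  # roughly 34%, an estimate, not a certainty
```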

The demand for transparency

Even the filtered reality of a Facebook stream is not a problem newly created by algorithms. “Basically the filter bubble is just one bubble among many others. Our social environment also shapes the way we perceive things in quite a definite way,” says Ben Wagner, director of the Internet and Human Rights Research Unit at the European University Viadrina in Frankfurt an der Oder. “The main thing, however, is that we are aware of these things.”

Nevertheless, many algorithms, such as those used by Facebook or Google, are not transparent. The people who use these services are not aware that decisions are being made for them “in advance”. “At the moment there is the danger that fringe groups in society are being marginalised, for example, people who cannot take out a medical insurance policy because of the data in their health records,” warns Ben Wagner, a concern shared by many other researchers. This is why the demand for more transparency remains the central issue in any discussion of “algorithm ethics”.

A German debate

In the German public debate on the “algorithmisation” of people, critical voices predominate. One of the best-known belonged to the journalist and co-publisher of the Frankfurter Allgemeine Zeitung, Frank Schirrmacher, who died in the summer of 2014. In his bestsellers Payback (2009) and Ego (2013) he argued that the Internet was a drug and that computers were changing the way we think. For some years now German politicians have also been warning of a “data dictatorship”. In his 2014 book Finger weg von meinen Daten (Get Your Hands Off My Data), the MEP for the German Green Party, Jan Philipp Albrecht, argues that human beings are increasingly being deprived of their right to make decisions and degraded into a mathematically calculable system that optimises itself.

In contrast, there have been attempts to examine this criticism itself critically. How strong is the power of algorithms really? Why do we direct our criticism at supposedly negative developments instead of emphasising the positive aspects of digitalisation, asks the author Kathrin Passig in the Internet culture magazine Berliner Gazette. The science historian Klaus Mainzer, a professor at the Technical University of Munich (TU München), on the other hand, advocates a “de-technologisation” of the discussion. Algorithms, he argues, are neither an invention of Silicon Valley nor “evil” in themselves. As he writes, in essence, in his book Die Berechnung der Welt (The Computation of the World), published in 2014, their roots lie in the attempt to describe phenomena in mathematical terms and in the scientific desire to develop a theory of the world, something people have been working on for millennia.

At the same time, it has also become apparent that, although Germans complain about the lack of transparency at Google and Facebook, they are using these services more and more. A broad awareness of the risks, let alone any kind of political movement, is not currently in sight. Why this is so was the question the Frankfurter Allgemeine Zeitung put to the German law professor Indra Spiecker. Her answer: “The technology is simply too beautiful.” Perhaps there is an even simpler way of putting it: for many people the concrete benefits, maintaining contacts and networking over long distances with so little effort, far outweigh risks that are still quite abstract.
