Freepost: Filter bubbles and the fake news age

Graphic: Bernd Struckmeyer

Once upon a time there were hopes that the Internet would democratize social discourse – but today, whenever the subject turns to how digitization influences politics, the talk is mainly of fake news and filter bubbles. What can journalists do to regain the trust that has been lost? And what can ordinary people do to re-engage in discussion with one another? Over the next few weeks, this will be debated here by the journalists Robert Misik from Austria and Antony Loewenstein from Australia. Their digital correspondence is postage-free – and open to all, so join in the discussion and give your opinion! Disagree! Ask questions! You can take part using the comments field on this page, or on Twitter using the hashtag #freepost. Geraldine de Bastion, who is chairing the debate, will feed your comments into the exchange.

Geraldine de Bastion (photo: Roger von Heereman / Konnektiv): 4 December 2009 marked a paradigm shift on the Internet, for it was on this day that Google began creating personal profiles for every user and filtering search results individually. Internet activist Eli Pariser described this as the start of an “era of personalization”, coining the term “filter bubble” in his book The Filter Bubble: What the Internet Is Hiding from You.

This growing individualization is evident not only in the personalized advertising we are shown, but also whenever we use supposedly neutral tools such as search engines to navigate what has become our primary information medium – tools we have to use because the Internet would otherwise be simply impenetrable.

“Customized services” are omnipresent. Rather than an encyclopaedia of world events, the Internet is more reminiscent of a special-interest paper. Even in our social media profiles, which should really be connecting rather than isolating us, we find ourselves faced with a kind of “one-way mirror”, as Eli Pariser describes it in his book. By watching what we click, algorithms learn more and more about us, and we become increasingly entangled in our own personal bias online: when surfing the web, users see only content that matches their profile, their worldview and their convictions.

Some critics of this theory claim that the filter bubble is not a purely digital phenomenon, but intrinsic to all of us from the start. We view the world through our own particular glasses, surround ourselves with like-minded people and read only things that confirm our own opinions.

So how do you perceive your filter bubble, online and offline? And do filter bubbles in fact exist at all? 
 
Robert Misik (photo: Helena Wimmer): Of course filter bubbles exist. That is not something that requires any discussion – the question is rather one of interpretation: do the filter bubbles of digital communication enclose and confine us more than would otherwise be the case? Put that way, the situation is already more complicated. Modern societies are composed of a large number of subgroups that differ from one another in their ways of life, political persuasions, personal styles and so on. We have inner-city dwellers, working-class urban districts, the middle classes in the suburbs, the super-rich in their favoured areas, big cities, small towns, villages … The people who live in these various sub-communities also have little contact with those in other sub-communities in real life – and when they do have contact, it tends to remain superficial.

Digital communication, be it in social networks, forums or other online media, reinforces this logic on the one hand while breaking with it on the other. It reinforces it in the sense that, once we fit into the patchwork of a community with a particular set of opinions, we find ourselves inundated with ever more messages that confirm that community’s prevailing views. This entrenches our opinions and gives us tunnel vision. Yet that is of course only one side of the truth. On social media and in forums we can see the opinions of others every day – we are confronted with attitudes that we might otherwise not even notice. That is something that is often overlooked when we talk about filter bubbles.
 

Antony Loewenstein (photo: Reuben Brand): A key deficiency of modern society is a lack of empathy for the underprivileged, a disease caused by experiencing our daily lives in a bubble. Too often, what we read – and don’t read – online and what we hear and experience in our real lives reduce our ability to relate to others who look or sound different from us. It’s tempting to hate refugees coming from the Middle East or Africa if you feel economic and racial insecurity and are told by your trusted newspaper, TV host or friend that you should fear the “other” because they’re worsening your personal situation. Resisting this impulse requires widening what you consume and consider on a daily basis. This tendency existed before the rise of the internet and social media, but it’s now easier than ever to find your own tribe online.
 
I’ve experienced this in my own work. When I visit Gaza as a journalist and tell people that I don’t feel threatened as a Jew by locals or the Islamist government, the instant reaction is often suspicion because the media has fed a line for decades that Palestinians are inherently violent and Muslims want to kill all Jews. This lie can only be challenged by constantly explaining the truth and showing the fallacy of the position. 
 
The rise of Donald Trump, Brexit and rampant nationalism in Europe, the US and Australia has made me spend even more time reading, listening and reporting on the movements that caused these political earthquakes. Contemptuously dismissing Trump won’t make his supporters disappear. I don’t personally know any Trump or Brexit voters, nor do I associate with white nationalists who loathe Islam, but I’m drawn to exploring why so many people are.