Social Networks
“We’re Dependent on These Platforms”

Is Elon Musk a curse or a blessing for Twitter now? Tweet by Musk to launch his takeover of the platform. | Photo (detail): © Adobe/Koshiro

Is it possible to control the power of the big social media platforms using democratic processes? Cultural scientist Michael Seemann talks about the shaping of public opinion in the digital space.
 

Mr Seemann, in your book “Die Macht der Plattformen” (i.e. The power of platforms), you describe the influence social networks like Facebook and Twitter have on our society. What is the nature of power in these platforms, and how does that affect us as users?

In short: they are the owners of our connections – our relationships and friendships. This not only puts the platforms in a position of control; we’re also becoming increasingly dependent on them as we use them to maintain more connections. For instance, if I have a friendship that I only keep up on Facebook, then I’m dependent on the platform to continue that friendship.

But it isn’t just our social contacts that are affected. In recent years the platforms have been subject to criticism, in particular because of filter bubbles and echo chambers – in other words for mostly showing us posts that reflect our preferences. What influence do these algorithms have on the shaping of public opinion?

Of course it’s correct to say that the structure of the platforms and the design of their algorithms influence the discourse held there. But that should not lead to the naïve idea that there is such a thing as an ideal discourse, and that you only need to flick some switch or other to end up with it. It takes a lot of trial and error to work out which decision has which outcome. The people who tinker with the platforms are usually surprised themselves at the effects their decisions ultimately have.

“We’re becoming increasingly dependent on the platforms as we use them to maintain more connections.”

Can you give an example?

After Donald Trump was elected US president, for instance, it emerged that Facebook’s algorithm had probably been partly responsible for the result. Interaction with news articles was “rewarded”, which gave fake news an incredible reach. In response, Facebook reprogrammed the algorithm so that instead of external articles, more communities within the platform were shown. The idea was to network users locally and according to their interests. But the development took a completely different direction: QAnon groups and other conspiracy theorists simply set up their own communities and spread their fake news there.

After the Trump election in particular, there were calls for more legal regulation. You appeared before the German Bundestag as an expert on platform regulation. To what extent can the power of these networks actually be regulated?

It depends on what you expect from regulation. If you’re aiming for a specific change, then regulation works very well. For example we can use the General Data Protection Regulation (GDPR) to stipulate how platforms are supposed to handle data. And the major platforms at least generally comply with these specifications.

And where are the limitations of regulation?

Laws like the GDPR don’t make us less dependent on the major platforms per se. In fact the opposite is true: new regulatory barriers are often capital-intensive, and smaller competitors can’t keep up. So regulation also causes a market shakeout. We saw the independent online advertising market practically dry up as a result of the GDPR, while Google’s and Facebook’s market shares rose sharply again. In other words, the regulation has made us even more dependent on these platforms.

That sounds counter-productive at first. So is the intention of governments and state institutions really to limit the spread of fake news – or is it more a case of using the platforms to shape public opinion and practise “message control”, in other words to spread messages of their own?

You can’t generalise there, examples of both probably exist. The Twitter Files demonstrated how many interest groups are trying to influence opinion – using sometimes legitimate and sometimes illegitimate tactics. Above all, this sort of thing should not be allowed to take place covertly. There need to be regulated and transparent processes that are democratically protected and appropriately documented. In this respect the platforms have huge shortcomings.

The Twitter Files are threads published by Elon Musk that are intended to show how Twitter allegedly restricted the reach of unpopular opinions. There was also mention of cooperation with intelligence services. To what extent do the Twitter Files provide insight into what happens behind the scenes at a platform like this?

In my opinion, the Twitter Files are an attempt on Elon Musk’s part to discredit the old Twitter management and frame them as “corrupt, woke, left-wing managers” in order to legitimise his own governance of the company and to please his right-wing friends. If he were concerned about transparency, he would have handed the archive over to a press consortium, as happened with the Panama Papers. Instead he specifically chose right-wing journalists who put their own spin on the narrative. For that reason the Twitter Files should be approached with great caution.

Do you doubt the authenticity of the Twitter Files?

No, their authenticity is widely confirmed: what was published is genuine. But it’s presented only in part and very selectively. For example, there were emails from Joe Biden’s election campaign team in which Twitter had been asked to delete certain tweets. This was treated as a scandal – without mentioning that the tweets in question were spreading private nude photos of Biden’s son.

But there is also the New York Post story, whose authenticity has been proven, about potentially incriminating material on Hunter Biden’s laptop. This article was blocked by Twitter in the middle of the 2020 US election campaign. Does that not amount to influencing the political competition?

“If I’m honest, I can’t see anything worth creating a scandal over.”

Exactly. I just wanted to point out, for context, that certain narratives have been created with the Twitter Files to serve certain interests, and for that reason you have to be incredibly careful with them. The leaked internal communication at Twitter around the Hunter Biden laptop story also illustrates what happens behind the scenes and how much back and forth there is over issues like this. Then, on top of that, along comes the FBI and says it could be a Russian disinformation campaign. So it was a difficult decision made on the basis of uncertain information. If I’m honest, I can’t see anything worth creating a scandal over.

Some media scholars are proposing that social media platforms should check and moderate every post and even every comment before publication – similar to how the letters-to-the-editor page works in a printed newspaper. Would that be a possibility, or would it undermine a fundamental principle of the internet?

I don’t think much of that idea, because it would restrict the diversity and freedom of opinion considerably. Maybe the main platforms would be able to put it into practice, but users wouldn’t accept it and would probably drift off to other channels or onto the dark web. I think that in principle we need an internet that’s free for everyone to use. How exactly moderation should operate is a deeply political question, but that doesn’t mean it should only be decided in the Bundestag. Users of the platforms need to share the responsibility for legitimate decisions. So it’s crucial to establish structures in order to take a political and democratic approach to tackling issues like this.
