The problem at the heart of the Brussels disinformation playbook

The European Union’s revamped strategy to combat disinformation has one flaw: it relies on social media giants policing themselves.

In a voluntary regime announced on Thursday, Brussels unveiled its long-awaited code of conduct on disinformation, a series of dos and don’ts for how Facebook, TikTok and YouTube should fight online falsehoods before they can spread quickly to the rest of the world.

Yet at its core is a grand compromise, one that could prove fatal to the 27-country bloc’s ongoing efforts to counter Russian propaganda, stop online trolls from making money through online advertising, and curb the spread of COVID-19 disinformation, which continues unabated.

The new code still relies on companies, and companies alone, deciding whether to comply with a series of new requirements. These include giving outside researchers greater access to data, providing detailed country-by-country analysis of how disinformation circulates, and cutting the purse strings of fake-news peddlers who sell such material online to make a quick buck.

This voluntary – or, in European Commission parlance, co-regulatory – approach stems from the difficulty of policing a stream of digital fakes which, while nasty, brutal and cruel, often don’t break any of the bloc’s existing rules.

This is the conundrum at the heart of the EU’s disinformation strategy: pass binding rules on online fakes and risk platforms removing too much content, or rely on a voluntary mechanism meant to push social media giants to tackle the problem on their own, which could leave harmful content up because platforms decide it isn’t harmful enough to be removed.

Commission Vice President for Values and Transparency Věra Jourová | François Walschaerts/AFP via Getty Images

Faced with these options, Brussels has opted for a non-binding playbook that relies on the goodwill of tech companies – and the potential reputational damage if they fail to act – to avoid being accused of cracking down too hard on people’s legitimate right to freedom of speech. Nobody in Brussels wants to be seen as creating a Ministry of Truth.

“We didn’t want to set in stone and put stricter rules against disinformation in the digital services law because it could easily get onto the slippery slope that leads to some kind of censorship,” Věra Jourová, the Commission’s vice president for values and transparency, told reporters, referring to the bloc’s new online content rules.

EU officials know there is a trade-off.

Under Thursday’s rules, which will take effect in early 2023, the organizations that have signed up to the code – social media companies, advertisers, fact-checkers and civil society groups – will meet regularly to make sure everyone is playing by the rules. The Commission will oversee that body, although there are few, if any, penalties beyond public naming and shaming if a company decides to withdraw or fails to be fully transparent.

Brussels has also tried to rope in the biggest companies, such as Alphabet and Meta, the parent companies of YouTube and Facebook, respectively, by tying the voluntary code to the bloc’s separate overhaul of its online content rules, the Digital Services Act. Those proposals, which will take effect by 2024, mainly focus on policing illegal speech such as online child sexual exploitation. But they also include a number of far-reaching measures, such as mandatory risk assessments and external audits, aimed at shedding light on the inner workings of social media platforms.

And here’s the hook. Participation in the voluntary disinformation code can count toward companies’ risk assessment obligations, providing a carrot to encourage voluntary compliance alongside the stick of potentially hefty fines – up to 6 percent of a company’s global revenue – built into the legally binding content rules. If you want to stay on the right side of the new online content rules, the theory goes, then sign up to the code and prove you’re a good corporate citizen.

Yet the fundamental friction remains. All the companies that have signed up to the playbook say they are eager to participate and welcome the new rules, which arguably go further than any other jurisdiction’s in trying to tackle the complexity of the disinformation problem. But, ultimately, the code is voluntary and relies on everyone who signed up actually abiding by the rules.

“Having a code is just the beginning, but implementation and enforcement are key,” said Carlos Hernández-Echevarría, head of public policy at the Spanish fact-checking organization Maldita, which has signed up to the new rules. “I need to see whether these commitments translate into real and meaningful actions by the platforms, and that obviously remains to be seen.”
