
Q&A: Content moderation under Europe's Digital Services Act

A spin-off from Meta's quasi-Supreme Court is delving into the complex world of deciding which posts should be allowed on Facebook, YouTube and TikTok

WELCOME TO A BONUS DIGITAL POLITICS. I'm Mark Scott, and as I experiment with this newsletter, I'm adding interviews with people I think you should know about.

This first installment (hopefully of many; if you want to chat, email me at digitalpolitics@protonmail.com) is with Thomas Hughes, who was, until this week, director of Meta's Oversight Board, the independent group with quasi-judicial powers over what content is allowed on Facebook and Instagram.

The British human rights expert just left to establish the Appeals Centre Europe. It's a group, funded initially via the Oversight Board's Trust, that will resolve disputes for people within the European Union who believe platforms have incorrectly acted against their content under the bloc's Digital Services Act.

Ahead of the Appeals Centre's creation this week, I talked to Hughes about what his new Ireland-based entity expects to achieve; its ties to the now-separate Oversight Board; and why he thinks Europe's new social media rules can empower people's rights online.

Let's get started:


EVER SINCE META'S OVERSIGHT BOARD was announced more than four years ago, the body of legal and human rights experts has been a lightning rod. Some view it as a necessary step toward better content moderation decisions on Facebook and Instagram. Others criticize it as a mere public relations exercise for Meta at a time when the company's role in deciding which posts stay up remains opaque and, often, wrong. At the center of those decisions, Hughes — the former head of Article 19, a human rights organization — ran the Oversight Board's day-to-day operations.

Now, after five years in that job (he started before the Oversight Board was officially announced), Hughes is attempting to build on that somewhat complicated legacy with his next project. "What the project is really focused on is trying to make constructive use of the legislation that's coming into force under the Digital Services Act," he told me via Zoom.

OK, before we get into whether this is a good idea, let's lay out the basics. Under Article 21 of the DSA, the rules provide for so-called out-of-court dispute settlements: non-binding arbitration between platforms and users over whether content moderation decisions were correct. These are separate from direct appeals to platforms to reinstate posts, and from legal efforts, via the courts, to have such material reinstated. It's the type of uber-complicated, quasi-regulatory structure that Europe excels (fails?) at.

The Appeals Centre is one of the few groups taking on this role. At its outset, Hughes and his 20 Dublin-based colleagues will take content moderation requests linked to Facebook, TikTok and YouTube. They'll focus on challenges in English, German, French, Italian and Dutch, though anyone across the 27-country bloc will be able to submit appeals. Over time, they want to add more platforms and languages.

"We think we will already be able to cover Portuguese and Polish, as well as Arabic and Russian," Hughes added. His team is trialing AI-powered translation tools to expand its offering, though that is a work in progress. "It is a resource question," he admitted. "It's also a question of testing (the translation tools) in a live environment."


If you're looking to make a request — and this only applies to people within the EU — you'll have to pony up 5 euros to cover some of the costs. The three platforms that have signed up with the Appeals Centre will each be charged 95 euros per complaint. That's how Hughes' organization expects to cover its costs. The Oversight Board Trust provided $15 million to fund the group's initial work. FWIW, the Oversight Board has received at least $150 million from Meta, leading to (somewhat overblown) questions about that group's independence.

"That has been part of the discussion that we've had with the Commission," Hughes said when I asked him about the links to Meta's cash, in reference to Ireland's Media Commission. That's the local regulator which had to approve the Appeals Centre's content moderation mandate. "The reality of independence is very strongly embedded in the model that we've adopted," he added. "The perception (of reliance on Meta's cash), I hope, is transitory because we will be working across platforms."

Now, I can already hear the 'come on, now!' from many of you reading this. If the Appeals Centre's upfront costs are covered by Meta — even via a blind trust over which the tech giant has no control — then how can it be viewed as truly independent? In truth, it's a question the Oversight Board has grappled with since its inception, too.

But here's the thing. Starting a pan-European content moderation operation, in multiple languages and with quasi-judicial powers, is not something you can do for free. Someone has to front those costs. And in a world where few, if any, European funders exist for this type of work, Hughes legitimately had limited options to turn to for his initial investment.

For me, the Oversight Board has shown that its funding ties to Meta have not stopped it from taking whacks at the platform. The question for that organization has never been about its finances. Instead, its limitation lies in the Oversight Board's inability to make mass content moderation decisions across Meta's platforms.

Its binding decisions (and, by now, there are many of them) relate to individual pieces of content. Mere drops in a digital ocean, if you will. The group's wide-ranging recommendations on how Meta should change its content moderation practices remain non-binding, even though the company has, at times, implemented them.

A similar problem awaits Hughes and the Appeals Centre. In practice, providing an impartial "judge" who can decide whether Facebook, YouTube and TikTok should have removed someone's post or video makes sense. "In the digital age, individual empowerment over speech has really been lacking," he told me. "This is a sea change moment. It will adjust the balance of control between the individual, companies and governments. It will allow people to define both what they post, but also what they see online."

I'm not so sure. What the British human rights expert underestimates is people's apathy toward much of this. Yes, the Oversight Board has received, globally, tens of thousands of (free) submissions from people who believe Meta made wrong content moderation decisions. But I'm not convinced enough Europeans will be willing to cough up 5 euros per complaint and then jump through the hoops required to challenge TikTok, YouTube and Facebook via the Appeals Centre.

Call me cynical, but I would venture that most of us — if/when confronted with a message from any of these companies that our posts have been removed — would shrug our shoulders and just give up. A minority may go through the internal procedures already available within YouTube, TikTok and Facebook to get posts reinstated. Few will actively seek out something like the Appeals Centre, especially in languages currently not covered by the group, to take it one step further.

It goes to the wider difficulties Europe is having with its new social media oversight. Much of this work is technocratic, not pragmatic. The Digital Services Act focuses on wonky risk assessments. It mandates lengthy audits to show how platforms are combating online abuse. And it includes complex options, such as out-of-court settlement bodies like the Appeals Centre, that rely on a level of user interest that, so far, has been lacking.

Still, that hasn't stopped Hughes from doubling down on these efforts. "It could be enormously beneficial across all sorts of different segments of society," he told me in reference to parts of Europe's new social media rulebook coming into force. "But, if it's done badly, it could be enormously detrimental," he added. "A failed initial regulatory environment could then lead to something that is very draconian, very censorial, and very controlled by national governments."

This newsletter has been updated to reflect that the Oversight Board Trust provided $15 million in start-up funding for the Appeals Centre Europe.