
How will the Republic of Facebook tackle its Donald Trump problem?

In the absence of lawmakers who are able to control it, Facebook’s solution is to imitate them, building its own system of government from scratch.

By Martha Gill

Facebook’s decision is looming: will it ban Donald Trump indefinitely? Twitter has already banned him permanently, while YouTube has said it will lift its suspension only when the “risk of violence has decreased”.

Facebook’s Oversight Board will make the call: a “supreme court” made up of 20 members – academics, think tankers and a Nobel Peace Prize laureate. The board was set up in May 2020 to sift through appeals from users whose content has been removed, and is meant to work as an “independent” policymaking body. Its ruling on the former president matters: it will shape the site’s future moderation decisions, and will likely put pressure on other social networks too.

It’s a question of free speech, and at present it is framed like this: is Facebook a public square, which would make kicking people out a serious infringement of freedom of expression? Or is it more like a newspaper – obliged not to serve its users incendiary language and lies?

The trouble is, of course, that both ideas are wrong; Facebook is neither. It’s a company uncontrolled by governments that has the unaccountability of a public square and the editorial power of a newspaper. If it decides one day to ban Trump and the next to make Trump-themed content the first thing everyone sees when they open up the site, it can. It may not be good PR, so it probably wouldn’t. But it could. 

Internet giants are so large that it’s easy to get confused about what they actually are. They have made themselves indispensable, so we tend to assume they have a responsibility towards us in return. But they are not, say, countries, whose lawmakers are accountable to their citizens. Nor are they even public services, committed to acting in the best interests of their users. They are accountable only to their shareholders.


That’s not to say it’s a bad thing when the tech giants take big decisions seriously. If large companies want to consider whether their choices are benefiting the public, that is all to the good. But the problem is that when we frame the Trump decision as the big free speech issue, it distracts us from an even bigger one: that, increasingly, free speech itself depends on the whims of those who own digital platforms.


Social media companies now censor and regulate more speech than any government. Many millions of times a day, these firms determine how content is ranked and sorted, and who sees what, when. They choose which accounts to suspend or block, and which content to remove, fact-check or post a warning about. They govern largely in secret: despite endless pages of terms and conditions, many of the rules that apply in practice are not publicly available to those subject to them.

In a way, it barely matters what decision Facebook comes to on Trump. If it wanted, it could change its mind, or make contradictory decisions in future. Other internet giants could follow its lead, or choose not to. There are no guarantees.

Even Facebook has noted the problem. “Every day, Facebook makes decisions about whether content is harmful,” its VP of global affairs, a certain Nick Clegg, wrote in a statement on Trump and the Oversight Board. “It would be better if these decisions were made according to frameworks agreed by democratically accountable lawmakers.”

It’s interesting that, in the absence of lawmakers who can control it, Facebook’s solution is to imitate them, building its own laws and system of government from scratch. It was perhaps inevitable that, in managing their users, social media companies would imitate the journey of every civilisation on Earth – starting with pure tribalism and ending at something that looks closer to a nation state. (Facebook even has plans to launch its own cryptocurrency. There are reports this week that it could be trialled later this year.)

But of course Facebook’s lawmakers will never really be accountable to the site’s users – that would mean giving us the power to vote for them. Even its Oversight Board, which is supposed to force it into binding rules on policy, has serious weaknesses. It can only review and overturn content moderation decisions; it can’t tamper with the algorithms that produced them. It has time to look at just a few cases a year. Members make decisions according to guidelines Facebook hands them. And then there’s the fact that Facebook handpicked those members and pays their salaries via a trust.

If there is a solution, it will involve remembering that Facebook is not a publisher, or a public service, or indeed a democratic nation, no matter how closely it may imitate these things. It’s a company that cannot be trusted to regulate itself. And it’s just too big. 

