Wael Abbas, an Egyptian human-rights activist and journalist, disappeared from the internet around 2017. After years of meticulously documenting state abuses, a succession of bans from Facebook and Twitter coincided with his arrest by Egyptian state security in 2018 on spurious charges of “involvement in a terrorist group”, “spreading false news” and “misuse of social networks”.
Abbas’s first social media ban came in 2007, when YouTube removed his account after complaints about his graphic footage of police violence (around the same time, one of his videos helped convict two police officers of torture). Pressure from high-profile human rights activists managed to get the account restored. In 2017, a similar campaign proved unsuccessful. Abbas likened the obliteration of ten years of activism and documentation on his Twitter account to “Hitler burning books”.
In Egypt, one of the biggest official newspapers, Al-Ahram, reported positively on Abbas’s ban. “His country’s government—with the cooperation of two of the world’s biggest social media platforms—finally got its way,” writes Jillian C. York in her new book, Silicon Values: The Future of Free Speech Under Surveillance Capitalism.
Silicon Values traces the haphazard and chaotic creation of the speech principles that now govern the platforms billions of us use to communicate every day – demonstrating the patchwork influences of governments, campaigners, and the public on the complex and often arbitrary-seeming content rules that vary wildly by jurisdiction.
The deplatforming of former US president Donald Trump in the dying gasp of his presidency has sparked an intensified conversation around social media content moderation; York says it’s the most attention the topic has attracted since the panic over ISIS content, yet even that debate was more “niche”.
But it’s not the preservation of history or the documentation of war crimes that dominates the discussion anymore. Like most of today’s issues, it’s dissected along partisan lines: the political right is obsessed with the notion that a stealth campaign is underway to rid the internet of conservative ideas, while the liberal left mostly champions ever more stringent speech controls and content removals, with little thought for the potential collateral. Collateral like Abbas.
York’s book drags us back to the early, heady days of social media’s inception. It’s easy to forget, in the current climate, that social media was originally billed as a conduit of revolution, that the ascendant Silicon Valley titans crowed about democratising information access and participatory tools that could bring people together to exchange ideas, form bonds, and topple governments. Skin-crawlingly, Facebook even tried to claim credit for both the Arab Spring uprisings and the Black Lives Matter movement.
But that promise, as York illustrates, melted away almost instantly. The desire to provide citizens with a revolutionary information-sharing tool was rapidly superseded by another: to stay on the right side of governments in order to retain market access.
In the aftermath of Trump’s social media ban, it was common to see left-wing commentators take to Twitter to warn that left-wing ideas would be targeted next. “I think that gets it backwards,” York tells me over the phone. “I think [deplatforming] affected the left long before it affected the right.” Perhaps not in terms of politicians, she says, but marginalised people on the left – human rights activists in authoritarian countries, Palestinians, sex workers – all felt the brunt of restrictive speech policies and the lack of a right to appeal long before Trump did.
“Ultimately, Facebook and its counterparts operate more like churches than courts,” writes York in Silicon Values. “They are subject to influence by states and the wealthy, and all too content with disregarding the needs of their subjects in favour of those with power.”
Despite her position as director for international freedom of expression at the Electronic Frontier Foundation, York is not a free speech maximalist with regard to social media platforms. In her book, she writes that she used to veer closer to that stance, but her view is now tempered by concerns over “freedom of reach” – the ability of hatred and incitement to spread like bushfire across millions of screens in seconds.
But she says the debate over which content constitutes material harm, and how best to address it, desperately requires nuance. Much of the current conversation is “uninformed” or “misinformed”, she says, highlighting as an example the widespread misunderstanding of Section 230, the piece of US legislation that protects social media networks from being sued over content posted on their platforms.
“It’s very clear to me that we have lawmakers who have no idea what the law actually says, and I’ve seen incorrect reporting from both the New York Times and the Wall Street Journal…about what Section 230 actually allows and what it does,” says York. While Big Tech critics are adamant that Section 230 reform will deal a blow to Facebook and its ilk, York says it is more likely to enshrine their market dominance.
“I think the last four years…have created, frankly, a lot of trauma, particularly within marginalised communities and communities on the left, and those communities are kind of grasping for justice in a way that is resulting in a lot of calls for censorship,” she says.
[See also: Laurie Clarke on why Big Tech’s favourite law is running out of time]
She draws a distinction between the “deep work” that characterises an academic approach to content moderation and the “publicity stunt stuff” that is more about loudly calling for more content to be removed. Although she has a lot of respect for its members, York believes the group that calls itself The Real Facebook Oversight Board falls into the latter category, saying it has shown resistance to collaborating with the civil society organisations and academics that have been examining these problems for a long time. (She declined the offer of a position on the board.)
When you’re debating something as delicate and nuanced as the ability of people all over the world to share ideas and express themselves, York says that a balanced, evidence-based approach is vital. “Maybe Trump’s tweets do constitute real harm, but I don’t think that we’re even thinking that way,” she says. “We also have to think about who we’re harming every time we push for any type of censorship.”