After months of hearings on both sides of the Atlantic, arguments, revelations, and an ever-expanding remit, parliament’s Digital, Culture, Media and Sport committee has finally produced an interim report on its inquiry into fake news – an inquiry that grew to take in misinformation, Russian interference, Cambridge Analytica, and more.
Much of the attention has, understandably, fallen on the committee’s findings around Cambridge Analytica and Brexit, but there is plenty else worth looking at – as the report notes, it is other authorities that will ultimately decide what really happened there, and consider potential prosecutions.
Instead, the committee’s report will play a role in shaping how the UK governs the institutions that spread information and misinformation – and its focus falls especially on the social media giants.
And unusually – perhaps even unprecedentedly for a report produced by politicians – when it comes to making recommendations on social media, the report is intelligent, incisive and realistic. Even better, it suggests a solution to one of the oldest and most tedious debates in how we run the internet: platform vs publisher.
The argument centres on how much responsibility social media companies should take for the content that we, as their users, post on them – and at present, the rules give them carve-outs that most news outlets would kill for.
The “publisher” side of the debate is the easiest to define: it’s any online site that acts as a traditional news publisher would, commissioning articles (usually paid), vetting them pre-publication, and having some form of editorial standards. Publishers are therefore legally liable for every piece they publish: if it’s defamatory, they can expect to be sued. If it’s hate speech, they can expect a prosecution.
The “platform” definition grew out of the earlier days of the internet, when poorly defined law started to create concerns for internet service providers that they could be held liable for the websites they delivered: if people connected to the internet through AOL or CompuServe, and visited sites that were defamatory or broke copyright law, would it really be fair to hold AOL or CompuServe liable for content they had no role in producing?
Sensibly, most people agreed they shouldn’t – but, over time, variations of these protections were extended to sites like bulletin boards (online message boards that served as precursors to modern social networks), on the reasoning that they were essentially providing a service for others to communicate, and couldn’t be expected to pre-vet all content. They retain some liability for what’s on their services, but far less than publishers.
These rules, called “platform” rules – for the idea that such sites provide a platform for others’ speech, rather than publishing their own – govern modern social networks, despite not really fitting them. Both Facebook and Twitter curate what we see, choosing how prominent different pieces of content are and enforcing their own rules: Facebook, for example, has decided that while Holocaust denial is fine, hell shall freeze over before a woman’s nipple is visible on its site.
While it’s overly generous to treat social media companies as platforms – even if that’s convenient for them – that doesn’t mean it’s reasonable to treat them as publishers. If nothing else, this would destroy most major social networks as we know them: no one has the capacity to pre-vet as much content as is posted on major networks each day.
Similarly, given the scale that we have allowed social networks to reach, anything too draconian in restricting what is posted would start to pose a serious risk to the freedom of expression of millions.
How, then, have the DCMS committee resolved this dilemma? They have settled on an answer that should have been obvious long ago, yet still hasn’t been acted upon: social networks are neither platform nor publisher, but something else entirely, somewhere between the two.
The consequence of that, they say, is that social networks shouldn’t be made liable for everything on their sites in the way that publishers are, but the rules (and laws) governing them can and should be tightened up – and the platform rules would remain for other sites genuinely operating as platforms.
This will inevitably attract pushback from the social media giants, but it has the advantage of being a simple, sensible starting point – and a framework that reflects an obvious truth: social networks aren’t platforms, and at times they have even begun to quietly admit as much.
There will be many steps between this point and actually making social networks more accountable for the problems they have fuelled – and some delicate judgments to ensure any new rules have minimal (ideally no) impact on free expression. But with due attention and follow-up, the DCMS report gives us a good place to start – which, in the current state of politics, makes a refreshing change.