22 July 2013

10 questions about Cameron’s ‘war on porn’

Including: who decides what counts as ‘pornography’, what happens when people ‘opt-in’, and what about Page 3?

By Paul Bernal

There’s been a bit of a media onslaught from David Cameron about his ‘war on porn’ over the weekend. Some of the messages given out have been very welcome – but some are contradictory and others make very little sense when examined closely. The latest pronouncement, as presented to/by the BBC, says “Online pornography to be blocked automatically, PM announces”.

The overall thrust seems to be that, as Cameron is going to put it in a speech: “Every household in the UK is to have pornography blocked by their internet provider unless they choose to receive it.”

So is this the ‘opt-in to porn’ idea that the government has been pushing for the last couple of years? The BBC page seems to suggest so. It suggests that all new customers to ISPs will have their ‘porn-filters’ turned on by default, so will have to actively choose to turn them off – “and that millions of existing computer users will be contacted by their internet providers and told they must decide whether to activate filters”.

Some of this is welcome – the statement about making it a criminal offence to possess images depicting rape, for example, sounds like a good idea on the face of it. Such material is deeply offensive, though quite where it would leave anyone who owns a DVD of Jodie Foster being raped in The Accused isn’t at all clear. Indeed, that is the first of my ten questions for David Cameron.

1     Who will decide what counts as ‘pornography’ and how?

And not just pornography, but images depicting rape? Will this be done automatically, or will there be some kind of ‘porn board’ of people who will scour the internet for images and decide what is ‘OK’ and what isn’t? Automatic systems already exist to do this for child abuse images, and by most accounts they work reasonably well, but they haven’t eradicated the problem of child abuse images. Far from it. If it’s going to be a ‘human’ system – perhaps an extension of the Child Exploitation and Online Protection Centre (CEOP) – how are you planning to fund it, and do you have any idea how much this is going to cost?

2     Do you understand and acknowledge the difference between pornography, child abuse images and images depicting rape? 


One of the greatest sources of confusion over the various messages given out over the weekend has been the mismatch between headlines, sound bites and actual proposals (such as they exist) over what you’re actually talking about. Child abuse images are already illegal pretty much everywhere on the planet – and are hunted down and policed as such. As Google’s spokespeople say, Google already has a zero-tolerance policy for those images, and has done for a while. Images depicting rape are another category, and the idea of making it illegal to possess them would be a significant step – but what about ‘pornography’? Pornography comes in many forms, is generally legal, and most of it has very little to do with either of the first two categories… which brings me to the third question.

3     Are you planning to make all pornography illegal?

…because that seems to be the logical extension of the idea that the essential position should be that ‘pornography’ should be blocked as standard. That, of course, brings up the first two questions again. Who’s going to make the decisions, and on what basis? Further to that, who’s going to ‘watch the watchmen’? The Internet Watch Foundation, that currently ‘polices’ child abuse images, though an admirable body in many ways, is far from a model of transparency (see this excellent article by my colleague Emily Laidlaw). If a body is to have sweeping powers to control content – powers above and beyond those set out in law – that body needs to be accountable and their operations transparent. How are you planning to do that?

4     What about Page 3?

I assume you’re not considering banning this. If you want to be logically consistent – and, indeed, if you want to stop the “corrosion of childhood” – then doing something about Page 3 would seem to make much more sense. Given the new seriousness of your attitude, I assume you don’t subscribe to the view that Page 3 is just ‘harmless fun’, but perhaps you do. Where is your line drawn? What would Mr Murdoch say?

5     What else do you want to censor?

…and I use the word ‘censor’ advisedly, because this is censorship unless you confine it to material that is illegal. As I have said, child abuse images are already illegal, and the extension to images depicting rape is a welcome idea, so long as the definitions can be made to work (which may be very difficult). Deciding to censor pornography is one step – but what next? Censoring material depicting violence? Material ‘glorifying’ terrorism? Anything linking to ‘illegal content’, like material in breach of copyright? It’s a very slippery slope towards censoring pretty much anything you don’t like, whether for political purposes or otherwise. ‘Function creep’ is a recognised phenomenon in this area, and one that’s very difficult to guard against. What you design and build for one purpose can easily end up being used for quite another, which brings me to another question…

6     What happens when people ‘opt-in’?

In particular, what kind of records will be kept? Will there be a ‘list’ of those people who have ‘opted-in to porn’? Actually, scratch that part of the question – there will, automatically, be a list of those who have opted in. That’s how the digital world works – perhaps not a single list, but a set of lists that can be compiled into a complete list. The real question is what you are planning to do with that list. Will it be considered a list of people who are ‘untrustworthy’? Will the police have immediate access to it at all times? How will the list be kept secure? Will it become available to others? How about GCHQ? The NSA? Have the opportunities for the misuse of such a list been considered? Function creep applies here as well – and it’s equally difficult to guard against.

7     What was that letter to the ISPs about?

You know, the letter that got leaked, asking the ISPs to keep doing what they were already doing, but allow you to say that this was a great new initiative? Are you really ‘at war’ with the ISPs? Or does the letter reveal that this initiative of yours is essentially a PR exercise, aimed at saying that you’re doing something when in reality you’re not? Conversely, have you been talking to the ISPs in any detail? Do you have their agreement over much of this? Or are you going to try to ‘strong-arm’ them into cooperating with you in a plan that they think won’t work and will cost a great deal of money, time and effort? For a plan like this to work, you need to work closely with them, not fight against them.

8     Are you going to get the ISPs to block Facebook?

I have been wondering about this for a while, because Facebook regularly includes images and pages that would fit within your apparent definitions, particularly as regards violence against women, and Facebook show no signs of removing them. The most they’ve done is remove advertisements from these kinds of pages – so anyone who accesses Facebook will have access to this material. Will the default be for Facebook to be blocked? Or do you imagine you’re going to convince Facebook to change their policy? If you do, I fear you don’t understand the strength of the First Amendment lobby in the US… which brings me to another question.

9     How do you think your plans will go down with US internet companies?

All I’ve seen from Google so far are some pretty stony-faced comments, but for your plan to work you need to be able to get US companies to comply. Few will do so easily and willingly, partly on principle (the First Amendment really matters to most Americans), partly because it will cost them money to do so, and partly because it will thoroughly piss off many of their US customers. So how do you plan to get them to comply? I assume you do have a plan…

10     Do you really think these plans will stop the ‘corrosion’ of childhood?

That’s my biggest question. As I’ve blogged before, I suspect this whole thing misses the point. It perpetuates a myth that you can make the internet a ‘safe’ place, and absolves parents of the real responsibility they have for helping their kids to grow up as savvy, wary and discerning internet users. It creates a straw man – the corrosion of childhood, such as it exists, comes from a much broader societal problem than internet porn, and if you focus only on internet porn, you can miss all the rest.

Plans like these, worthy though they may appear, do not, to me, seem likely to be in any way effective – the real ‘bad guys’ will find ways around them, the material will still exist, will keep being created, and we’ll pretend to have solved the problem – and at the same time put in a structure to allow censorship, create a deeply vulnerable database of ‘untrustworthy people’ and potentially alienate many of the most important companies on the internet. I’m not convinced it’s a good idea. To say the least.

Paul Bernal is a lecturer in Information Technology, Intellectual Property and Media Law at the University of East Anglia Law School

This post originally appeared on his blog
