Imagine a government scheme to catalogue and classify every single video on the web.
But you don’t need to imagine: that’s the bizarre proposal being put forward by Theresa May’s government in the Digital Economy Bill, which reached committee stage in the Lords this week.
The Digital Economy Bill proposes that online video should be classified just as films are now, and by the same people – the British Board of Film Classification.
According to the BBFC’s annual report for 2015, the board classified 983 films for distribution in the UK, and 1,143 hours of online content for video-on-demand.
To put this in context, hundreds of hours of video are uploaded to YouTube every minute. The expectation that the BBFC could pass judgment on the sheer quantity of video uploaded to countless sites across the world is extremely ambitious.
As David Austin of the BBFC tactfully put it in a letter to Joanna Shields, the minister for Internet Safety and Security, “the internet is constantly evolving and inevitably any initiative in this area needs to be multi-faceted and flexible.” Well quite.
How could this possibly work? Classification can’t simply be left to an algorithm, so we’re left with the prospect of a huge recruitment drive for people to sit around all day and decide what is and isn’t acceptable, as suggested by the Open Rights Group’s New Government Jobs spoof adverts.
But let’s give the government the benefit of the doubt and suppose that this plan has been fully costed and can scale, and that thousands of people across the country will be paid to watch and classify millions of hours of video and decide who’s allowed to watch it.
The next step is placing age verification on sites.
The bill states “A person must not make pornographic material available on the internet on a commercial basis to persons in the United Kingdom except in a way that secures that, at any given time, the material is not normally accessible by persons under the age of 18.”
“Commercial basis” here includes free-to-access sites, and “pornographic material” is a phrase that covers a multitude of sins, from R18 content (of the type not to be supplied “other than in a licensed sex shop”) to video of which it is “reasonable to assume from its nature that it was produced solely or principally for the purposes of sexual arousal.”
There is no indication of how age verification will work. A basic option would be verification through credit cards, but this would expose internet users to a greater risk of credit card fraud – and it would not take much for a young person to ‘borrow’ an adult’s card for the purpose of identification. Other proposals could mean porn sites building databases of which sites are viewed by whom – surveillance in all but name, with the added risk of accidental public disclosure.
Given the trail of leaks of personal data in the last few years, such as the Ashley Madison hack, is anyone comfortable with the idea of a record of every risqué but legal site they have visited?
Given these hacks, you would think that the bill would at least require that internet users’ privacy be protected, but it contains no such obligations.
How can the British government impose age verification on the porn industry, especially when the majority of porn sites are based outside the UK? This was not adequately thought through in the first draft of the bill, but an amendment has since been added giving the BBFC the power to tell internet service providers to block sites that don’t comply.
So the BBFC could be in charge of a censorship regime, overseeing the blocking of thousands of websites whose content is perfectly legal, without a court order. Once this is in place to censor porn sites, how else could it be used?
Editor’s Note: On 7 February, the DCMS provided the following response: “Government has not suggested cataloguing or classifying every video on the internet. Age verification proposals are aimed at protecting children by making sure the same rules and safeguards that exist in the physical world also apply online. We are working with the BBFC to make sure there are appropriate data protection mechanisms included in age verification technology deemed suitable for this purpose.”