Some of the largest and most popular digital businesses in the world have started to market their services through an emphasis on greater privacy. They are building end-to-end encryption into their services: securing messages and data so that only the end users can see what is being shared.
The tech companies market these innovations as an unalloyed good, ensuring user anonymity and privacy.
However, privacy is not an absolute right, and for good reason. The UK’s Human Rights Act states that our personal information, including messages we write or images we create, “should be kept securely and not shared without our permission, except in certain circumstances.” It is therefore a qualified right, allowing public authorities to intervene when the interests of the wider community, or other people’s rights, are affected. In contrast, the right to protection from ‘torture and inhuman or degrading treatment’, which covers images of child abuse, is an absolute right and must never be limited or restricted.
‘Unbreakable’ end-to-end encryption threatens to frustrate this important legal principle by enabling nefarious actors to circulate the internet’s most horrendous material with ease, including images of the sexual abuse of children. Companies that implement such technology will make it far harder to detect and remove such images. The UK-based No Place to Hide campaign estimates that 14 million reports of suspected child sexual abuse online could be lost each year.
The UK’s Online Safety Bill, as drafted, will oblige service providers to limit the presence and dissemination of illegal content and to take responsibility for “legal but harmful” content.
Some tech enthusiasts claim the Bill will effectively ban end-to-end encryption, and they object to this as a threat to privacy.
In fact, the legislation needs to go further if it is to offer adequate protection: otherwise, new services coming to market may evade the ultimate enforcement sanction – the blocking of access in the UK – or may be designed in ways that make it difficult to deliver the requirements of the Bill.
The risk is that such services will make it very difficult for Ofcom to regulate in the way the legislation intends, and will frustrate law enforcement agencies’ efforts to bring online criminals to justice.
We don’t need a ban on encryption, but we do need more safeguards in the Bill.
The Internet Commission proposes that, before rolling out a new innovation, a tech company should be required to assess the risks and demonstrate how vulnerable groups can be protected and illegal content detected. Companies need incentives to develop the tools to do this. Ofcom approval or licensing of encrypted services would ensure the new regulatory regime remains robust.
In its latest report, the Internet Commission found that where private messaging is a secondary function of a service, safety from illegal and harmful content can take precedence; and that where companies are unable to scan messages, they are incentivised to innovate and develop tools that detect illegal and harmful content – without scanning the content of private messages.
The Bill poses a challenge to tech companies: how can they use their resources and expertise to design services that ensure the safety of all our families as well as the privacy of their individual users?
We wouldn’t accept a car company that built and sold a car that could not comply with UK traffic laws, nor would we accept “educate pedestrians to stay out of the way” as a satisfactory mitigation.
Yet this is the sort of logic that some tech companies put forward. The industry can do better. UK citizens deserve it, and the Government should insist upon it.
Alex Towers is director of policy and public affairs at BT Group. Patrick Grady is a project lead at the Internet Commission.