Remember that time I stupidly ate a cream cake just before taking a blood test, causing my GP to phone in a panic and order that I begin a crash diet to avert diabetes and heart failure? Hopefully, somewhere in a thick file of decaying paper, my doctors’ surgery does.
But who else do I want to know? And if that fact, together with thousands of others about me, is going to be anonymised and aggregated for commercial use, what safeguards do I want to place on that data’s use? These are the questions millions of us should be asking in advance of the government’s 23 June deadline for opting out of the GP Data for Planning and Research project (GPDPR).
If you want to know why, try googling “GP data opt-out”. Chances are that you will land on an NHS webpage, last updated on 20 May, saying “a number of posts are circulating on social media about the national data opt-out, containing incorrect information” and claiming “there is no deadline”. The problem is, that page refers to a completely different data opt-out from the one in the headlines – but it is the page Google’s algorithm takes you to.
There is a deadline and it is 23 June. And the messaging debacle, as first reported by The Register, has made doctors fearful that patients will lose trust regardless of the privacy issue.
But the privacy issue is real and fundamental. After 23 June the NHS will scrape the data of 55 million people held by GP surgeries in England. The data will be encrypted and anonymised – with the anonymity made purposely reversible if permitted under law. It will include data about domestic and sexual violence, addictions, treatments, sexually transmitted diseases and numerous other things you would really not want subject to an anonymity-reversal request in a country run by incompetents and liars.
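To see what “reversible anonymity” means in practice, here is a rough sketch of the general technique – my own toy illustration, not NHS Digital’s actual pipeline: each patient identifier is swapped for a pseudonym, but whoever holds the key table can map it straight back.

```python
# A rough illustration of reversible pseudonymisation (my own sketch,
# not NHS Digital's actual pipeline). Identifiers are replaced with
# random pseudonyms, but anyone holding the key table can reverse it.
import secrets

key_table = {}       # pseudonym -> real identifier (held separately)
reverse_index = {}   # real identifier -> pseudonym (so records stay linkable)

def pseudonymise(nhs_number: str) -> str:
    """Replace an identifier with a stable, random pseudonym."""
    if nhs_number in reverse_index:
        return reverse_index[nhs_number]
    pseudonym = secrets.token_hex(8)
    key_table[pseudonym] = nhs_number
    reverse_index[nhs_number] = pseudonym
    return pseudonym

def reidentify(pseudonym: str) -> str:
    """The 'reversible' part: with the key table, anonymity disappears."""
    return key_table[pseudonym]

record = {"patient": pseudonymise("943 476 5919"), "diagnosis": "type 2 diabetes"}
print(record)                          # looks anonymous
print(reidentify(record["patient"]))   # unless you hold the key
```

The point of the sketch is simply that “anonymised” data of this kind is only as private as the governance around the key.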
There are numerous safeguards. Both the British Medical Association and the Royal College of GPs will sit on the oversight board, together with the Independent Group Advising on the Release of Data (IGARD). If you’ve never heard of that, welcome to the club. It’s yet another group of experts set up by a Tory government that thinks we rely too much on experts. It is said to consist of “information specialists, doctors, lawyers, researchers, ethicists and lay members”. To which my response as a patient concerned with privacy is a long and querulous “Okaaaaaaay…”
There are seven current members of IGARD. Three have links with the Ministry of Defence, including Professor Nicola Fear, a military epidemiologist who also sits on the SPI-B committee, an offshoot of Sage that is advising the government on mass behavioural strategies during the Covid-19 pandemic.
How do you get elected to IGARD? You don’t. You get recruited. The only “ethicist” seems to be a nurse with an MA in healthcare ethics. Of the two lay members, one is a corporate lawyer and the other a patient representative. Having ploughed through the minutes of an IGARD meeting, I am absolutely certain everyone on this committee would review any request by Palantir or Google with the utmost professionalism. But I don’t trust IGARD.
Every week it has to field data requests from an array of high-powered civil servants, and it’s on that side of the table that all the information science firepower sits. For what is essentially an ethics committee, it is remarkably devoid of philosophers and astonishingly well connected to the military. And we’ve seen, via Sage and SPI-B during the Covid pandemic, how easily expert panels can be colonised and co-opted by those in power, and dragged towards a political agenda.
I do not, in principle, object to my GP data being anonymised, aggregated and used for research. If given a choice, I would stipulate its use only by non-profits and academia, with a blanket no-access rule for the emerging field of “behavioural science”, which, as Dominic Cummings rightly stated, is awash with dubious methodologies. I would ban absolutely any exploitation by companies such as Palantir or Google’s DeepMind – even though I know that might retard genuine scientific advances – because I don’t trust those companies’ ethical models.
Above all, I would be aware that, even if anonymised, my healthcare data, linked to a DNA profile, would effectively hand a private health insurer the ability to price me, or someone matching my description, in or out of a future private healthcare system. Since an insurance-based health system is exactly what US medical corporations have been pushing on the UK for decades, I would need reassurance that there will never be an attempt to privatise the NHS – which would be like buying a gold watch on Oxford Street.
To protect the data I want to share, I would rather trust something much simpler than a government-appointed committee, namely, the law. I already have the legal right to withhold my data from corporate exploitation, and I don’t need a corporate lawyer or a random person from a patient group to stand as gatekeepers between me and my data protection rights.
The GPDPR scandal illustrates one of the most important conflicts of the 21st century – the three-cornered fight between citizens, corporations and states over who owns our digital identities. Aggregated data is the gold of the new economy. With it, corporations can do much more than sell you beer and Pringles. They can predict and therefore influence your behaviour. They can exert algorithmic control over your choices, just as the Google search might take you to the wrong page if you are worried about the GP data grab. They can, in short, misinform you and control you.
Asymmetric power over data has, in the 20 years since the broadband and mobile data revolution began, become embedded into our social relationships. But the fight has only just begun.
If you can bear to read the minutes of an IGARD meeting, where civil servants haggle with the committee of appointed experts over data access requests, you will see immediately what is missing: the voices of corporations and government are strong, but the unmediated voices of citizens are absent.
So what we need to do, strategically, is to fight for inalienable data rights. It’s entirely possible, using blockchain technology, to allow every citizen to give and revoke control over the data they generate, not just in the healthcare system but through our interactions with governments, corporations and each other. This is the principle behind the EU-funded Decode project, whose initial advisory panel I sat on between 2017 and 2019, and which has left a legacy of open-source tools for digital democracy and privacy.
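What might giving and revoking control look like? Here is a toy sketch of the underlying idea – my own illustration, not Decode’s software: an append-only, hash-chained log in which the citizen grants or revokes access to a category of their data, and the most recent entry wins.

```python
# A minimal sketch of citizen-controlled consent records: an append-only,
# hash-chained log of grant/revoke decisions. A toy illustration only,
# not the Decode project's actual implementation.
import hashlib, json, time

class ConsentLedger:
    def __init__(self):
        self.entries = []

    def _append(self, action: str, grantee: str, category: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"action": action, "grantee": grantee,
                "category": category, "time": time.time(), "prev": prev_hash}
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)
        return body

    def grant(self, grantee: str, category: str) -> dict:
        return self._append("grant", grantee, category)

    def revoke(self, grantee: str, category: str) -> dict:
        return self._append("revoke", grantee, category)

    def is_permitted(self, grantee: str, category: str) -> bool:
        """The latest grant or revoke entry for this pair decides."""
        allowed = False
        for entry in self.entries:
            if entry["grantee"] == grantee and entry["category"] == category:
                allowed = (entry["action"] == "grant")
        return allowed

ledger = ConsentLedger()
ledger.grant("university_research", "gp_records")
ledger.revoke("university_research", "gp_records")
print(ledger.is_permitted("university_research", "gp_records"))  # False
```

In a real system the log would be distributed and verifiable rather than sitting in one process, but the principle is the same: consent is a record the citizen writes, not a box a committee ticks on their behalf.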
Tactically, we need to tell the NHS to think again over GPDPR. It tried this extensive data grab once before, in 2014, and it failed because the public rejected it. There is nothing Luddite or selfish about withdrawing access to your GP data until the trust, governance and privacy issues are resolved. Cummings’s voluble testimony to MPs last week revealed that lurking within modern conservatism is a penchant for tech authoritarianism.
He told MPs he would have appointed a dictator to manage Covid-19: “he has as close to kingly authority as the state has legally to do stuff, and he is pushing the barriers of legality, he is in charge of everybody, he can fire anybody and he can move anybody and he can jiggle the whole thing around”.
Students of tech-bro authoritarianism will know that this view – neo-monarchism – has strong supporters among the neoliberal US right. It was the leitmotif of the Mencius Moldbug blogs produced by programmer Curtis Yarvin and, before that, the work of economist Hans-Hermann Hoppe. The logic is that you should let the market run society but, if you need a state, it should be monarchical with absolute powers. So Cummings’s remark was not some quirk or in-joke: it’s a serious philosophical proposition on the right.
Even if Johnson’s government is today restrained, divided and stumbling from one crisis to the next, it is clear that Cummings intends to return to power. His work is not done. Hand your GP data over to the British state today, and there is no guarantee that the safeguards and rules that protect it now will last.
For that reason I will be filling in the opt-out form, and invite readers to do likewise.