30 January 2024

Are you ready for Elon Musk to read your mind? 

For stroke victims and those who are paralysed, devices that can interpret thoughts are a lifeline. But should Big Tech have access to our brains?

By Sophie McBain

Editor’s note: This piece was originally published on 27 January 2024. On 29 January Elon Musk reported that his Neuralink company had successfully implanted one of its wireless brain-computer interface chips in a human for the first time. This follows a May 2023 decision by the FDA to grant Musk’s company permission to test the chips on humans. 

Andy Mullins was 51 years old when he collapsed at his job on the London Underground. He had suffered a large stroke and spent three months in a coma. When he awoke, in April 2014, the left side of his body was paralysed. He had only a tiny flicker of sensation between his thumb and index finger, but this gave him hope that he’d regain mobility. Mullins had served in the army as a young man, and he drew on his military background to throw himself into the painstaking work of physical rehabilitation. “For probably the first year, all you could get out of me was ‘positive mental attitude’,” Mullins told me, speaking from his home in west London. The flicker became movement, the sensation spread, and over many months he relearned how to walk.

One year after his stroke, Mullins joined a charity support group called Different Strokes. He walked into his first survivors’ meeting and burst into tears. “That’s the part about strokes that people don’t understand,” Mullins said. “They understand the physical bit but it’s the mental bit that really gets you. Going from an active person to a disabled person is hard to comprehend.”

Through Different Strokes, Mullins began volunteering for various scientific trials and tests, hoping to contribute to efforts to improve stroke outcomes. This was how he encountered Cogitat. The company, a spin-off from Imperial College London, is developing wearable brain-computer interfaces: devices that use AI to interpret brain data, translating a user’s thoughts into digital action on a computer. Last year Cogitat received £500,000 from the government’s Innovate UK fund to partner with the NHS for a medical trial to determine if its technology can enhance stroke rehabilitation.

Because of damage to the brain, many stroke survivors find their body doesn’t respond as they want it to; they might will themselves to clench their fist but find their fingers move very little or not at all. Cogitat’s technology can respond to the intention alone. The hope (and the company is very much in the hope stage) is that a mind-controlled video game will be able to alert someone when the correct part of their brain is being activated, even if they are not yet able to produce much movement, helping to retrain damaged neural pathways. The thinking goes that where physiotherapy focuses on the physical part of movement, encouraging people to repeat the same movement until they can do it with ease, Cogitat’s games can focus on the brain side of movement. The company also hopes that gamifying rehabilitation will help people stick with what can otherwise be boring physio regimes.
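
In engineering terms, the idea is a closed neurofeedback loop: score a short window of brain data for movement intention, then reward the player when the motor system activates. The Python sketch below is purely illustrative – the decoder is a random stand-in, every name is invented, and none of it reflects Cogitat’s actual software.

```python
import numpy as np

rng = np.random.default_rng(0)
FEEDBACK_THRESHOLD = 0.6  # confidence above which the game rewards the player (arbitrary)

def decode_intention(window: np.ndarray) -> float:
    """Hypothetical decoder: returns P(movement intention) for one window of
    brain data. In a real system this would be a trained model; here it is a
    random stand-in so the loop runs end to end."""
    return float(rng.uniform(0.0, 1.0))

def rehab_game_trial(trial: int) -> None:
    """One trial: the player attempts a movement, the decoder scores the brain
    activity, and the game gives feedback even if the hand barely moves."""
    window = rng.standard_normal((8, 256))  # fake data: 8 channels x 1 s of samples
    confidence = decode_intention(window)
    if confidence >= FEEDBACK_THRESHOLD:
        print(f"Trial {trial}: the right part of your brain lit up - well done!")
    else:
        print(f"Trial {trial}: not quite - imagine the movement and try again.")

if __name__ == "__main__":
    for t in range(10):
        rehab_game_trial(t)
```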


For Mullins, Cogitat’s focus on the mind was a revelation. He took on the role of adviser to the company, giving tips on how to design games that can be played by people whose brain damage causes aphasia – an impairment of language comprehension and expression. He thinks that for someone coming to terms with an altered brain and reduced mobility, it can be immensely powerful to experience operating a computer with their mind alone. “That’s why I like Cogitat,” he said. “I’ve seen what my brain can do, and it’s amazing.”

Brain-computer interface (BCI) technology has advanced dramatically in recent years. In September, Elon Musk’s Neuralink announced it was recruiting for its first human trials to implant a BCI into the brains of people who are paralysed, with the intention of enabling them to operate an external computer with their thoughts. Neuralink is following in the footsteps of two other US firms, Blackrock Neurotech and Synchron, whose implants have already been tested on a small number of people. In trials, implanted BCIs have enabled paralysed people to operate computers or robotic limbs with their thoughts, translate imagined handwriting into text and restore movement in paralysed limbs. Research groups have achieved extraordinary feats: in August 2023, for example, US scientists announced that they had enabled a stroke survivor to speak, via a digital avatar, for the first time in 18 years after implanting a BCI that could decode her brain’s electrical signals as she tried to talk.


While Musk and his competitors are betting on implants, Cogitat is more interested in building software for external devices, the first of which are already available to consumers. For a little over $1,000, you can now buy a headset called “The Crown” that promises to improve your focus by reading brain signals via electrodes, and then playing tailored music to help keep you in the zone. All BCIs operate on the same principle: machine learning is applied to the task of interpreting the brain’s electrical signals – in the case of wearables, electroencephalogram (EEG) readings taken from the scalp. Devices implanted in the brain can gather more detailed data but they require a user to undergo surgery, which carries risk. Wearable devices are an easier sell for most people, and they enable researchers to train the AI on many more subjects. The goal is not to train AI to become good at interpreting one person’s brain, but to train it to interpret anyone’s brain.
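
To make the “anyone’s brain” goal concrete: one standard way to test whether a decoder generalises across people is leave-one-subject-out evaluation – train on every subject except one, then test on the brain the model has never seen. The sketch below uses synthetic data and scikit-learn; it is a generic illustration under those assumptions, not Cogitat’s or Neuralink’s actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(42)

# Synthetic stand-in for EEG features: 20 subjects, 50 trials each, 16 features per trial.
n_subjects, trials_per_subject, n_features = 20, 50, 16
X = rng.standard_normal((n_subjects * trials_per_subject, n_features))
y = rng.integers(0, 2, size=n_subjects * trials_per_subject)    # e.g. imagined movement vs rest
groups = np.repeat(np.arange(n_subjects), trials_per_subject)   # which subject each trial came from

# Leave-one-subject-out: every fold is tested on a brain the model never trained on.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Accuracy on unseen subjects: {scores.mean():.2f} +/- {scores.std():.2f}")
```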

Cogitat’s CEO and co-founder, Allan Ponniah, told me his company wanted to “take science to the next level” by enabling people to communicate with technology directly through their brains, unmediated by movement or speech. He believes that wearable brain-computer interfaces will become commonplace, in medicine and in the gaming industry, within the next five years. “We can find out what’s going on inside your head, and that’s kind of the final frontier,” he said. But are we ready to breach it?

On a bright morning in early December, I visited Cogitat’s base in Imperial College, with Ponniah and the company’s chief technical officer, Dimitrios Adamos, an honorary senior research fellow at Imperial. Adamos has long been interested in ways to bring BCI technology out of the lab. Before co-founding Cogitat, he studied EEG data taken from people while they listened to music and developed an algorithm that could determine how much a person enjoyed what they were listening to. His idea – which is either exciting or dystopian, depending on your perspective – was that streaming services would no longer need to rely on listeners’ active feedback, such as adding a song to a playlist or clicking “like” on a track; they would be able to assess people’s enjoyment of songs automatically. It didn’t take off, but he did partner with Spotify for a publicity campaign in Norway that used the tech to discern the music tastes of famous Norwegian musicians.

At Cogitat, Adamos is helping to develop an algorithm that can decode activity in the brain’s motor cortex, which controls movement. In 2021 Cogitat beat 100 teams, among them the US Army, in an international competition to accurately identify mental states related to movement from EEG data.
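
Decoding “mental states related to movement” from EEG typically means classifying short epochs of multi-channel signal. Below is a minimal, hedged sketch of that task, using a classic log-variance (band-power) feature and linear discriminant analysis on invented data; real competition entries rely on far more sophisticated models, and nothing here is Cogitat’s algorithm.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Fake epochs: (n_trials, n_channels, n_samples). "Movement" trials get damped
# amplitude on the first four channels, loosely mimicking the drop in mu/beta
# power (event-related desynchronisation) seen over the motor cortex.
n_trials, n_channels, n_samples = 200, 8, 250
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)        # 0 = rest, 1 = movement imagery
X_raw[y == 1, :4, :] *= 0.7

# Classic feature: log-variance (a proxy for band power) of each channel per epoch.
features = np.log(X_raw.var(axis=2))

X_train, X_test, y_train, y_test = train_test_split(features, y, test_size=0.25, random_state=0)
clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
print(f"Held-out accuracy on this invented data: {clf.score(X_test, y_test):.2f}")
```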

This tiny, university-based start-up has ambitions to become the “Microsoft of brain-computer interfaces”, Ponniah told me. Ponniah also works as a plastic surgeon at London’s Royal Free Hospital, where he uses AI to assist in facial reconstruction. He often dials into Cogitat meetings from the hospital, fresh out of the theatre, but on this occasion he was dressed in a three-piece suit. To show how Cogitat’s mind-controlled games work, Ponniah and Adamos led me to a small, windowless room, where a headset made by the Spanish company Bitbrain was already set up in front of a computer monitor. It looked much like a large pair of headphones, only with a few extra headbands, each of which was lined with electrodes that a researcher positioned across my scalp. First, to help the scientists calibrate the headset, I played a robot sorting game, for which I needed to clench and relax my fists. The game responded not to my hand movements but to the corresponding electrical activity in my brain – which in itself felt strange, but a familiar kind of strange, the strangeness of using a touch screen or Face ID for the first time.

It was during the second game that things began to feel much more surreal. This time I didn’t need to move my body at all; I needed only to think about moving for the computer to respond. A video instructed me that when a green ball – designed to appear a bit like a wizard’s crystal ball floating above an old-fashioned writing desk – appeared on screen, I should imagine clenching my fist to inflate it, and when a red ball appeared I should imagine relaxing my hands until the ball shrank to nothing.

Adamos advised that it can take people some time to work out how to correctly imagine the movement without enacting it. He’s noticed that athletes and musicians tend to take to it more easily, perhaps because they are used to imagining movement. Unfortunately, I am terrible at both music and sport. I found myself wondering how well we can assess our own thoughts – would I even know if the computer was responding to my vivid imaginings, rather than some other brain signal or perhaps even my effortful squinting at the screen? Is there such a thing as imagining harder? And how on Earth was I supposed to successfully direct my imagination on cue when my mind had already wander… Shit, a red ball! In my head, my hands went limp. And the ball shrank! A green ball – I pictured clenching my fists. The ball grew! When the game stopped a few minutes later, Adamos looked delighted. He hadn’t seen anyone pick up the game so fast, and he sent a photograph of my scores to show the rest of the team.

A game this simple ought not to feel exhilarating, but it is a powerful sensation to make something happen through thought alone, to feel, if only for a moment, like a human cyborg. If I could compare it to anything, playing a computer game with your mind is like talking while wearing noise-blocking headphones or applying make-up in the dark – robbed of our usual reference points, even familiar routines begin to feel awkward and alien. If the way we understand ourselves is largely in relation to others and in how we assess the boundaries between ourselves and the world, how might this technology – which renders our private, subjective experiences public – shape our self-image?

Writing in The Conversation last year, two New England philosophers argued that BCI technology, because it can “meld mind and machine in a way no other technology can”, was a first step towards what futurists call the singularity – the point at which AI becomes so advanced that it irreversibly changes human civilisation, rendering us inseparable from machines. “If the singularity comes, will we even notice it?” they asked. And so, when I say the game was exhilarating, I partly mean I felt the exhilaration of someone standing on the edge of the abyss, staring down.

The neuroethicist Stephen Rainey, a senior researcher and lecturer at Delft University of Technology in the Netherlands, is concerned that BCI technology is often overhyped – it cannot read people’s minds, he wants to emphasise; it can only read and interpret brain data and make inferences about people’s thoughts. But he believes it nonetheless raises urgent ethical questions. One of the biggest of these relates to privacy: how much brain data should companies be allowed to collect, and what should they be allowed to do with it?

It is in companies’ interests to collect and keep as much brain data as possible but, Rainey explained when we spoke by video call, “the more data we have, and the more unsupervised learning that goes on with that, well we don’t know what will happen with that in the future. Do we get neural algorithms that can do other things we didn’t expect? Or do we get models of minds, which could make predictions about people that don’t feel quite right? Or, even if they are accurate predictions, do we want machines making accurate predictions about us based on data we didn’t volunteer?”

The lack of legislation protecting how brain data is used was recently raised by the UK’s Information Commissioner’s Office (ICO). It also flagged the “real danger” that biases built into BCI algorithms could give rise to new forms of discrimination. If organisations begin deeming certain neuropatterns undesirable, for example, people may not be able to rely on existing anti-discrimination laws to protect them, because our brain make-up is not a protected characteristic. “So, you can’t just not give me a job because I’m Irish – you have to think of a better reason,” Rainey said. But “if as part of my HR hiring process, I have a neuroscreening thing, you could exclude me on that basis – there’s no protected characteristic”.

Rainey also raised the point that when brains and machines merge, concepts such as intent and responsibility can become murky. Imagine that in a heated argument, a person has a strong but fleeting desire to punch their antagonist, and the BCI acts on that desire. How should you adjudicate when the person alleges that they never intended to act that way, and when the company behind the BCI – relying on its complex and opaque algorithms for determining intention – claims that they did? Challenged in this way, would we even start to doubt ourselves? Rainey is concerned that people’s testimonial authority could be eroded. Even if the science is shaky, it becomes hard to dispute “experts” who claim they can examine brain data to ascertain what someone is really thinking. In India, amid much controversy, brain-scanning has been used in court cases for more than a decade to try to ascertain guilt. 

Ultimately, Rainey argues, the best means we have of protecting ourselves against abuses of this technology is by strengthening data protection laws; by, for example, extending the legislation to cover brain data. “These machines process brain data, and if you want to steer the development of these and systems in general, we need to have control of the data, and that means data protection legislation,” he said.

Elon Musk, I ventured, might not be motivated by purely altruistic concerns when it comes to implanting Neuralink into people with paralysis. If he’s “trying to create an artificial brain, he wants [to collect] as much brain function data as he can… Let’s say that’s a legitimate aim, but you’re not allowed to just do whatever to get there,” Rainey said. “We’re all involved. Even if only 1 per cent of people in the world gave up their brain data, it has impacts… because the models that are made could be used for or against us. Everyone wants to talk about your rights – what new human rights do you want? Transhumanism? – and I’m thinking: data protection.” 

For all his concerns, Rainey believes that the advance of BCI technology is ultimately a positive thing, because of the remarkable leaps being made in assistive technologies for people with disabilities and locked-in syndrome, and because of the new medical possibilities that are opening up. BCIs have been shown to help people with Parkinson’s disease and epilepsy, and researchers are exploring a host of other applications.

Beyond stroke rehabilitation, Cogitat is interested in testing other potential medical uses for its technology, including developing a machine that can detect signs of dementia or Alzheimer’s at an early stage, or one that can alert surgeons when they are making a mistake. But first it must get its technology ready for its NHS trial.

For Andy Mullins, the stroke survivor, the prospect of the NHS trial was cheering. He found it bleak to consider how little medicine currently has to offer people who have overnight lost their mobility and independence. This technology offered a promise that soon others might not have so difficult a rehabilitation process. It was, he told me, “all about hope.”

