From brain Bluetooth to ‘full RoboCop’: where chip implants will be heading soon
With implants now capable of translating thoughts into words in just three seconds, the future is going to get very weird indeed.
Amin Al-Habaibeh, Professor of Intelligent Engineering Systems, Nottingham Trent University
14 April 2025
In the 1987 classic film RoboCop, the deceased Detroit cop Alex Murphy is reborn as a cyborg. He has a robotic body and a full brain-computer interface that allows him to control his movements with his mind. He can access online information such as suspects’ faces and use artificial intelligence (AI) to help detect threats, while his human memories have been integrated with those of a machine.
It is remarkable to think that the movie’s key robotic technologies have now all but been accomplished by the likes of Boston Dynamics’ running, jumping Atlas and Kawasaki’s new four-legged Corleo. Similarly, we are seeing robotic exoskeletons that enable paralysed patients to walk and climb stairs by responding to their gestures.
Developers have lagged behind when it comes to building an interface through which the brain’s electrical pulses can communicate with an external device. This too is changing, however.
In the latest breakthrough, a research team based at the University of California has unveiled a brain implant that enabled a woman with paralysis to livestream her thoughts via AI into a synthetic voice with just a three-second delay.
The concept of an interface between neurons and machines goes back much further than RoboCop. In the 18th century, the Italian physician Luigi Galvani discovered that passing electricity through certain nerves in a frog’s leg made the leg twitch. This paved the way for the whole field of electrophysiology, which looks at how electrical signals affect organisms.
The initial modern research on brain-computer interfaces started in the late 1960s, with the American neuroscientist Eberhard Fetz hooking up monkeys’ brains to electrodes and showing that they could move a meter needle. Yet if this demonstrated some exciting potential, the human brain proved too complex for this field to advance quickly.
The brain is continually thinking, learning, memorising, recognising patterns and decoding sensory signals – not to mention coordinating and moving our bodies. It runs on about 86 billion neurons with trillions of connections, which process, adapt and evolve continuously in what is called neuroplasticity. In other words, there’s a great deal to figure out.
But whereas brain-computer interfaces have typically relied on an external device such as an EEG headset, chip implants are very much the new frontier. They have been enabled by advances in AI chips and microelectrodes, as well as the deep learning neural networks that power today’s AI technology. These allow for faster data analysis and pattern recognition, which, together with the more precise brain signals that can be acquired using implants, has made it possible to create applications that run virtually in real time.
For instance, the new University of California implant relies on electrocorticography (ECoG), a technique developed in the early 2000s that captures patterns from a thin sheet of electrodes placed directly on the cortical surface of someone’s brain.
In this case, the complex patterns picked up by the implant’s 253 high-density electrodes are processed using deep learning to produce a matrix of data from which it’s possible to decode whatever words the user is thinking. This improves on previous models, which could only create synthetic speech after the user had finished a sentence.
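To make the idea concrete, here is a minimal, hypothetical sketch in Python of how a streaming decoder of this kind could be structured: multi-channel ECoG features arrive in short chunks, and a small recurrent network emits speech-unit predictions as it goes rather than waiting for the end of a sentence. Only the channel count (253 electrodes) comes from the article; the architecture, chunk size, vocabulary and the random placeholder input are assumptions for illustration, not the research team’s actual model.

```python
# Illustrative sketch only: a tiny streaming decoder that maps multi-channel
# ECoG features to speech-unit probabilities chunk by chunk. The 253 channels
# match the article; everything else is an assumption, and the input below is
# random noise standing in for real brain signals.
import torch
import torch.nn as nn

N_CHANNELS = 253      # high-density ECoG electrodes, per the article
CHUNK_STEPS = 20      # time samples per streamed chunk (assumed)
N_UNITS = 40          # size of a hypothetical speech-unit vocabulary

class StreamingDecoder(nn.Module):
    """Small recurrent decoder: ECoG features in, speech-unit logits out."""
    def __init__(self):
        super().__init__()
        self.gru = nn.GRU(input_size=N_CHANNELS, hidden_size=128, batch_first=True)
        self.head = nn.Linear(128, N_UNITS)

    def forward(self, x, hidden=None):
        # x: (batch, time, channels). The hidden state carries context across
        # chunks, which is what lets the decoder emit output mid-sentence.
        out, hidden = self.gru(x, hidden)
        return self.head(out), hidden

decoder = StreamingDecoder().eval()
hidden = None
with torch.no_grad():
    for chunk_idx in range(5):                            # pretend 5 chunks arrive live
        chunk = torch.randn(1, CHUNK_STEPS, N_CHANNELS)   # placeholder signals
        logits, hidden = decoder(chunk, hidden)
        unit = logits[0, -1].argmax().item()              # most likely unit right now
        print(f"chunk {chunk_idx}: predicted speech unit {unit}")
```

The key design point the sketch tries to capture is the streaming loop: because the recurrent state is carried forward between chunks, predictions can be produced continuously rather than only once a sentence is complete.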
Elon Musk’s Neuralink has been able to get patients to control a computer cursor using similar techniques. However, it’s also worth emphasising that deep learning neural networks are enabling more sophisticated devices that rely on other forms of brain monitoring.
Our research team at Nottingham Trent University has developed an affordable brainwave reader, built from off-the-shelf parts, that enables patients suffering from conditions such as completely locked-in syndrome (CLIS) or motor neurone disease (MND) to answer “yes” or “no” to questions. There’s also the potential to control a computer mouse using the same technology.
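As a rough illustration of the simpler end of this spectrum, the sketch below shows how a yes/no decision might be made from a single short window of brainwave data: compute a couple of band-power features and feed them to a basic classifier. The sampling rate, frequency bands, classifier and synthetic training data are all assumptions chosen for illustration; this is not the actual Nottingham Trent design.

```python
# Illustrative sketch only: decide "yes" or "no" from one short EEG window
# using simple band-power features and a basic classifier. All parameters and
# the synthetic data are placeholders, not a real device's settings.
import numpy as np
from sklearn.linear_model import LogisticRegression

FS = 250  # assumed sampling rate in Hz

def band_power(window, low, high, fs=FS):
    """Average spectral power of one EEG channel in a frequency band."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

def features(window):
    # Alpha (8-12 Hz) and beta (13-30 Hz) power as a two-number summary.
    return [band_power(window, 8, 12), band_power(window, 13, 30)]

rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1) * 50:                                # 100 synthetic labelled windows
    window = rng.standard_normal(FS) * (1.0 + label)     # "yes" windows carry more power
    X.append(features(window))
    y.append(label)

clf = LogisticRegression().fit(X, y)

new_window = rng.standard_normal(FS) * 1.8               # placeholder "live" window
answer = "yes" if clf.predict([features(new_window)])[0] == 1 else "no"
print("Decoded answer:", answer)
```

In practice a real system would use recorded, labelled brain signals and careful calibration per patient, but the basic pattern of feature extraction followed by a lightweight classifier is what makes such a device feasible with off-the-shelf parts.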
The future
The progress in AI, chip fabrication and biomedical tech that enabled these developments is expected to continue in the coming years, which should mean that brain-computer interfaces keep improving.
In the next ten years, we can expect more technologies that provide disabled people with independence by helping them to move and communicate more easily. This entails improved versions of the technologies that are already emerging, including exoskeletons, mind-controlled prosthetics and implants that move from controlling cursors to fully controlling computers or other machines. In all cases, it will be a question of balancing our increasing ability to interpret high-quality brain data with invasiveness, safety and costs.
It is in the medium to long term, though, that I would expect to see many of the capabilities of a RoboCop, including planted memories and built-in trained skills supported by internet connectivity. We can also expect to see high-speed communication between people via “brain Bluetooth”.
It should similarly be possible to create a Six Million Dollar Man, with enhanced vision, hearing and strength, by implanting the right sensors and linking them to actuators, the components that convert neural signals into action. No doubt applications that haven’t yet been thought of will also emerge as our understanding of brain functionality increases.
Clearly, it will soon become impossible to keep deferring ethical considerations. Could our brains be hacked, and memories be planted or deleted? Could our emotions be controlled? Will the day come when we need to update our brain software and press restart?
With every step forward, questions like these become ever more pressing. The major technological obstacles have essentially been cleared out of the way. It’s time to start thinking about the extent to which we want to integrate these technologies into society; the sooner, the better.
Amin Al-Habaibeh receives funding from Innovate UK, the British Council, the Royal Academy of Engineering, EPSRC, AHRC, and the European Commission.
This article is republished from The Conversation under a Creative Commons license.