Scientists in Belgium have tested the security of a wireless brain implant called a neurostimulator – and found that its unprotected signals can be hacked with off-the-shelf equipment.
And because this particular bit of kit resides amid sensitive gray matter – to treat conditions like Parkinson's – the potential consequences of successful remote exploitation include voltage changes that could result in sensory denial, disability, and death.
In a paper, Securing Wireless Neurostimulators, presented at the Eighth ACM Conference on Data and Application Security and Privacy last month, the researchers described how they reverse engineered an unnamed implantable medical device, and how they believe its security can be improved.
They had help doing so from the device's external programmer unit, but argue that an adversary could accomplish as much without it, though not as quickly.
Beyond these rather dire consequences, the brain-busting boffins – Eduard Marin, Dave Singelee, Bohan Yang, Vladimir Volskiy, Guy Vandenbosch, Bart Nuttin and Bart Preneel – suggest private medical information can be pilfered.
That's hardly surprising given that the transmissions of the implantable medical device in question are not encrypted or authenticated.
What is intriguing is that the researchers suggest future neurostimulators are expected to utilize information gleaned from brain signals such as the P-300 wave to tailor therapies. Were an attacker to capture and analyze that signal, they suggest, private thoughts could be exposed.
They point to related research from 2012 indicating that attacks on brain-computer interfaces have shown "that the P-300 wave can leak sensitive personal information such as passwords, PINs, whether a person is known to the subject, or even reveal emotions and thoughts."
Can the brain be a better defense?
To mitigate this speculative risk, the boffins propose a novel security architecture involving session key initialization, key transport and secure data communication.
Implants of this sort, the researchers say, typically rely on microcontroller-based systems that lack random number generation hardware, which makes encryption keys unnecessarily weak.
The session key enabling symmetric encryption for wireless communication between the implant and a diagnostic base station could be generated by a developer and inserted into the implant. But the researchers contend there's a risk of interception, and potentially a need for extra security hardware that would make the implant bulkier.
They believe there's an alternative: Using the brain as a true random number generator, a critical element for secure key generation.
"We propose to use a physiological signal from the patient’s brain called local field potential (LFP), which refers to the electric potential in the extracellular space around neurons," the paper explains.
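To make the idea concrete, here is a minimal sketch of how noisy LFP samples might be condensed into a fixed-size session key. The function name, the choice of keeping only the least significant bit of each sample, and the use of SHA-256 as an extractor are illustrative assumptions, not the paper's actual scheme:

```python
import hashlib
import secrets  # used here only to simulate noisy LFP samples


def lfp_to_session_key(samples, key_bytes=16):
    """Condense noisy LFP samples into a symmetric session key.

    Keeping only each sample's least significant bit discards the
    predictable low-frequency component of the signal; hashing the
    resulting bit string spreads any remaining bias across the key.
    """
    lsb_bits = bytes(s & 1 for s in samples)
    digest = hashlib.sha256(lsb_bits).digest()
    return digest[:key_bytes]


# Simulated 12-bit samples stand in for real extracellular recordings
simulated_samples = [secrets.randbelow(4096) for _ in range(2048)]
key = lfp_to_session_key(simulated_samples)
```

A real design would also need to measure how much entropy the LFP actually yields per sample before trusting the output as a key.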
And to transmit the key to the external device, they suggest using an electrical signal carrying the key bits from the neurostimulator – a signal that can be picked up by a device touching the patient's skin. Other modes of transmission, such as an acoustic signal, could, they contend, be too easily intercepted by an adversary.
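Once both ends hold the session key, the secure-communication phase boils down to authenticating each message so forged or replayed stimulation commands are rejected. The sketch below, using an HMAC tag plus a message counter, is a generic illustration of that idea rather than the researchers' actual protocol:

```python
import hashlib
import hmac

TAG_LEN = 32  # SHA-256 HMAC tag length in bytes


def protect(key, counter, payload):
    """Frame a command: 4-byte counter, payload, then an HMAC tag."""
    msg = counter.to_bytes(4, "big") + payload
    tag = hmac.new(key, msg, hashlib.sha256).digest()
    return msg + tag


def verify(key, blob, last_counter):
    """Return (counter, payload) if authentic and fresh, else None."""
    msg, tag = blob[:-TAG_LEN], blob[-TAG_LEN:]
    counter = int.from_bytes(msg[:4], "big")
    if counter <= last_counter:
        return None  # replayed or stale message
    expected = hmac.new(key, msg, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None  # forged or corrupted message
    return counter, msg[4:]
```

The counter blocks a captured command from being replayed later; a production design would add encryption on top, since the HMAC alone only provides integrity, not confidentiality.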
The lesson here, the eggheads say, is that security-through-obscurity is a dangerous design choice.
Implantable medical device makers, they argue, should "migrate from weak closed proprietary solutions to open and thoroughly evaluated security solutions and use them according to the guidelines." ®