Brain-computer interface researchers warn of a 'bleak' cyberpunk future – unless we tread carefully

Commercial exploitation of a person's inner thoughts 'particularly worrisome'

Researchers at Imperial College London have sounded the alarm over a "bleak panorama" surrounding brain-computer interfaces (BCI), warning of a potential future in which BCI-equipped cyborgs divide the world – or have their inner thoughts harvested for commercial exploitation.

BCIs are, potentially, the next big thing. From slot-car racing to restoring speech or motor function, there's considerable interest in using electronics to read people's minds.

But not all progress on the BCI front is positive, researchers at Imperial College London warned – and the industry needs to take care amid reports of patients becoming reliant on the technology, fears of a world divided into those with access and those without, and the potential for commercial exploitation of a person's innermost musings.

"For some of these patients, these devices become such an integrated part of themselves that they refuse to have them removed at the end of the clinical trial," said Rylie Green PhD, one of the authors of a new review on the state of the art in BCI research, of subjects in studies where BCI devices have been used to control wheelchairs or prostheses.

"It has become increasingly evident that neurotechnologies have the potential to profoundly shape our own human experience and sense of self."

The review turned up increasing interest from commercial ventures, including work on reading subjects' minds to turn the intention of speech into written text – though Facebook, the funder of a successful project to do just that, announced it was pulling funding to go in a different and somewhat less-creepy direction with its commercialisation efforts.

BCI isn't just about restoring functionality, though: it can also be used to augment it – allowing a typist, for example, to operate at the speed of thought. Potentially, in a lesson taught by a range of dystopian science-fiction media, it could lead to a world of haves and have-nots: the augmented and the natural.

"This bleak panorama brings forth an interesting dilemma about the role of policymakers in BCI commercialisation," Green, reader in polymer bioelectronics at Imperial College London, continued in a statement on the topic.

"Should regulatory bodies intervene to prevent misuse and unequal access to neurotech? Should society follow instead the path taken by previous innovations, such as the internet or the smartphone, which originally targeted niche markets but are now commercialised on a global scale?

"Despite the potential risks, the ability to integrate the sophistication of the human mind with the capabilities of modern technology constitutes an unprecedented scientific achievement, which is beginning to challenge our own preconceptions of what it is to be human."

Dr Roberto Portillo-Lara, a research associate in Imperial's department of bioengineering, had his own concerns. "This is particularly worrisome," he said of the potential for commercial ventures to gain access to readings taken by BCI systems, "since neural data is often considered to be the most intimate and private information that could be associated with any given user.

"This is mainly because, apart from its diagnostic value, EEG data could be used to infer emotional and cognitive states, which would provide unparalleled insight into user intentions, preferences, and emotions."

The review which triggered these concerns has been published under open-access terms in the journal APL Bioengineering. ®
