DARPA commits to brain-computer interface development project

Hang on, didn’t Firefox have this feature years ago?


Do you use your brain to control your computer, like Clint Eastwood did in Firefox (yes, that one), the rather underrated 1982 spy drama? No, us neither, but thanks to a new research project launched by the US military, we may one day be able to.

A new programme from the US's Defense Advanced Research Projects Agency (DARPA) is seeking to develop a brain-computer interface, without any pesky middle-men such as limbs eating up those precious milliseconds of signal movement.

The effort, revealed in an outreach blogpost dubbed Bridging the Bio-Electronic Divide, aims to develop "an implantable neural interface able to provide unprecedented signal resolution and data-transfer bandwidth between the human brain and the digital world".

Such an interface would take the form of a "biocompatible device" no larger than 1cm³, converting between "the electrochemical language used by neurons in the brain" and the "ones and zeros that constitute the language of information technology".

The programme, named Neural Engineering System Design (NESD), may spur a dramatic enhancement of research capabilities in neurotechnology, as well as deliver therapeutic benefits.

“Today’s best brain-computer interface systems are like two supercomputers trying to talk to each other using an old 300-baud modem,” said Phillip Alvelda, the NESD program manager. “Imagine what will become possible when we upgrade our tools to really open the channel between the human brain and modern electronics.”

Neuromancy

Whether, and how, the human brain functions as a computer is an ongoing and complicated debate, one that grows thornier still when the brain is measured strictly against the standards of Turing completeness. The philosophical stakes tied to the debate's conclusion make it a fairly vituperative topic too.

Research on neural interfaces began in the 1970s at UCLA. The first working neurochips, integrated circuits designed to interact with neuronal cells, were claimed by researchers at Caltech in 1997; that chip had room for 16 neurons.

According to the agency, the currently approved neural interfaces "squeeze a tremendous amount of information through just 100 channels, with each channel aggregating signals from tens of thousands of neurons at a time".

"The result is noisy and imprecise," according to the project announcement. NESD, however, aims to create systems that can communicate clearly and precisely with individual neurons, singled out from among a million others in a given region of the brain.

Achieving the program’s ambitious goals and ensuring that the envisioned devices will have the potential to be practical outside of a research setting will require integrated breakthroughs across numerous disciplines including neuroscience, synthetic biology, low-power electronics, photonics, medical device packaging and manufacturing, systems engineering, and clinical testing.

In addition to the program’s hardware challenges, NESD researchers will be required to develop advanced mathematical and neuro-computation techniques to first transcode high-definition sensory information between electronic and cortical neuron representations, and then compress and represent those data with minimal loss of fidelity and functionality.

DARPA includes among its previous projects the ARPANET, the pioneering packet-switching network that went on to adopt TCP/IP and seed the internet. Sadly nothing much came of that, but this programme might be a goer. ®



