NASA boffins have pulled off a seemingly impressive feat - reading words which have not actually been spoken.
The system works by computer analysis of "sub-auditory" speech at the throat. NASA's Ames Research Center developer Chuck Jorgensen explains further: "A person using the subvocal system thinks of phrases and talks to himself so quietly it cannot be heard, but the tongue and vocal cords do receive speech signals from the brain.
"What is analyzed is silent, or subauditory, speech, such as when a person silently reads or talks to himself. Biological signals arise when reading or speaking to oneself with or without actual lip or facial movement."
To capture the necessary data, NASA put sensors on subjects' throats and under their chins. In trials, the software has been able to recognise six words and ten numbers with 92 per cent accuracy.
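To give a flavour of what such recognition involves, here is a toy sketch of a sensor-signal classifier: smooth the raw signal to suppress noise, reduce it to a handful of amplitude features, then match against per-word templates. Everything here — the signal shapes, the word set, the nearest-centroid matcher — is invented for illustration and bears no relation to NASA's actual software.

```python
# Toy sketch only: invented "nerve signals" classified by nearest-centroid
# matching. Not NASA's method -- a minimal illustration of the general idea.
import math
import random

def smooth(signal, window=5):
    """Crude noise reduction: moving-average filter."""
    half = window // 2
    return [
        sum(signal[max(0, i - half): i + half + 1])
        / len(signal[max(0, i - half): i + half + 1])
        for i in range(len(signal))
    ]

def features(signal, bins=4):
    """Simple features: mean absolute amplitude in each time bin."""
    step = len(signal) // bins
    return [
        sum(abs(x) for x in signal[i * step:(i + 1) * step]) / step
        for i in range(bins)
    ]

def nearest_centroid(feat, centroids):
    """Pattern recognition: pick the label whose template is closest."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(feat, centroids[label]))

# Synthetic "subvocal" signals for two made-up words.
random.seed(0)
def make_signal(profile, noise=0.2):
    # Each amplitude level held for 25 samples, plus Gaussian noise.
    return [a + random.gauss(0, noise) for a in profile for _ in range(25)]

profiles = {"stop": [1.0, 0.2, 0.2, 1.0], "go": [0.2, 1.0, 1.0, 0.2]}
centroids = {w: features(smooth(make_signal(p))) for w, p in profiles.items()}

# Classify a fresh noisy utterance of "go".
unknown = smooth(make_signal(profiles["go"]))
print(nearest_centroid(features(unknown), centroids))  # prints "go"
```

A real system would of course face far messier signals and a harder feature-extraction problem, which is why the pre-processing stages matter so much.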
Although the technology is in its infancy, NASA has high hopes for it. The disabled are an obvious candidate group, but air traffic controllers and astronauts - both of whom have to work in difficult, noisy environments - may also benefit.
It will shortly be tested in an experiment to exert basic control over a Mars rover-type vehicle, although NASA admits that there is much to be done on the pre-analysis amplification and noise reduction of the nerve signals. "The keys to this system are the sensors, the signal processing and the pattern recognition, and that's where the scientific meat of what we're doing resides," said Jorgensen. ®