It's all in the mind: Thought-command slot car racing in the city of the future

Controlling a Scalextric car with the mighty power of the Vulture brain


Suffering from sore thumbs after a serious session of table-top electric car racing? The Reg took a look at the logical conclusion of wearable electroencephalography: controlling Scalextric cars with thought.

The demonstration at the Goodwood Festival of Speed's Futurelab was ostensibly part of a cleaner city stand, sandwiched between electric car concepts and the eardrum-exploding jetpacks of Gravity Industries. A slot car set snaked between clear Perspex atop an active desktop, with the cars themselves featuring 3D printed shells enclosing the familiar electric units.

The city of the future is Scalextric

Benedict Sheehan, contracted by Solar Flare Studio to lead technical development of the setup, described the process of getting the Emotiv headsets worn by participants to talk to the slot racing tech, which dates back to the mid-20th century.

The initial design replaced the handsets with a Raspberry Pi to demonstrate that the voltages involved could be controlled to make the cars move. Once that was working, Sheehan switched his attention to the headsets, which Emotiv modestly calls "Brainwear" but which are in reality consumer-optimised headwear: a world away from the wired showercap you see in hospitals, or the vaguely alarming tech promoted by Elon Musk's brain-to-computer interface startup Neuralink, which was being tested on pigs in August 2020.
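The article doesn't detail how the Pi stood in for the handset, but the idea is straightforward: a hand throttle is just a variable voltage on the track, which software can emulate with PWM. A minimal sketch of that step — every name, value, and the use of PWM here is an assumption, not a detail from the demo:

```python
# Illustrative sketch: replacing a slot-car handset with software-controlled
# PWM duty cycles. All names and constants are assumptions for illustration.

def throttle_to_duty(throttle, min_duty=0.2, max_duty=1.0):
    """Map a 0.0-1.0 throttle request to a PWM duty cycle.

    A small minimum duty cycle is assumed so the motor overcomes
    static friction; a 0.0 throttle cuts power entirely.
    """
    if throttle <= 0.0:
        return 0.0
    throttle = min(throttle, 1.0)
    return min_duty + (max_duty - min_duty) * throttle

class SlotCarController:
    """Stands in for the Raspberry Pi side. On real hardware the duty
    cycle would be written to a PWM pin feeding the track supply
    through a motor driver (e.g. gpiozero's PWMOutputDevice -- again,
    an assumption, not the demo's actual wiring)."""

    def __init__(self):
        self.duty = 0.0  # last duty cycle "written" to the pin

    def set_throttle(self, throttle):
        self.duty = throttle_to_duty(throttle)
        return self.duty

controller = SlotCarController()
controller.set_throttle(0.5)  # half throttle
controller.set_throttle(0.0)  # stop: duty drops to 0.0
```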

The Emotiv Insight gear is also most definitely not a medical device.

What it does do, however, is pick up on electrical activity within the brain, which Sheehan's software can detect and translate into stop and go commands for the toy cars with next to no user training. In fact, when we had a go, that training required little more than visualising a push and pull on a cube, which the calibration software interpreted as the go and stop of the car.

The software itself merits further attention. "I had about seven weeks to make it," explained Sheehan, who opted for Microsoft's C# language and Unity as the glue to hold the gizmos together, with assistance on some of the COMs and API from colleague Grigor Tudorov. Data streams from the headsets are accessible via an API, and thus the cars go round and round.
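The actual glue is C# and Unity, which we haven't seen; but the shape of the loop — poll a mental-command stream from the headset API, drive the car accordingly — can be sketched as follows. Every name here is hypothetical, and this is not Emotiv's real API:

```python
# Hypothetical glue between a mental-command stream and the car throttle.
# The (command, power) stream format and the command names "push"/"pull"
# are invented for illustration; Emotiv's actual API differs.

def drive(command_stream, set_throttle, threshold=0.5):
    """Consume (command, power) pairs and start/stop the car.

    command_stream: iterable of (str, float) tuples, e.g. ("push", 0.8)
    set_throttle:   callback taking a 0.0-1.0 throttle value
    threshold:      minimum detection power before acting
    """
    for command, power in command_stream:
        if power < threshold:
            continue                # too weak a detection: ignore it
        if command == "push":
            set_throttle(power)     # stronger "push" -> faster car
        elif command == "pull":
            set_throttle(0.0)       # "pull" stops the car

applied = []
drive([("push", 0.8), ("pull", 0.9), ("push", 0.3)], applied.append)
# the weak 0.3 "push" is ignored; applied == [0.8, 0.0]
```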

We found the process remarkably straightforward. A few short minutes of training had the toy cars whizzing around at our mental whim. It is quite an improvement over the gizmos launched in the previous decade by the likes of Neurowear.

"The neutral is just to find what your ambient level is," explained Sheehan regarding the training and configuration. "That's where you just look around and try not to think too hard about anything. Then anything you can parse on that is used as the command."
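Sheehan's description suggests a simple baseline scheme: record the ambient signal while the wearer thinks of nothing in particular, then treat a clear departure from that baseline as the command. A toy sketch of the idea — the choice of statistics and the threshold multiplier are assumptions, not the demo's actual calibration:

```python
# Toy baseline calibration: ambient mean plus k standard deviations.
# The statistics used and the value of k are assumptions for illustration.
import statistics

def calibrate(neutral_samples, k=3.0):
    """Derive a detection threshold from a neutral recording."""
    mean = statistics.fmean(neutral_samples)
    stdev = statistics.pstdev(neutral_samples)
    return mean + k * stdev

def is_command(sample, threshold):
    """Anything clearly above the ambient level is parsed as a command."""
    return sample > threshold

neutral = [0.9, 1.0, 1.1, 1.0, 1.0]  # "look around, don't think too hard"
threshold = calibrate(neutral)
is_command(1.05, threshold)  # near the baseline -> False
is_command(5.0, threshold)   # strong departure -> True
```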

And, by your mental command, the car sets off.

As prices come down and devices get ever more consumer-friendly, Sheehan said he sees potential beyond slot car racing in using machine learning to interpret data from the brain.

"I think you'd have to spend a lot of time with it," he added. Certainly, those 1,000 pictures of elephants (his example) aren't going to look at themselves. "But in terms of paraplegics," he said, "I think, yes; I think we are almost there."

While wearable electroencephalography is hardly a new technology, Sheehan's work seemingly demonstrates how accessible it is becoming, a world away from wires and sensors sprouting from the head of a wearer.

And who wouldn't enjoy dispensing with a hand-held controller in favour of setting the speed with one's thoughts alone? ®

