Canadian boffins are testing semi-autonomous exoskeletons that could help people with limited mobility walk again without the need for implanted sensors.
Researchers at the University of Waterloo, Ontario, are hard at work combining modern deep-learning systems with robotic prostheses. They hope to give patients who have suffered spinal cord injuries or strokes, or who live with conditions including multiple sclerosis, cerebral palsy, and osteoarthritis, the ability to get back on their feet and move freely.
The project differs from other efforts for amputees that involve trying to control the movement of machines using electrodes implanted in nerves and muscles in the limbs and brain, explained Brock Laschowski, a PhD student at the university who is leading the ExoNet study. “Our control approach wouldn’t necessarily require human thought. Similar to autonomous cars that drive themselves, we’re designing autonomous exoskeletons that walk for themselves.”
Like self-driving cars, the robotic limbs are kitted out with sensors and cameras that feed images of the surrounding environment to computer-vision algorithms. These are used by the exoskeleton's control systems to carry out specific actions – whether it’s walking, standing, sitting, or going up and down stairs – all while comprehending and navigating the environment as necessary. People strapped into these exoskeletons are thus able to move around, with the software handling the necessary movements.
Having said that, ExoNet is still in its infancy. Laschowski and his colleagues began by collecting data to train the convolutional neural networks in the computer vision portion of the software. Cameras were strapped to participants' chests, waists, and calves, and they were instructed to walk around to gather information to teach the models.
Hours of footage was recorded and sorted into nearly a million images, which were then labelled to describe specific situations – walking on level ground, going up and down stairs, and encountering objects such as doors, walls, and seats.
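To make the labelling step concrete, here is a minimal sketch in Python of how frames might be sorted into such classes. This is purely illustrative – the class names, frame IDs, and data layout are invented for the example, not taken from the ExoNet dataset:

```python
# Hypothetical sketch of the labelling step: sorting captured frames into
# environment classes. Class names and data layout are invented, not ExoNet's.
from collections import Counter

CLASSES = {
    "level-ground", "stairs-up", "stairs-down",
    "door", "wall", "seat",
}

def label_frame(frame_id: str, annotation: str) -> tuple:
    """Attach a class label to one frame, rejecting unknown labels."""
    if annotation not in CLASSES:
        raise ValueError(f"unknown class: {annotation}")
    return (frame_id, annotation)

# Toy 'footage': a handful of annotated frames standing in for ~1M images.
raw = [("f001", "level-ground"), ("f002", "stairs-up"),
       ("f003", "door"), ("f004", "level-ground")]
dataset = [label_frame(fid, ann) for fid, ann in raw]

# Per-class counts matter in practice: a skewed dataset biases the classifier.
counts = Counter(label for _, label in dataset)
print(counts["level-ground"])  # 2
```

In a real pipeline these labelled frames would then be fed to a convolutional neural network for training; checking the per-class counts first is a standard way to spot class imbalance before training begins.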
“Data from wearable sensors are used to control robotic exoskeletons by inferring the user’s locomotor intent, for example, wanting to walk upstairs,” Laschowski told The Register. "Our environment recognition system, which uses wearable cameras and AI algorithms, is used to estimate upcoming locomotor activities and therefore allow robotic exoskeletons to predict and plan accordingly to different walking environments."
In other words, the system tries to figure out what to do from the situation the wearer finds themselves in.
If the algorithms detect something like stairs, the code passes this information to the exoskeleton’s control system to kick-start the series of commands to lift the knee and step up or down. If it sees wide, flat ground, it’ll more likely instruct the machine to activate a different set of controllers used for walking. And if it recognizes a door, the software should correctly guess the user is trying to open it, and get the exoskeleton to slow down and stand still.
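That dispatch step – mapping the classifier's verdict to a controller mode – can be sketched in a few lines of Python. This is a hypothetical illustration of the idea, not the team's actual control code; the class names, confidence threshold, and controller labels are all invented:

```python
# Hypothetical mapping from vision-classifier output to controller modes.
# All names and the confidence threshold are illustrative, not ExoNet's.

CONTROLLER_MODES = {
    "level-ground": "walk",
    "stairs-up": "step-up",
    "stairs-down": "step-down",
    "door": "slow-and-stand",
}

CONFIDENCE_THRESHOLD = 0.85  # below this, fall back to a safe default

def select_controller(predicted_class: str, confidence: float) -> str:
    """Pick a controller mode from the environment classifier's output."""
    if confidence < CONFIDENCE_THRESHOLD or predicted_class not in CONTROLLER_MODES:
        return "stand-still"  # safe fallback when the classifier is unsure
    return CONTROLLER_MODES[predicted_class]

print(select_controller("stairs-up", 0.97))     # step-up
print(select_controller("level-ground", 0.60))  # stand-still (low confidence)
```

The safe fallback is the important design choice here: when the classifier isn't confident, a real system would rather stand still than guess wrong on a staircase.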
Deep-learning systems have advantages over the traditional algorithms that have been used to control prosthetic limbs. In theory, they should be more robust, since the software automatically learns features from common patterns in data without engineers having to handcraft them. But their effectiveness depends on being able to accurately recognize objects and perform the computation quickly enough for exoskeleton-wearers to move safely in real time.
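That real-time constraint can be expressed as a latency budget: inference must finish well inside one control cycle. The following is a toy sketch of such a check – the cycle rate, stand-in classifier, and timings are all assumptions for illustration, not figures from the project:

```python
# Hypothetical real-time budget check: vision inference must complete
# inside one control cycle. All numbers here are illustrative assumptions.
import time

CONTROL_CYCLE_S = 0.05  # e.g. a 20 Hz control loop (assumed, not ExoNet's)

def classify_environment(frame) -> str:
    """Stand-in for a CNN forward pass; real inference would go here."""
    time.sleep(0.001)  # simulate ~1 ms of compute
    return "level-ground"

start = time.perf_counter()
label = classify_environment(frame=None)
elapsed = time.perf_counter() - start

# If inference overruns the cycle, a safe controller should take over
# rather than act on stale information.
within_budget = elapsed < CONTROL_CYCLE_S
print(label, within_budget)
```

In a deployed system this kind of check would gate whether the controller trusts the latest classification or falls back to a conservative mode.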
“One of the biggest challenges to developing AI-powered lower-limb exoskeletons is control, having the legs move according to the user’s intent,” Laschowski added.
"Errors could result in performing inaccurate movements, for example, selecting a level-ground walking mode when actually the user wants to climb stairs which could potentially cause injury."
A kill switch is there to prevent disasters. “To ensure safety, systems such as ours have human-controlled override buttons to disengage the automated controllers,” John McPhee, the Canada Research Chair in Biomechatronic System Dynamics at the University of Waterloo, told El Reg. “Furthermore, we use extensive simulation-based testing of our controllers prior to human testing.”
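The override described above amounts to checking a flag every control tick, before any automated command is acted on. Here is a minimal sketch of that pattern – the class and method names are invented for illustration, not the team's API:

```python
# Hypothetical sketch of a human-controlled override: every control tick
# checks the kill switch before any automated command is executed.
# Class and method names are invented for illustration.

class Exoskeleton:
    def __init__(self):
        self.override_engaged = False
        self.log = []

    def press_override(self):
        """Wearer (or clinician) disengages the automated controllers."""
        self.override_engaged = True

    def tick(self, automated_command: str):
        """One control cycle: honour the override before acting."""
        if self.override_engaged:
            self.log.append("hold")  # automation disengaged: do nothing
        else:
            self.log.append(automated_command)

exo = Exoskeleton()
exo.tick("walk")
exo.press_override()
exo.tick("step-up")  # ignored: the override wins
print(exo.log)  # ['walk', 'hold']
```

The point of the pattern is that the override check sits inside the control loop itself, so the human wins on the very next tick, not after the current automated manoeuvre finishes.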
The team isn't quite ready to begin testing their exoskeletons on people with impaired movement. “We’re currently focusing on developing the environment recognition system, specifically improving the accuracy and real-time performance of the environment classification. This technical engineering development is essential to ensuring safe operation for future clinical testing with autonomous robotic exoskeletons,” said Laschowski.
Eventually, the researchers hope that robotic exoskeletons running their environment recognition system will be able to give people the freedom to sit, stand, walk, and go up and down stairs. ®