Tetraplegic patient can now move all four limbs with the help of a badass neuroprosthetic suit

Brown University and Intel also want to develop an AI brain-machine interface


A neuroprosthetic robotic suit controlled by brain signals has allowed a paralysed man to walk again for the first time, according to new research published in The Lancet Neurology.

The 28-year-old patient has tetraplegia resulting from a spinal cord injury that left him unable to move any of his four limbs. Researchers at Clinatec, a biomedical research lab at the University of Grenoble, France, built an exoskeleton device for the man to wear.

He also had 128 electrodes implanted over the left and right sides of the upper sensorimotor region of his brain. These monitored and recorded electrical signals as he performed various exercises, and the recordings were passed to an algorithm to decode.

The software analysed which brain signals corresponded to which limb he wanted to move, and used that to control the movement of the neuroprosthetic. It was a slow process: it took over two years before the patient could walk, move his arms, reach for and touch objects, and rotate his wrists.
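To make the general idea concrete, here is a minimal, hypothetical sketch of that kind of decoding pipeline: multi-channel electrode recordings are reduced to per-channel features and handed to a classifier that guesses which movement is intended. The channel count, window length, feature, movement labels, and classifier below are all assumptions for illustration, not Clinatec's actual method.

```python
# Toy brain-signal decoding sketch (illustrative only, not the Clinatec pipeline).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

N_CHANNELS = 128        # electrodes over the sensorimotor region (from the article)
WINDOW = 200            # samples per decoding window (made-up value)
CLASSES = ["rest", "left_arm", "right_arm", "walk"]   # hypothetical movement labels

def band_power_features(window: np.ndarray) -> np.ndarray:
    """Crude per-channel log-power feature for one (channels x samples) window."""
    return np.log(np.mean(window ** 2, axis=1) + 1e-12)

# Synthetic stand-in data representing recorded exercise sessions.
X, y = [], []
for label in range(len(CLASSES)):
    for _ in range(50):
        window = rng.normal(size=(N_CHANNELS, WINDOW)) + 0.5 * label
        X.append(band_power_features(window))
        y.append(label)

# Fit a simple linear decoder mapping features to intended movements.
decoder = LinearDiscriminantAnalysis()
decoder.fit(np.array(X), np.array(y))

# At run time, each new window of recordings would become a movement command.
new_window = rng.normal(size=(N_CHANNELS, WINDOW)) + 0.5 * 3
print("decoded intent:", CLASSES[decoder.predict(band_power_features(new_window)[None, :])[0]])
```

In a real system the features and classifier would be far more sophisticated and would run continuously in real time, but the basic loop of record, extract features, classify, and command the exoskeleton is the same.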

He practised controlling the brain-machine interface three times a week via simulations, and donned the real robotic suit once a week, every month for 27 months. The researchers called this a ‘proof-of-concept’ and hope that their results will help them develop more sophisticated algorithms capable of making the robotic limbs perform more complicated tasks, such as holding objects.

"This device is an important step forward in helping people with disabilities become self-sufficient,” said Alim-Louis Benabid, lead author of the paper and a neurosurgeon at the University of Grenoble. “We are extremely proud of this proof of concept and are already considering new applications to make everyday life easier for people with severe motor disabilities."

It looks like they aren’t the only ones with this idea. Researchers at Brown University in the US are partnering with Intel on a two-year project to study how AI could help paralysed people walk again.

The goal is to train neural networks that can decode electrical signals emitted from people’s spinal cords so they can eventually “communicate motor commands”. Surgeons will help implant electrodes in their injured limbs and monitor the patients as they attempt to move, so that the algorithms can learn which features in a signal match particular movements.
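As a rough illustration of what training such a decoder can look like, here is a small, hedged sketch: synthetic stand-in features are mapped to made-up movement classes by a tiny neural network. The feature size, labels, architecture, and data are all invented for the example and are not the Intelligent Spine Interface design.

```python
# Toy neural-network decoder sketch (illustrative assumptions throughout).
import torch
import torch.nn as nn

N_FEATURES = 64        # hypothetical features extracted from each recording window
N_MOVEMENTS = 4        # e.g. rest / flex / extend / step (made-up labels)

decoder = nn.Sequential(
    nn.Linear(N_FEATURES, 128),
    nn.ReLU(),
    nn.Linear(128, N_MOVEMENTS),
)
optimiser = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Synthetic stand-in data: each movement class gets a slightly shifted signal.
features = torch.randn(400, N_FEATURES)
labels = torch.randint(0, N_MOVEMENTS, (400,))
features += labels.float().unsqueeze(1) * 0.5

# Train the network to associate signal features with attempted movements.
for epoch in range(200):
    optimiser.zero_grad()
    loss = loss_fn(decoder(features), labels)
    loss.backward()
    optimiser.step()

# After training, a new window of features is mapped to a motor command.
with torch.no_grad():
    print("predicted movement class:", decoder(features[:1]).argmax(dim=1).item())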

“A spinal cord injury is devastating, and little is known about how remaining circuits around the injury may be leveraged to support rehabilitation and restoration of lost function,” said David Borton, an assistant professor of engineering at Brown University, who is part of the Intelligent Spine Interface project.

“Listening for the first time to the spinal circuits around the injury and then taking action in real time with Intel’s combined AI hardware and software solutions will uncover new knowledge about the spinal cord and accelerate innovation toward new therapies.” ®
