Mind-Controlled Wheelchairs Are Made Possible via Brain Machine Interface

A new study shows how people with severe paralysis can operate a wheelchair with their thoughts in a realistic environment after training with a non-invasive, artificial intelligence (AI)-enabled brain-machine interface (BMI). The study was published in the Cell Press journal iScience.

“The primary objective of this study is to probe the hypothesis that BMI skill acquisition by end-users is fundamental to control a non-invasive brain-actuated intelligent wheelchair in real-world settings,” wrote the study’s authors at The University of Texas at Austin.

Brain-computer interfaces (BCIs), often referred to as brain-machine interfaces (BMIs), are neurotechnology tools that decode human brain activity into commands that can control devices such as cellphones, computers, robotic limbs, and wheelchairs.

“In this work, we demonstrate that three individuals affected by severe tetraplegia after spinal cord injury (SCI) learned to operate a self-paced sensorimotor rhythm (SMR)-based BMI to drive an intelligent robotic wheelchair in real-world scenarios with different degrees of proficiency,” the researchers wrote.

The three male study participants, all quadriplegic (also called tetraplegic), were already wheelchair users as a result of spinal cord injury. Three times per week, for two, three, or five months depending on the participant, their neural activity was captured noninvasively with an electroencephalography (EEG) cap while they sat in the wheelchair.

The technology used in the study included Matlab, OpenGL, the Robot Operating System (ROS), the eego™sport EEG system by ANT Neuro, the TDX SP2 powered wheelchair by Invacare, and the Hokuyo URG-04LX-UG01 laser rangefinder.
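
For readers curious how a decoded command might actually reach the wheelchair’s motors, the sketch below shows the conventional ROS pattern of publishing velocity messages from Python with the rospy client library. The /cmd_vel topic name, the command set, and the speed values are illustrative assumptions, not details taken from the paper.

    #!/usr/bin/env python
    # Hypothetical sketch: turn discrete BMI commands into ROS velocity
    # messages. Topic name, command set, and speeds are assumptions.
    import rospy
    from geometry_msgs.msg import Twist

    # Map each decoded BMI command to (linear m/s, angular rad/s).
    COMMAND_MAP = {
        "forward": (0.3, 0.0),
        "left": (0.0, 0.5),
        "right": (0.0, -0.5),
        "stop": (0.0, 0.0),
    }

    def publish_command(pub, command):
        """Publish the Twist message for one decoded command."""
        linear, angular = COMMAND_MAP[command]
        msg = Twist()
        msg.linear.x = linear
        msg.angular.z = angular
        pub.publish(msg)

    if __name__ == "__main__":
        rospy.init_node("bmi_wheelchair_driver")
        pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
        rate = rospy.Rate(10)  # resend the current command at 10 Hz
        while not rospy.is_shutdown():
            publish_command(pub, "forward")  # placeholder for decoder output
            rate.sleep()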

To train the brain-machine interface, the participants were asked to imagine making certain movements. After this training, they were tasked with steering the wheelchair by thought through a cluttered, real-world setting.
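
Sensorimotor-rhythm BMIs of this kind generally work by detecting power changes in the mu (roughly 8–12 Hz) and beta (roughly 13–30 Hz) EEG bands over the motor cortex while the user imagines a movement. The following is a minimal sketch of that feature-extraction step, assuming a NumPy array of EEG samples and using SciPy’s Welch spectral estimator; the band edges, sampling rate, and channel count are illustrative rather than the study’s actual settings.

    # Minimal sketch of SMR band-power features. Band edges, sampling rate,
    # and window length are illustrative assumptions.
    import numpy as np
    from scipy.signal import welch

    FS = 512  # sampling rate in Hz (assumed)

    def band_power(eeg_channel, low_hz, high_hz):
        """Average spectral power of one EEG channel within a band."""
        freqs, psd = welch(eeg_channel, fs=FS, nperseg=FS)
        mask = (freqs >= low_hz) & (freqs <= high_hz)
        return psd[mask].mean()

    def smr_features(eeg_window):
        """Mu and beta band power per channel; eeg_window is (channels, samples)."""
        return np.array(
            [[band_power(ch, 8, 12), band_power(ch, 13, 30)] for ch in eeg_window]
        ).ravel()

    # Example: one second of simulated EEG from 16 channels.
    features = smr_features(np.random.randn(16, FS))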

“Our work shows that the participants’ ability to navigate in a natural, cluttered clinical environment is directly proportional to the acquired BMI skills,” the scientists reported.

In two of the three participants, these gains in control accuracy were accompanied by noticeable shifts in their brain-activity patterns. The third participant’s brain activity remained largely unchanged over the training period, and his performance plateaued after the initial instruction, improving only slightly in accuracy.

The power of AI and human learning

Comparing the participants’ individual performances, the researchers found that human learning and machine learning reinforced each other: as the participants practiced, the machine learning system became better at distinguishing and decoding the brain activity associated with the different movement commands.
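
As a rough illustration of what such a decoder can look like, the sketch below fits a linear discriminant classifier to labeled band-power features of the kind extracted above. LDA is a common baseline for SMR-based BMIs, but the paper’s actual decoder and training protocol may differ, and the data here are simulated stand-ins.

    # Rough sketch of decoder training on labeled SMR features. The classifier
    # choice (LDA) is a common BMI baseline, not necessarily the paper's model.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # X: one feature vector per EEG window; y: imagined-movement label.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 32))    # stand-in for smr_features() output
    y = rng.integers(0, 2, size=200)  # 0 = "left" imagery, 1 = "right"

    decoder = LinearDiscriminantAnalysis()
    decoder.fit(X, y)

    # At run time, each new EEG window is decoded into a command.
    command = "left" if decoder.predict(X[:1])[0] == 0 else "right"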

The scientists found that subject learning and shared control with robotic artificial intelligence are both essential elements for translational noninvasive brain-machine interfaces.
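
Shared control in this sense means the wheelchair blends the user’s decoded intent with its own sensor-based corrections rather than following either one alone. Below is a minimal sketch under assumed names and weights, in which readings from a range sensor such as the study’s laser rangefinder nudge the user’s turn command away from nearby obstacles.

    # Minimal shared-control sketch: blend the user's decoded turn command
    # with an obstacle-avoidance correction. All names, weights, and
    # thresholds are illustrative assumptions, not the paper's controller.

    def shared_control(user_turn, ranges, angles, safe_dist=0.8):
        """Blend user intent (rad/s) with repulsion from close obstacles.

        ranges/angles: obstacle distances (m) and bearings (rad), where
        positive bearings are to the left of the wheelchair.
        """
        repulsion = 0.0
        for dist, angle in zip(ranges, angles):
            if dist < safe_dist:
                # Steer away from close obstacles, more strongly when nearer.
                repulsion -= (safe_dist - dist) * (1.0 if angle > 0 else -1.0)
        # Weight the robot's correction against the user's command.
        return 0.7 * user_turn + 0.3 * repulsion

    # Example: the user steers left while an obstacle sits close on the left;
    # the blended command still turns left, but less sharply.
    turn = shared_control(0.5, ranges=[0.4, 2.0], angles=[0.3, -1.0])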

With these new insights, the researchers suggest that future studies should explore ways to “couple machine learning and subject learning” and that “larger studies are needed to determine the exact translational potential of BMI assistive robotic technology.”

“The results achieved in this work allow us to highlight how shared-control and, in general, human-robot interaction approaches and collaborative robotics may support the user to achieve safety, efficiency, and usability of the brain-controlled wheelchair, especially in the case of mediocre BMI performance,” concluded the scientists.