| Literature DB >> 35837250 |
David A Handelman, Luke E Osborn, Tessy M Thomas, Andrew R Badger, Margaret Thompson, Robert W Nickl, Manuel A Anaya, Jared M Wormley, Gabriela L Cantarero, David McMullen, Nathan E Crone, Brock Wester, Pablo A Celnik, Matthew S Fifer, Francesco V Tenore.
Abstract
Advances in intelligent robotic systems and brain-machine interfaces (BMI) have helped restore functionality and independence to individuals living with sensorimotor deficits. However, tasks requiring bimanual coordination and fine manipulation remain unsolved, given the technical complexity of controlling multiple degrees of freedom (DOF) across multiple limbs in a coordinated way through user input. To address this challenge, we implemented a collaborative shared control strategy to manipulate and coordinate two Modular Prosthetic Limbs (MPL) for performing a bimanual self-feeding task. A human participant with microelectrode arrays in sensorimotor brain regions provided commands to both MPLs to perform the self-feeding task, which included bimanual cutting. Motor commands were decoded from bilateral neural signals to control up to two DOFs on each MPL at a time. The shared control strategy enabled the participant to map his four-DOF control inputs, two per hand, to as many as 12 DOFs for specifying robot end effector position and orientation. Using neurally driven shared control, the participant successfully and simultaneously controlled movements of both robotic limbs to cut and eat food in a complex bimanual self-feeding task. This demonstration of bimanual robotic system control via a BMI in collaboration with intelligent robot behavior has major implications for restoring complex movement behaviors for those living with sensorimotor deficits.
Keywords: activities of daily living (ADL); bimanual control; brain computer interface (BCI); human machine teaming; robotic shared control
Year: 2022 PMID: 35837250 PMCID: PMC9274256 DOI: 10.3389/fnbot.2022.918001
Source DB: PubMed Journal: Front Neurorobot ISSN: 1662-5218 Impact factor: 3.493
Figure 1. System diagram for BMI-based shared control of bimanual robotic limbs. (A) Movements are decoded from neural signals through the brain-machine interface and mapped to two external robotic limbs while using a collaborative shared human-machine teaming control strategy to complete a self-feeding task requiring simultaneous bimanual manipulations. (B) NeuroPort electrode arrays (Blackrock Neurotech) implanted in the motor and somatosensory regions of the left and right hemispheres record neural activity. (C) Neural data are streamed from the cortical implants and processed before being decoded. Decoded gestures are passed to the shared control strategy for mapping onto robot degrees of freedom depending on the current state of the task. Autonomous portions of the task are performed by the robot, while semi-autonomous steps are controlled in part by the participant, who uses attempted gestures to modulate a subset of robotic limb end effector degrees of freedom via the current DOF mapping. The degrees of freedom controlled via BMI are based on a task library accessed by the robot.
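The state-dependent mapping from decoded gestures to robot DOFs described in Figure 1C can be sketched as a lookup table keyed on task state. This is a minimal illustration only, not the authors' implementation: the task-state names, gesture labels, DOF names, and step sizes below are assumptions for demonstration.

```python
# Minimal sketch of a shared-control mapping from decoded hand gestures
# to robot end-effector DOF commands, keyed on the current task state.
# All state names, gestures, DOF names, and step sizes are illustrative.

# Each task state maps (hand, gesture) -> (DOF name, signed step in meters).
DOF_MAP = {
    "position_fork": {
        ("right", "wrist_flex"):   ("fork_y", -0.01),   # shift fork left
        ("right", "wrist_extend"): ("fork_y", +0.01),   # shift fork right
    },
    "cut_food": {
        ("left",  "pinch"):        ("knife_z", -0.01),  # press knife down
        ("left",  "open"):         ("knife_z", +0.01),  # lift knife up
        ("right", "wrist_flex"):   ("fork_x",  -0.01),  # pull fork back
    },
}

def command_for(state, hand, gesture):
    """Return the (DOF, delta) command for a decoded gesture, or None if
    the gesture is unmapped in the current task state ("rest" or an
    irrelevant gesture produces no robot motion)."""
    return DOF_MAP.get(state, {}).get((hand, gesture))
```

The design point this illustrates is why a four-gesture input set can drive up to 12 DOFs: the same small gesture vocabulary is re-bound to different end-effector axes as the task state advances.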
Figure 2. Bimanual self-feeding task flowchart. The collaborative shared control strategy enables the participant to control a minimum set of DOFs while still maximizing task performance.
Figure 3. Offline decoding performance for the left and right hands during simultaneous attempted gestures (including "rest"). Contralateral neural signals were used to decode each hand (i.e., left motor and somatosensory cortex signals were used to decode right hand movements). For each hand, there were 48 instances of each movement class (open, pinch, wrist flex, wrist extend) and 64 of the "rest" condition. Of the movements, "pinch" gestures were decoded with the highest accuracy for both left and right hands, while "wrist extend" was notably more difficult to decode (45% and 34% for left and right hands, respectively).
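Per-class accuracies like those in Figure 3 are conventionally the diagonal of a confusion matrix normalized by the true-class counts. A minimal sketch of that computation, assuming only that trials carry true and predicted class labels (the toy labels in the usage example are invented, not the study's data):

```python
from collections import Counter

def per_class_accuracy(true_labels, pred_labels):
    """Fraction of trials correctly decoded for each true class,
    i.e., the row-normalized diagonal of the confusion matrix."""
    correct = Counter()
    total = Counter()
    for true_cls, pred_cls in zip(true_labels, pred_labels):
        total[true_cls] += 1
        if true_cls == pred_cls:
            correct[true_cls] += 1
    return {cls: correct[cls] / total[cls] for cls in total}

# Toy usage with invented labels:
acc = per_class_accuracy(
    ["pinch", "pinch", "open", "open"],
    ["pinch", "open",  "open", "open"],
)
# acc == {"pinch": 0.5, "open": 1.0}
```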
Figure 4. Select screenshots of self-feeding task performance. The robot holds a fork in its right hand and a knife in its left hand. (A) Step 1: Participant initiates the task by moving the robot's right hand forward. (B) Step 3: Participant repositions the fork horizontally to align with the desired piece of food. (C) Step 6: Participant repositions the knife horizontally to select the cut point. (D) Step 7: Participant moves the knife down and the fork back and to the right to cut the food. (E) Step 10: Robot moves the food to a default position in front of the participant's mouth. (F) Step 12: Participant places the food in his mouth.