| Literature DB >> 30440831 |
Fu-Jen Chu, Ruinian Xu, Zhenxuan Zhang, Patricio A Vela, Maysam Ghovanloo.
Abstract
A human-in-the-loop system is proposed to enable collaborative manipulation tasks for person with physical disabilities. Studies show that the cognitive burden of subject reduces with increased autonomy of assistive system. Our framework obtains high-level intent from the user to specify manipulation tasks. The system processes sensor input to interpret the user's environment. Augmented reality glasses provide ego-centric visual feedback of the interpretation and summarize robot affordances on a menu. A tongue drive system serves as the input modality for triggering a robotic arm to execute the tasks. Assistance experiments compare the system to Cartesian control and to state-of-the-art approaches. Our system achieves competitive results with faster completion time by simplifying manipulation tasks.Entities:
Year: 2018 PMID: 30440831 DOI: 10.1109/EMBC.2018.8512668
Source DB: PubMed Journal: Annu Int Conf IEEE Eng Med Biol Soc ISSN: 2375-7477