Lotte N S Andreasen Struijk1, Line Lindhardt Egsgaard2, Romulus Lontis2, Michael Gaihede3,4, Bo Bentsen2.
Abstract
BACKGROUND: For an individual with tetraplegia, assistive robotic arms provide a potentially invaluable opportunity for rehabilitation. However, there is a lack of available control methods that allow these individuals to fully control the assistive arms.
Keywords: Assistive devices; Assistive robotic arm; Disabilities; Rehabilitation; Tetraplegia; Tongue interface
Year: 2017 PMID: 29110736 PMCID: PMC5674819 DOI: 10.1186/s12984-017-0330-2
Source DB: PubMed Journal: J Neuroeng Rehabil ISSN: 1743-0003 Impact factor: 4.262
Fig. 1 Overview of the tongue-based robotic control system. a The inductive tongue interface incorporating 10 sensors in the keypad and 8 sensors in the mousepad. The sensors are activated using a glued or pierced metal activation unit shown at the bottom. b The central unit, which wirelessly receives the signals from the tongue interface and transforms these into characters. c The computer, which wirelessly receives the characters from the central unit and transforms these into commands. The commands are passed on to the assistive robotic arm through the USB port. d The computer screen showing the visual feedback to the experimental participant (left), the mapping of the sensors to the movement of the robot (middle) and the robot software (right)
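The pipeline described in the caption (tongue sensor → character → robot command) can be illustrated with a minimal sketch. The character-to-command mapping below is purely hypothetical; the actual character assignments are those shown in Fig. 2a, which are not reproduced in this excerpt.

```python
# Hypothetical sketch of the character-to-command stage described in Fig. 1c.
# The real character assignments appear in Fig. 2a; the names below are
# illustrative placeholders, not the published mapping.
CHAR_TO_COMMAND = {
    "u": "ARM_UP",
    "d": "ARM_DOWN",
    "l": "ARM_LEFT",
    "r": "ARM_RIGHT",
    "g": "GRIPPER_CLOSE",
    "o": "GRIPPER_OPEN",
}

def char_to_command(ch):
    """Translate one character received from the central unit into a robot
    command, or return None if the character is not mapped."""
    return CHAR_TO_COMMAND.get(ch)
```

A lookup table like this keeps the wireless protocol (characters) decoupled from the robot interface (commands), which is the separation the caption describes between the central unit and the computer.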
Fig. 2 Experimental setup and results for direct tongue-robot actuator control. a Characters and functions assigned to the sensors of the tongue interface. b The mapping of the sensors of the tongue interface to the robot movements. c The assistive robotic arm in the “home position” next to E1, who is wearing the tongue interface (left), and the experimental set-up for the functional task of picking up a roll of tape placed on a metal holder. d The sequence of issued commands during a functional task. The arrows indicate the intended duration of a command. The commands refer to the character-command map shown in (b)
Fig. 3 Tongue-based endpoint control of the assistive robotic arm by E2. a Successful reach and grasp of the roll of tape by E2, sitting in her wheelchair to the left in the picture. b Successful reach and touch of the roll of tape, but the roll was dropped during lift-off. c E2 pouring a cup of water for the first time in 19 years
Results of tongue-controlled robotic grasping, number of trials = 10
| Participant | Robot control method | Robot | Roll picked up [Times] | Roll touched but dropped [Times] | Completion time [s]: Mean ± STDb | No. of issued commandsa: Mean ± STDb |
|---|---|---|---|---|---|---|
| E1 | Direct actuator | 0.20 | 8 | 2 | 71.3 ± 16.7 | 17.6 ± 5.5 |
| E2 | Cartesian endpoint | 0.07 | 5 | 5 | 70.1 ± 15.3 | 6.0 ± 1.5 |
a When the same command was issued several times in a row, it was only counted as one command. b STD: standard deviation
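Footnote a's counting rule (consecutive repeats of the same command count as a single issued command) can be sketched as follows; this is an illustration of the rule, not the authors' analysis code:

```python
from itertools import groupby

def count_issued_commands(command_stream):
    """Count issued commands per footnote a: consecutive repeats of the
    same command are collapsed into one. groupby() yields one group per
    run of equal adjacent elements, so summing the groups gives the count."""
    return sum(1 for _cmd, _run in groupby(command_stream))
```

For example, the stream `["UP", "UP", "LEFT", "UP"]` counts as 3 issued commands, because the two initial "UP" events form a single run.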
Fig. 4 The unsuccessful pick-up of the roll of tape by E2, classified as touch. The 5th trial classified as touch is shown in Fig. 3b
Fig. 5 The average time spent between two commands is shown together with the average duration of the resulting robotic movement for participant E2