Ivan Rulik1, Md Samiul Haque Sunny1, Javier Dario Sanjuan De Caro2, Md Ishrak Islam Zarif3, Brahim Brahmi4, Sheikh Iqbal Ahamed3, Katie Schultz5, Inga Wang6, Tony Leheng7, Jason Peng Longxiang7, Mohammad H Rahman2.
Abstract
Throughout the last decade, many assistive robots for people with disabilities have been developed; however, researchers have not fully utilized these robotic technologies to create fully independent living conditions for people with disabilities, particularly in relation to activities of daily living (ADLs). An assistive system can help satisfy the demands of regular ADLs for people with disabilities. With an increasing shortage of caregivers and a growing population of older adults and individuals with impairments, assistive robots can help meet future healthcare demands. One of the critical aspects of designing these assistive devices is to improve functional independence while providing an excellent human-machine interface. People with limited upper limb function due to stroke, spinal cord injury, cerebral palsy, amyotrophic lateral sclerosis, and other conditions find the controls of assistive devices such as power wheelchairs difficult to use. Thus, the objective of this research was to design a multimodal control method for robotic self-assistance that could assist individuals with disabilities in performing self-care tasks on a daily basis. In this research, a control framework with two interchangeable operating modes, a finger joystick and a chin joystick, is developed in which either joystick seamlessly controls both a wheelchair and a wheelchair-mounted robotic arm. Custom circuitry was developed to complete the control architecture. A user study was conducted to test the robotic system. Ten healthy individuals performed three tasks with each joystick (chin and finger), for a total of six tasks with 10 repetitions each. The control method was tested rigorously, maneuvering the robot at different velocities and under varying payload (1-3.5 lb) conditions. The absolute position accuracy was experimentally found to be approximately 5 mm. The observed round-trip delay between commands while controlling the xArm was 4 ms.
Tests performed showed that the proposed control system allowed individuals to perform some ADLs, such as picking up and placing items, with a completion time of under 1 min per task and a 100% success rate.
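The abstract describes two interchangeable operating modes in which a single joystick (chin or finger) drives either the wheelchair or the robotic arm. A minimal sketch of that dispatching idea is below; the class, the mode-toggle button, and the deadband threshold are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

DEADBAND = 0.1  # illustrative threshold: ignore small stick deflections


@dataclass
class JoystickSample:
    x: float            # left/right deflection, normalized to [-1, 1]
    y: float            # forward/back deflection, normalized to [-1, 1]
    mode_pressed: bool  # hypothetical mode-toggle button


class MultimodalController:
    """Routes one joystick (chin or finger) to the wheelchair or the arm."""

    def __init__(self):
        self.mode = "wheelchair"  # assume the system starts in driving mode

    def update(self, s: JoystickSample):
        """Return the active mode and the axis command to forward to it."""
        if s.mode_pressed:
            # Toggle between the two operating modes and stop motion
            # during the switch, a common safety convention.
            self.mode = "arm" if self.mode == "wheelchair" else "wheelchair"
            return self.mode, (0.0, 0.0)
        x = s.x if abs(s.x) > DEADBAND else 0.0
        y = s.y if abs(s.y) > DEADBAND else 0.0
        return self.mode, (x, y)
```

In use, each joystick sample would be fed to `update`, and the returned command forwarded to whichever subsystem (wheelchair base or arm) is currently active.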
Keywords: 6DOF; activities of daily living; assistive robot; motor dysfunction; multimodal control; wheelchair
Year: 2022 PMID: 35937617 PMCID: PMC9354078 DOI: 10.3389/frobt.2022.885610
Source DB: PubMed Journal: Front Robot AI ISSN: 2296-9144
FIGURE 1(A) Joint coordinate definition (36, 2022). (B) Overview of Permobil M3 corpus. (C) Finger joystick. (D) Chin joystick.
Modified Denavit–Hartenberg parameters for xArm 6.
Dimensional parameters of xArm 6: 267 mm, 289.49 mm, 77.5 mm, 342.5 mm, 76 mm, 97 mm, 1.3849 rad, and 1.3849 rad.
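The dimensional parameters above can be combined with a Denavit–Hartenberg transform to compute the arm's forward kinematics. The sketch below uses the classic DH convention (the paper's table uses the modified one), and the assignment of each value to the d/a columns, the alpha column, and the joint-2/3 angle offsets follows UFactory's published xArm 6 kinematics; that mapping is an assumption here, not taken from the paper.

```python
import numpy as np

# Assumed classic-DH assignment of the dimensional parameters (mm / rad):
T1 = 1.3849  # joint-2/3 angle offset (rad)
ALPHA  = [-np.pi / 2, 0.0, -np.pi / 2, np.pi / 2, -np.pi / 2, 0.0]
A      = [0.0, 289.49, 77.5, 0.0, 76.0, 0.0]   # link lengths (mm)
D      = [267.0, 0.0, 0.0, 342.5, 0.0, 97.0]   # link offsets (mm)
OFFSET = [0.0, -T1, T1, 0.0, 0.0, 0.0]         # joint angle offsets (rad)


def dh_transform(theta, d, a, alpha):
    """Homogeneous transform for one classic DH row: Rz(theta) Tz(d) Tx(a) Rx(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])


def forward_kinematics(q):
    """Base-to-end-effector transform for joint angles q (rad, length 6)."""
    T = np.eye(4)
    for i in range(6):
        T = T @ dh_transform(q[i] + OFFSET[i], D[i], A[i], ALPHA[i])
    return T


# End-effector position (mm) at the all-zero joint configuration.
T = forward_kinematics(np.zeros(6))
print(np.round(T[:3, 3], 1))
```

The offsets reflect the elbow geometry (the 1.3849 rad values in the table), so the all-zero joint vector does not correspond to a fully stretched arm.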
FIGURE 2(A) Roll, pitch, and yaw angle. (B) Considered workspace for daily living activities.
FIGURE 3Control architecture of the system.
FIGURE 4A flowchart summarizing the program to manipulate the xArm’s end-effector position.
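The core loop summarized in the flowchart, mapping a joystick deflection to a small Cartesian step of the end-effector and keeping the target inside the considered workspace, might look like the following. The step size and the box-shaped workspace bounds are illustrative assumptions made for the sketch.

```python
# Illustrative incremental Cartesian jogging; axis values assumed in [-1, 1].
STEP_MM = 5.0  # per-cycle step; the paper reports ~5 mm absolute accuracy
# Hypothetical workspace box (mm), standing in for the workspace of Figure 2B.
WORKSPACE = {"x": (150.0, 650.0), "y": (-400.0, 400.0), "z": (50.0, 700.0)}


def jog(position, axes):
    """Advance an (x, y, z) end-effector target by one joystick sample."""
    new_pos = {}
    for axis in ("x", "y", "z"):
        target = position[axis] + STEP_MM * axes.get(axis, 0.0)
        lo, hi = WORKSPACE[axis]
        new_pos[axis] = min(max(target, lo), hi)  # clamp to the workspace box
    return new_pos
```

Each cycle, the clamped setpoint would then be sent to the arm (e.g., through the manufacturer's SDK); the sketch only computes the setpoint.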
FIGURE 5Block diagram of the experimental setup.
Characteristics of the 10 healthy participants.
| Characteristics | Value |
|---|---|
| Age (years) | |
| Mean | 27.8 |
| Standard Deviation | 2.9 |
| Gender | |
| Male | 10 |
| Female | 0 |
FIGURE 6Participants use the system while seated in the wheelchair.
FIGURE 7ADL experiments picking objects from the ground (A,E), the shelf (B), and the table (C,D).
FIGURE 8Cartesian trajectory of picking an item from (A) table and (B) ground.
FIGURE 9Angular position, torque, and speed of the six joints in the robotic arm while picking an object from the ground.
FIGURE 10Angular position, torque, and speed of the six joints in the robotic arm while picking an object from the table.
FIGURE 11Distribution of task completion time.
Distribution of completion time of activities of daily living with both controls (N = 10).
| Task | Control | Min (s) | Max (s) | Median (s) |
|---|---|---|---|---|
| Task 1 | Chin joystick | 44 | 60 | 51 |
| | Finger joystick | 38 | 57 | 48 |
| Task 2 | Chin joystick | 43 | 69 | 52.50 |
| | Finger joystick | 39 | 63 | 46 |
| Task 3 | Chin joystick | 47 | 69 | 59.50 |
| | Finger joystick | 38 | 67 | 52 |