Aura Ximena González-Cely1, Mauro Callejas-Cuervo2, Teodiano Bastos-Filho1. 1. Postgraduate Program in Electrical Engineering, Federal University of Espírito Santo, Av. Fernando Ferrari, 514, Vitoria 29075-910, Brazil. 2. Software Research Group, Universidad Pedagógica y Tecnológica de Colombia, Av. Central del Norte 39-115, Tunja 150001, Colombia.
Abstract
A prototype that simulates a wheelchair was built using commercial electronic devices and custom software, with the aim of operating the prototype through head movements and analyzing the system response. The controllers were simulated using a MATLAB® toolbox and Python™ libraries. The mean response time of the system with manual control was 37.8 s; the mean orientation control response with constant speed was 36.5 s, and with variable speed 44.2 s, over a specific route. The variable-speed response is slower than the constant-speed one due to head-motion error. The system was rated as "very good" by 10 participants using the System Usability Scale (SUS).
The prototype was designed taking into account the characteristics of a wheelchair that transports a person with physical disabilities. It comprises sensors, a control unit, a communication system, actuators, a frame, and a head motion capture system. The system inputs and outputs, control units, and communication elements are shown in Fig. 1. The blue blocks represent the elements of the wheelchair prototype, and the green blocks represent external elements such as the computer and the cap on which the motion capture system was mounted.
Fig. 1
Block diagram of the electronic devices located on the wheelchair prototype.
Sensors: The prototype has sensors to measure distance, speed, and orientation. An HC-SR04 ultrasonic module measures the distance between the prototype and nearby obstacles; these modules are located on the front and sides of the prototype. The speed sensor is an HC-020K optical encoder mounted on the wheels, which counts wheel turns to obtain the angular velocity of the system. The movement sensor is an MPU-6050, used to determine the prototype orientation and detect possible risks of collision or inclination that could affect its stability.

Actuators: The actuators are two 25 mm DC gearmotors with a nominal speed of 20 rpm and a gear ratio of 217.0:1. The motor model is 225–801 from Makeblock, with a maximum power output of 1090 mW. The actuators are connected to an L298N driver used to control the speed and direction of the wheels.

Control unit: The control unit of the prototype is a Teensy 3.2 microcontroller from PJRC, which has an ARM Cortex-M4 processor, 256 KB of Flash memory, 64 KB of Random Access Memory (RAM), and 2 KB of Electrically-Erasable Programmable Read-Only Memory (EEPROM). It provides 34 digital input/output pins and twelve Pulse Width Modulation (PWM) output pins. Its communication interfaces comprise three serial ports, one Serial Peripheral Interface (SPI), and two Inter-Integrated Circuit (I2C) ports. The microcontroller is programmable using Arduino libraries. These elements are shown in the block diagram of Fig. 2.
Fig. 2
Elements of the wheelchair prototype: actuators, unity control and sensors.
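The angular velocity reported by the optical encoder can be illustrated with a short calculation; the pulse count per wheel revolution below is a hypothetical value, since the actual figure depends on the HC-020K encoder disc used.

```python
# Illustrative conversion of optical-encoder pulses to wheel speed in RPM.
# PULSES_PER_REV is hypothetical; the real value depends on the encoder disc.
PULSES_PER_REV = 20

def rpm_from_pulses(pulse_count, window_s):
    """Angular speed in RPM from pulses counted over a time window (seconds)."""
    revolutions = pulse_count / PULSES_PER_REV
    return revolutions / window_s * 60.0

# 10 pulses in 0.5 s -> 0.5 revolution in 0.5 s -> 60 RPM
print(rpm_from_pulses(10, 0.5))
```

The microcontroller applies an equation of this form to the raw pulse counts, as described in the Electronics section below.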
Frame: The structure has four wheels, a base, and caterpillar tracks. It was built from the Makeblock Starter Kit Bluetooth 90020 and is shown in Fig. 3.
Fig. 3
Structure used for prototype implementation of the system.
Communication system: The system uses the User Datagram Protocol (UDP), which exchanges datagrams over the network. In the Open Systems Interconnection (OSI) model, UDP sits in the transport layer. UDP provides data transport with low overhead because its datagram header is small and it generates no network management traffic. The protocol interfaces between the network and application layers of the OSI model. It does not guarantee that a message will be delivered, but it favors speed in sending and receiving data [14]. The communication link transfers the head motion angles from the head motion capture system to the computer. The wheelchair prototype both sends and receives information: it sends sensor data such as distance, speed, movement angles, and temperature, and it receives control actions to operate it. Fig. 4 shows the bidirectional communication between the prototype and the computer and the unidirectional communication between the head motion capture system and the computer.
Fig. 4
Bidirectional communication between the prototype and the computer and unidirectional communication between the head motion capture system and the computer.
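The UDP exchange described above can be sketched with Python's standard socket module. This is a minimal loopback demonstration, not the prototype's actual network configuration: the address and the echoed payload stand in for the real WiFi addresses and sensor messages.

```python
import socket

# Minimal UDP datagram exchange on the loopback interface; the port is
# chosen by the OS and stands in for the prototype's address on the network.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))          # let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"@", addr)                # "@" is the sensor-data request command

data, _ = receiver.recvfrom(1024)        # a datagram arrives whole or not at all
print(data)                              # -> b'@'

sender.close()
receiver.close()
```

Because UDP is connectionless, no handshake precedes `sendto`; this is what gives the protocol its low overhead at the cost of delivery guarantees.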
Head motion capture system configuration: The user activates the control system through a Graphical User Interface (GUI) and starts the different control types using the head motion capture system. The capture system generates Euler angles (pitch, yaw, and roll) and sends them to the computer, which processes them and produces control actions to operate the wheelchair prototype. The capture system also performs an initial configuration and data calibration before sending the Euler angles and connecting to the WiFi network. In addition, it has a power system to recharge it when its red light turns off. Fig. 5 shows the different head movements made by the user and the direction of the wheelchair prototype in response to each movement.
Fig. 5
Head motion to operate the wheelchair prototype in seven directions: (a) backward movement; (b) left movement; (c) stop; (d) right movement; (e) forward movement; (f) right-back movement; (g) left-back movement.
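A mapping from head angles to the seven directions of Fig. 5 can be sketched as a simple threshold classifier. The ±15° thresholds and the sign conventions below are illustrative assumptions, not the calibrated values used by the system.

```python
# Hypothetical mapping from head Euler angles (degrees) to a drive direction.
# Thresholds and sign conventions are illustrative, not the calibrated ones.
def head_to_direction(pitch, roll, threshold=15.0):
    backward = pitch > threshold          # head tilted back (assumed sign)
    if roll > threshold:                  # head tilted to the right
        return "right-back" if backward else "right"
    if roll < -threshold:                 # head tilted to the left
        return "left-back" if backward else "left"
    if pitch < -threshold:                # head tilted forward
        return "forward"
    if backward:
        return "backward"
    return "stop"                         # neutral head position

print(head_to_direction(0, 0))    # -> stop
print(head_to_direction(-20, 0))  # -> forward
print(head_to_direction(20, 20))  # -> right-back
```

A small dead zone around the neutral position, as implied by the stop posture in Fig. 5(c), prevents involuntary head tremor from triggering movement.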
Design files
The design files are divided into hardware and software files. The hardware section presents the schematics and PCB layouts of the system, and the software section describes the main algorithms of the fuzzy controllers, graphical interface, and communication using block diagrams for better comprehension.
Electronics
The schematic of each component is described below. The schematics of the ultrasonic sensor, optical encoder, movement sensor, WiFi module, and microcontroller inputs and outputs are shown in Fig. 6, Fig. 7, Fig. 8, Fig. 9, and Fig. 10. Fig. 6 illustrates the HC-SR04 sensor, which operates at 5 V.
Fig. 6
HC-SR04 ultrasonic sensor schematic made in EAGLE.
Fig. 7
Optical encoder schematic made in EAGLE.
Fig. 8
MPU6050 movement sensor schematic made in EAGLE.
Fig. 9
WiFi module ESP8266 schematic made in EAGLE.
Fig. 10
Teensy 3.2 microcontroller with inputs and outputs.
The schematic of Fig. 7 illustrates the connections of the speed sensor at 5 V and its analog output signal, which the microcontroller processes with an equation to calculate the speed in RPM. The output signal requires a voltage divider before reaching a microcontroller input.

The movement sensor sends and receives data using the I2C protocol through the SDA (data line) and SCL (clock line) pins. The sensor reports the prototype movement, and the microcontroller processes it to obtain the Euler angles that describe this movement on three axes. The schematic is shown in Fig. 8.

The WiFi module operates at 3.3 V and uses serial communication through the RX (reception) and TX (transmission) pins. The module also has a reset pin controlled by the microcontroller to reset the communication. The schematic is shown in Fig. 9.

The microcontroller handles the input and output signals of the sensors and actuators. The schematic is shown in Fig. 10.

The L298N motor driver operates at 5 V; its input signals are at 3.3 V because that is the logic level the microcontroller generates. The inputs comprise an enable signal for each motor and two signals per motor that set its direction of rotation. The PCB layout was generated on a single layer following the schematics above. The PCB contains the voltage dividers for the optical encoder and ultrasonic sensors. Fig. 11 shows the PCB layout of the wheelchair prototype.
Fig. 11
PCB Layout of the wheelchair prototype using EAGLE.
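The voltage dividers mentioned above scale the 5 V sensor outputs toward the Teensy's 3.3 V logic level. The resistor values below are illustrative, not necessarily those on the PCB.

```python
# Resistor divider that scales a 5 V sensor output toward the
# microcontroller's 3.3 V logic level; resistor values are illustrative.
def divider_output(v_in, r_top, r_bottom):
    """Output voltage of a two-resistor divider (r_bottom across the input pin)."""
    return v_in * r_bottom / (r_top + r_bottom)

# 1 kOhm over 2 kOhm divides 5 V to about 3.33 V, within the 3.3 V logic range
print(round(divider_output(5.0, 1000, 2000), 2))
```

Any ratio close to 2:3 works; the absolute resistance mainly trades current draw against noise immunity.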
Software and firmware
The software developed in this design includes the communication algorithm, the GUI algorithm for the manual and fuzzy controllers, and the control algorithm, all written in Python™. Table 1 shows the commands sent to the microcontroller for each desired direction and speed.
Table 1
Commands sent to the prototype to establish speed and direction.
Command | Speed [PWM] | Direction
@ | – | –
0 | 255 | Forward
1 | 255 | Right
2 | 255 | Left
3 | 0 | Stop
4 | 255 | Backward
5 | 100 | Forward vel1
6 | 140 | Forward vel2
7 | 180 | Forward vel3
8 | 210 | Forward vel4
9 | 240 | Forward vel5
a | 255 | Turn on its axis
b | 100 | Right vel1
c | 140 | Right vel2
d | 180 | Right vel3
e | 210 | Right vel4
f | 240 | Right vel5
g | 100 | Left vel1
h | 140 | Left vel2
i | 180 | Left vel3
j | 210 | Left vel4
k | 240 | Left vel5
l | 100 | Backward vel1
m | 140 | Backward vel2
n | 180 | Backward vel3
o | 210 | Backward vel4
p | 240 | Backward vel5
q | 255 | Backward right
r | 255 | Backward left
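On the computer side, Table 1 naturally maps onto a lookup structure. This sketch shows the idea with a subset of rows; the remaining entries follow the same pattern, and the fallback to "stop" for unknown commands is an assumption, not a documented behavior.

```python
# Table 1 as a lookup: single-character command -> (PWM speed, direction).
# Only a subset of rows is shown; the full table follows the same pattern.
COMMANDS = {
    "0": (255, "forward"),
    "1": (255, "right"),
    "2": (255, "left"),
    "3": (0,   "stop"),
    "4": (255, "backward"),
    "5": (100, "forward vel1"),
    "p": (240, "backward vel5"),
}

def decode(cmd):
    # Assumed fail-safe: an unknown command stops the chair
    return COMMANDS.get(cmd, (0, "stop"))

print(decode("3"))   # -> (0, 'stop')
```

Keeping the command table in one place makes it easy to keep the GUI, the fuzzy controller output, and the firmware in agreement.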
Communication algorithm: The communication algorithm between the computer and the prototype is described by the block diagram in Fig. 12, where the command "@" is sent to request the data of the sensors located on the prototype.
Fig. 12
Block diagram of the communication algorithm between the computer and the prototype.
GUI algorithm: The GUI algorithm in Fig. 13 shows the main window of the application with the manual control; it imports the control and graphical libraries and lets the user choose the control type to execute. In the manual case, the user chooses the desired speed and operates the prototype using the buttons of the interface.
Fig. 13
Block diagram of the GUI algorithm using manual control to operate the wheelchair prototype.
Fuzzy controllers algorithm: The fuzzy controller was built from the Euler angles, using the maximum ranges of head motion to define the membership functions. The universe of discourse was defined, and the algorithm maps the fuzzy outputs to commands that are sent to the microcontroller to operate the wheelchair prototype. The algorithm is shown in Fig. 14.
Fig. 14
Block diagram of the fuzzy algorithm using Python™.
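The mechanics of such a controller can be sketched in plain Python. This is a minimal Sugeno-style example, not the authors' controller: the yaw universe, the triangular set boundaries, and the crisp steering consequents are all illustrative assumptions.

```python
# Minimal Sugeno-style fuzzy sketch (illustrative, not the paper's controller):
# a head-yaw angle is fuzzified with triangular sets, and the steering output
# is the firing-strength-weighted average of crisp rule consequents.
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def steering(yaw):
    # Fuzzy sets over an assumed head-yaw universe of [-60, 60] degrees
    left   = tri(yaw, -60, -30, 0)
    center = tri(yaw, -30,   0, 30)
    right  = tri(yaw,   0,  30, 60)
    # Crisp rule consequents: steering command in [-1, 1]
    num = left * -1.0 + center * 0.0 + right * 1.0
    den = left + center + right
    return num / den if den else 0.0

print(steering(15))   # halfway between "center" and "right" -> 0.5
```

A Mamdani design with centroid defuzzification, as provided by scikit-fuzzy or the MATLAB® toolbox, follows the same fuzzify-infer-defuzzify pattern with fuzzy rather than crisp consequents.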
Design files summary
The system has hardware and software design files. The hardware files cover the instrumentation, communication, and control actions for the wheelchair. The software files include the manual and fuzzy controller algorithms and the GUI application. For more details, Callejas-Cuervo et al. [15] present the design and implementation of the controllers and the results of the system, available at https://data.mendeley.com/datasets/ys9s9pgvbg/1.
Bill of materials
The bill of materials is shown in Table 2.
Table 2
Characteristics and prices of the materials used for the wheelchair prototype construction.
Component | Description | Quantity | Cost per unit (USD) | Total cost (USD) | Source of material
Microcontroller Teensy® 3.2 (PJRC) | Controller that communicates with Inertial Measurement Units (IMUs) and other sensors and actuators | 1 | $31 | $31 | https://www.pjrc.com/store/teensy32.html
Inertial Measurement Unit InvenSense MPU9150 | Sensor used for head motion capture | 2 | $24 | $48 | https://learn.sparkfun.com/tutorials/mpu-9150-hookup-guide
Inertial sensor MPU6050 | Sensor used for measuring wheelchair movement | 1 | $2 | $2 | https://www.sparkfun.com/products/11028
Ultrasonic Sensor HC-SR04 | Sensor used for measuring distances and detecting obstacles | 3 | $2 | $6 | https://www.sparkfun.com/products/15569
WiFi Module ESP8266 | Wireless communication module between the prototype and computer | 1 | $4 | $4 | https://www.sparkfun.com/products/17146
Optical Encoder | Sensor used for measuring the speed of the prototype | 1 | $7 | $7 | https://www.sparkfun.com/products/12629
Driver L298N DC motors | On/Off controller to operate the DC motors of the prototype | 1 | $3 | $3 | https://www.sparkfun.com/products/15451
DC Gearmotor | DC motor to drive the prototype | 2 | $11 | $22 | https://www.sparkfun.com/products/15277
Lithium Batteries 1000 mAh | Energy system | 1 | $31 | $31 | https://www.sparkfun.com/products/11856
Robotic Structure | Physical structure that simulates a wheelchair prototype | – | – | – | –

The materials used for the application can be changed to obtain better results; for example, the distance sensor can be replaced to improve the measurement error.
Build instructions
The build instructions are divided into step-by-step construction instructions for the software and hardware of the system.

Software: The software was developed in Python™, starting from examples such as https://scikit-fuzzy/auto_examples/plot_tipping_problem_newapi.html for the fuzzy controllers. Multiparallelism was the method used, specifically threads, to obtain a real-time response, since the sensor readings and the fuzzy control output have to occur at the same time. The threads implementation is explained at https://docs.python.org/3/library/threading.html. The GUI was made using the PyQt5 library for Python™; examples of implementation and basic commands are shown at https://www.guru99.com/pyqt-tutorial.html. The fuzzy controllers were also developed using the "Fuzzy Logic Designer" toolbox of MATLAB®, and the simulation analysis was based on those results. The toolbox example is shown at https://www.mathworks.com/help/fuzzy/building-systems-with-fuzzy-logic-toolbox-software.html.

Hardware: The build instructions follow the schematics presented above. The assembly of the board was made in EAGLE, and the PCB was etched with acid to remove the unnecessary copper and expose the printed circuit. The resulting board is shown in Fig. 15. The assembly also includes placing the sensors appropriately on the prototype.
Fig. 15
Resulting PCB board of the wheelchair prototype.
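The thread-based parallelism described in the software instructions can be sketched as two cooperating threads: one periodically producing sensor readings and one consuming them to emit control actions. Queue contents, rates, and the distance threshold are illustrative.

```python
import threading
import queue
import time

# Sketch of the multithreading described above: one thread simulates periodic
# sensor reads, another consumes them to produce control commands.
readings = queue.Queue()

def sensor_loop(stop):
    while not stop.is_set():
        readings.put({"distance_cm": 42})     # placeholder sensor sample
        time.sleep(0.01)

def control_loop(stop, out):
    # Keep running until asked to stop AND the queue has been drained
    while not stop.is_set() or not readings.empty():
        try:
            sample = readings.get(timeout=0.05)
        except queue.Empty:
            continue
        out.append("stop" if sample["distance_cm"] < 20 else "forward")

stop = threading.Event()
actions = []
threads = [threading.Thread(target=sensor_loop, args=(stop,)),
           threading.Thread(target=control_loop, args=(stop, actions))]
for t in threads:
    t.start()
time.sleep(0.1)                               # let both loops run briefly
stop.set()
for t in threads:
    t.join()
print(actions[0])                             # -> forward
```

A thread-safe queue decouples the two rates, so a slow fuzzy evaluation never blocks the sensor sampling, which is the point of the design.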
Operation instructions
The operation instructions are illustrated in Fig. 16. The user turns on the device using a switch located on the prototype. The communication system is activated by the algorithm explained above. The prototype generates a WiFi network; the user connects the computer to this network and runs the application, which contains the algorithms for the graphical interface, the manual and fuzzy controllers, and the response-time graphics. The application has a main window where the user selects the type of control. The main window has buttons, and the user chooses a velocity only if the prototype is to be driven with the manual control. On the left side of the window there are buttons for the position, speed, and orientation controllers. The user can choose a type of control and return to the main window by closing the window that was opened. Fig. 17 shows the main window of the application.
Fig. 16
Block diagram for the application use.
Fig. 17
Main window of the graphical interface.
The user sets the speed range by typing a number between 0 and 5 on the computer keyboard. After pressing Enter, the user can select directions through the interface buttons: forward, left, right, backward, back-right, back-left, and stop. If the user needs another type of control, there are six options on the left-hand side of the GUI. Fig. 18 shows the new window that opens when the user selects a control type other than manual.
Fig. 18
New window generated when the user selects one controller different to manual control.
Validation and characterization
Experimental tests were conducted with 10 participants, including one person with reduced mobility in the lower limbs. Each user operated the system through the GUI shown on a computer. Fig. 19 shows the head motion capture system and the wheelchair prototype described in this work, illustrating the head movement and the prototype location.
Fig. 19
Use of the system with a person without movement in lower limbs.
The electronic validation was based on the response times of the different controllers implemented. Table 3 shows the response times of the orientation controllers and the manual controller.
Table 3
Response time of the manual and orientation controllers.
Participant | Manual control (s) | Orientation control, constant speed (s) | Orientation control, variable speed (s)
1 | 40 | 27 | 32
2 | 38 | 29 | 34
3 | 36 | 24 | 32
4 | 45 | 60 | 90
5 | 22 | 31 | 38
6 | 44 | 34 | 52
7 | 22 | 27 | 24
8 | 45 | 40 | 42
9 | 34 | 35 | 36
10 | 52 | 58 | 62
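The mean response times reported below can be reproduced directly from the per-participant values in Table 3:

```python
from statistics import mean

# Response times in seconds per participant, transcribed from Table 3
manual   = [40, 38, 36, 45, 22, 44, 22, 45, 34, 52]
constant = [27, 29, 24, 60, 31, 34, 27, 40, 35, 58]
variable = [32, 34, 32, 90, 38, 52, 24, 42, 36, 62]

print(mean(manual), mean(constant), mean(variable))   # -> 37.8 36.5 44.2
```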
The response time corresponds to one route in the shape of a Bernoulli lemniscate, driven by each user. The manual control response is slow because the user presses buttons instead of making head movements. The mean velocity used is the maximum velocity of the prototype, and the degrees of freedom of the prototype are limited. From the results of Table 3, the mean response time with manual control was 37.8 s, the mean orientation control response with constant speed was 36.5 s, and the mean orientation control response with variable speed was 44.2 s.

Fuzzy logic controllers were built for position, speed, and orientation. Qamar et al. [16] developed a fuzzy controller with orientation inputs whose output variables were speed and steering; its response time is not comparable with this research because their implementation ran on an electric-powered wheelchair. Onishi et al. [17] controlled the direction and/or speed of an electric-powered wheelchair applying fuzzy logic with a head-mounted display, using tilt-angle and tilt-angular-velocity movements as the fuzzy variables. Their users answered a questionnaire about the maneuverability, comfort, and safety of the wheelchair system, but the authors do not describe the response time or the electrical validation of the development. Future work for this research includes implementing the system on an electric-powered wheelchair and comparing it with other technologies, considering the response time over specific routes and the electronic response of the different controllers. Authors who developed other types of fuzzy controllers for position or speed, such as [11], [12], [13], did not consider the fusion of fuzzy controllers to operate a wheelchair or robot.
The objective of this research was to fuse controllers in a proof of concept to control the wheelchair directions. The literature review covered different approaches to wheelchair control using head motion, instrumentation, and control techniques. Ruzaij et al. [3], [4], [5] developed a system using both a prototype and a wheelchair; their system uses motors that need more power to be activated, and their physical wheelchair had a reaction time of 100 ms. Those values cannot be compared directly with our results because our system was tested on a prototype. Marins et al. [2] used an Arduino Uno for extracting and processing data and MATLAB® for data classification; using MATLAB® for this type of control requires a computer, and the system cannot be migrated to another control unit, whereas our system can use any type of control unit. Gomes et al. [7] used similar components for their wheelchair prototype, except for laser sensors, which are more accurate than ultrasonic sensors. Mahmud et al. [9] used a Raspberry Pi for data processing and an Arduino Nano for processing hand movements. Finally, Kader et al. [10] used another communication system, sending SMS through a GSM modem.
Conclusion
A device that simulates a wheelchair prototype operated through head movements was built, together with a graphical interface. The system was made to evaluate six controllers: manual, position, speed, orientation with constant speed, orientation with variable speed, and orientation with obstacle detection. The hardware comprises commercial components, and the software was built with freely available toolkits and libraries documented in GitHub repositories or in the MATLAB® documentation. The reaction time of the system was 100 ms, and each controller obtained good response times over the route established for the experiments. Our development uses a graphical interface on the computer for monitoring and control, designed for the best visibility and use of resources by the user. The system is portable and can use any type of control unit.

Ethics statements

The informed consent forms of the participants can be obtained by writing an e-mail to gis@uptc.edu.co.
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Hardware name
Wheelchair prototype controlled by head movements through a graphic user interface
Subject area
Engineering and assistance technology; Rehabilitation; Robotics; Sensors; Intelligent control
Hardware type
Field measurements and sensors; Electronic engineering and computer science