Multi-modality EMG and Visual Based Hands-Free Control of an Intelligent Wheelchair

Lai Wei and Huosheng Hu
School of Computer Science & Electronic Engineering, University of Essex
Wivenhoe Park, Colchester CO4 3SQ, United Kingdom
lwei@essex.ac.uk, hhu@essex.ac.uk

Abstract. This paper presents a novel human machine interface for people with severe disabilities to control an electric powered wheelchair using face movements. Five face movements, including jaw clenching and eye closing movements, are identified by extracting movement features from both the forehead electromyography (EMG) signal and facial image information. A real-world indoor environment is set up for evaluating the performance of the new control method. Five subjects participated in the experiment, following designated routes on a map using the new control method as well as a traditional joystick control. The two control methods are compared in terms of ease of control, time duration, wheelchair trajectory and error command rate. Wheelchair trajectory and time consumption are recorded for each task, and the results show that the new control method is comparable to joystick control and can be used as a hands-free control interface for disabled and elderly users.

Keywords: Intelligent wheelchairs, eye detection, multi-modality, hands-free control, EMG, HMI, HCI.

H. Liu et al. (Eds.): ICIRA 2010, Part II, LNAI 6425, pp. 659-670. Springer-Verlag Berlin Heidelberg (2010)

1 Introduction

Most human machine interfaces (HMIs) and human computer interfaces (HCIs) used today are operated by hand. For people with severe physical disabilities such as spinal cord injury, quadriplegia, hemiplegia or amputation, as well as for elderly people, it is therefore necessary to develop hands-free control interfaces. Apart from hand movements, a number of other human movements can be used for HMI or HCI purposes, such as eye movement, limb movement, neck movement, jaw movement and tongue movement. For decades, these movements have been intensively studied for the control of rehabilitation devices such as mobile wheelchairs, prosthetics and assistive computers, and for many people with severe disabilities, face movements may be the last movements available for control.

In the past decade, a number of simple yet effective hands-free HMIs have been brought into application, such as sip-and-puff switches, head-mounted infrared pointers and head touch switches; these interfaces are widely used for their simplicity of control and ease of setup. At the same time, there have been studies on developing HMIs based on human physiological signals such as electromyography (EMG), electrooculography (EOG) and electroencephalography (EEG). As can be seen in the literature [1-5],

HMIs developed from these signals have been used for hands-free control of electric-powered wheelchairs. With recent advances in machine and computing capability, many HMIs have also been developed from computer vision information [11], speech information and multi-modality phenomena. Ju et al. [6] introduce a vision-based intelligent wheelchair system developed for people with quadriplegia. Li and Tan [7] propose a bimodal wheelchair control approach integrating vision and speech controls. Matsumoto et al. [8] apply the recognition of head motion and eye gaze to a locomotive wheelchair system. Ferreira et al. [9] propose an HMI structure to control a robotic wheelchair with scalp EMG and EEG signals, in which both eye blinking and eye closing movements are used as command movements to control a mobile wheelchair through an onboard PDA.

This paper presents a new hybrid HMI developed to control an electric powered wheelchair via face movements resolved from forehead EMG signals and facial image information. To test the performance of this new control strategy, an indoor experimental environment is set up and five subjects are given tasks to follow a route on a designed map with either the joystick control or the face movement control in turn. The performance of the new method is evaluated by comparing its control results with those of the joystick control in terms of time consumption, wheelchair trajectories and subjects' personal experience.

The rest of the paper is organized as follows. Section 2 describes the system architecture, which consists of three parts: a wearable sensory device, Cyberlink™ [10]; an intelligent wheelchair system; and the HMI. Control movement description, the feature selection procedure and the control strategy are presented in Section 3. Section 4 describes the designed wheelchair experiment environment and the real-world experimental results of the two control methods. Finally, a brief summary and potential future extensions of the system are given in Section 5, and an appendix with the recorded wheelchair trajectories of the five subjects is attached thereafter.

2 System Architecture

The proposed experimental system contains three parts. The first part is the data acquisition device Cyberlink [10], which is composed of 1) a data processing box and 2) a wearable headband. The second part of the system is an intelligent wheelchair platform, which is used for evaluating the real-world performance of the control system. The third part is the human machine interface, which is responsible for extracting features, classifying the selected movement patterns and mapping the classified patterns into wheelchair control commands. Details of these parts are presented in the following subsections.

2.1 Sensory Data Acquisition

The sensory system for detecting the subject's facial movements, as shown in Fig. 1, contains an EMG signal acquisition device (Cyberlink) and a Logitech S5500 web camera which is mounted at the front of the wheelchair pointing at the wheelchair user's face, as shown in Fig. 1 (left).

Fig. 1. Intelligent wheelchair setup for a subject and the indoor experiment environment (left); the Cyberlink headband and data acquisition device (right)

Fig. 1 (right) shows the composition of the EMG device, which has a data processing box and a wearable headband with three flat attachable electrodes. The electrodes are attached to the forehead by fastening the headband to the user's forehead. From the three electrodes, a one-channel EMG signal is obtained, with one positive electrode, one negative electrode and a reference electrode. The raw EMG signal amplified from the electrodes is then processed in the data acquisition box and turned into a digital, averaged EMG data sequence with a sampling frequency of 100 Hz.

2.2 The Intelligent Wheelchair

As shown in Fig. 2, the intelligent wheelchair system employed in this experiment is an electric-powered mobile wheelchair equipped with an industrial computer and a built-in motor control board. The motor control board is embedded with a digital signal processing (DSP) unit which can implement open-loop linear motor control or closed-loop PID motor control by driving the two differentially-driven wheelchair wheels. The wheelchair is also fitted with a laser range finder and a sonar sensor array, from which the onboard computer can calculate real-time range information to avoid obstacles such as boxes, people and walls. In this experiment, the obstacle avoidance function was disabled.
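The paper does not give the wheelchair's kinematic parameters or motor control code. Purely as a reference for how commands for two differentially-driven wheels are commonly derived, the following sketch (in Python) maps a desired chassis linear and angular velocity onto left and right wheel speeds; the wheel radius, track width and function name are illustrative assumptions rather than values from the actual platform.

    # Minimal sketch: standard differential-drive kinematics, turning a
    # (linear, angular) velocity command into left/right wheel speeds.
    # The geometry values are assumed for illustration only.
    WHEEL_RADIUS_M = 0.17   # assumed drive-wheel radius
    TRACK_WIDTH_M = 0.56    # assumed distance between the two drive wheels

    def wheel_speeds(v_mps, omega_radps):
        """Convert chassis linear velocity v (m/s) and angular velocity
        omega (rad/s) into left/right wheel angular velocities (rad/s)."""
        v_left = v_mps - omega_radps * TRACK_WIDTH_M / 2.0
        v_right = v_mps + omega_radps * TRACK_WIDTH_M / 2.0
        return v_left / WHEEL_RADIUS_M, v_right / WHEEL_RADIUS_M

    # Example: a gentle left turn at 0.4 m/s forward speed
    left_w, right_w = wheel_speeds(0.4, 0.3)

Whether the DSP executes such set-points open-loop or through its closed-loop PID controller does not change this mapping.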

2.3 Human-Machine Interface (HMI)

The human machine interface is responsible for extracting features, classifying the selected face movements and mapping the classified movement patterns into wheelchair control commands; the detailed procedure is depicted graphically in Fig. 4. From the EMG sensor, five designed jaw and eye movements are classified into different patterns, and together with the eye closure information detected by the camera, five face movements are identified. These movement patterns are mapped into six wheelchair control commands, i.e. Go Forward, Go Backwards, Turn Left, Turn Right, Reduce Speed and Stop, which imitate the direction and speed control of a joystick. The detailed pattern recognition process and control strategy mapping procedure, including data segmentation, feature extraction, classification and logic control, are discussed further in Section 3.

Fig. 2. Intelligent wheelchair platform employed in the experiment

3 Control Movement and Pattern Recognition

3.1 Control Movement Selection

The five designated face movements employed in this experiment are single jaw clenching (SJC), double jaw clenching (DJC), left eye close (LEC), right eye close (REC) and continuous jaw clenching (CJC). The five selected control movements are performed as follows. As shown in Fig. 4, SJC is a heavy jaw clenching movement produced by the subject holding the teeth together, pushing the jaw against the upper teeth and making a strong, fast contraction of the masseter and buccinator muscles. Similar to the teeth-holding motion in SJC, CJC is produced by making gentle, repetitive clenches; compared with SJC, CJC is a gentle and persistent jaw clenching movement. As shown in Fig. 3 and Fig. 4, a CJC movement produces a series of short, regular EMG waveforms, whereas an SJC produces a single sharp, high-amplitude waveform. DJC consists of two consecutive SJC movements. LEC and REC are movements in which one eye is open and the other closed,

starting with the closing of the left or right eye respectively. The movement state is maintained by keeping the eye closed. Unlike the gentle eye closing that occurs during blinking or sleeping, LEC and REC are heavier muscle movements and can be described as eye squeezing movements. The finishing state of the movement is marked by opening the closed eye.

3.2 EMG Pattern Classifications

In order to recognize the five control movements described above, features from the EMG signal and the facial image information are jointly extracted and classified. Fig. 3 shows the EMG waveform patterns obtained during the corresponding control movements. These EMG patterns are classified in two separate procedures, i.e. the SJC-DJC procedure and the CJC-EC procedure.

In the SJC-DJC procedure, SJC and DJC are recognized with a threshold-based strategy. An SJC is detected by measuring the uninterrupted time during which the EMG amplitude stays above a preset threshold; if this duration falls within a preset range, an SJC pattern is detected. This strategy was also used by Felzer and Freisleben [1], who used SJC movements to control a powered wheelchair.
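To make the threshold rule concrete, the following sketch detects an SJC when the averaged EMG amplitude stays above a threshold for an uninterrupted duration inside a preset window, and reports a DJC when two SJCs occur within a short interval. The threshold, timing values and function names are illustrative assumptions; the paper does not publish its actual parameters.

    # Sketch of the threshold-based SJC/DJC detection described in Section 3.2.
    # All numeric parameters are assumed for illustration.
    FS_HZ = 100                          # averaged EMG sampling rate (from the paper)
    AMP_THRESHOLD = 0.6                  # assumed (normalised) amplitude threshold
    SJC_MIN_S, SJC_MAX_S = 0.10, 0.60    # assumed valid burst-duration range
    DJC_GAP_MAX_S = 0.80                 # assumed max gap between two SJCs for a DJC

    def detect_sjc_bursts(emg, fs=FS_HZ):
        """Return end times (s) of bursts whose uninterrupted above-threshold
        duration falls inside the preset range, i.e. single jaw clenches."""
        bursts, start = [], None
        for i, x in enumerate(emg):
            if x >= AMP_THRESHOLD and start is None:
                start = i                          # burst begins
            elif x < AMP_THRESHOLD and start is not None:
                duration = (i - start) / fs        # time continuously above threshold
                if SJC_MIN_S <= duration <= SJC_MAX_S:
                    bursts.append(i / fs)
                start = None
        return bursts

    def detect_djc(sjc_times):
        """A double jaw clench is two consecutive SJCs separated by less
        than DJC_GAP_MAX_S seconds."""
        return [t2 for t1, t2 in zip(sjc_times, sjc_times[1:])
                if (t2 - t1) <= DJC_GAP_MAX_S]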

Fig. 3. Class division for EMG movement pattern classification; movement patterns are displayed over 50 samples (500 ms)

In the second procedure, a support vector machine (SVM) classifier is trained using data collected from a reference subject performing various face movements, including CJC, LEC, REC, SJC, DJC, relaxing, talking, smiling, etc. These movement patterns are assembled into three classes: Class EC, Class CJC and Class Exclude (EX). As shown in Fig. 3, Class EC contains two patterns, the EMG patterns produced during LEC and REC movements. Class CJC contains a single pattern, produced by CJC movements. Class EX contains all other patterns, such as the SJC and DJC movement patterns and the patterns produced when the subject is relaxing, talking, smiling, etc. Training data are sampled according to the movement patterns described in these three classes, and the recorded movement data are segmented into 200 ms sections for feature extraction. Eight waveform features are extracted from each data segment: five from the time domain, namely mean absolute value (MAV), root mean square (RMS), waveform length (WL), zero crossings (ZC) and slope sign changes (SSC), and three from the frequency domain, namely frequency mean (FMN), frequency median (FMD) and frequency ratio (FR). With the features extracted from the three classes, an RBF-kernel SVM classifier is selected, which gives a best classification accuracy of 93.5% under threefold cross validation.
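A minimal sketch of this second procedure is given below: the eight waveform features are computed on 200 ms EMG segments (20 samples at 100 Hz) and fed to an RBF-kernel SVM, here via scikit-learn, with three-fold cross validation as in the paper's evaluation. The exact feature definitions used in the paper (for example the frequency-ratio band split), the hyperparameters and the function names are assumptions for illustration.

    # Sketch of EMG feature extraction and RBF-SVM training for the
    # EC / CJC / EX classes of Section 3.2. Band split, hyperparameters
    # and the omission of feature scaling are illustrative choices.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    FS_HZ = 100
    SEG_LEN = int(0.2 * FS_HZ)        # 200 ms -> 20 samples per segment

    def emg_features(seg):
        """Eight features per segment: MAV, RMS, WL, ZC, SSC (time domain)
        and FMN, FMD, FR (frequency domain)."""
        seg = np.asarray(seg, dtype=float)
        mav = np.mean(np.abs(seg))                        # mean absolute value
        rms = np.sqrt(np.mean(seg ** 2))                  # root mean square
        wl = np.sum(np.abs(np.diff(seg)))                 # waveform length
        zc = np.sum(np.signbit(seg[:-1]) != np.signbit(seg[1:]))   # zero crossings
        ssc = np.sum(np.diff(np.sign(np.diff(seg))) != 0)          # slope sign changes

        power = np.abs(np.fft.rfft(seg)) ** 2
        freqs = np.fft.rfftfreq(len(seg), d=1.0 / FS_HZ)
        total = power.sum() + 1e-12
        fmn = np.sum(freqs * power) / total                         # mean frequency
        fmd = freqs[np.searchsorted(np.cumsum(power), total / 2)]   # median frequency
        fr = power[freqs < 20].sum() / (power[freqs >= 20].sum() + 1e-12)  # assumed 20 Hz split
        return [mav, rms, wl, zc, ssc, fmn, fmd, fr]

    def train_emg_classifier(segments, labels):
        """Train an RBF-kernel SVM on labelled EC/CJC/EX segments and report
        three-fold cross-validation accuracy."""
        X = np.array([emg_features(s) for s in segments])
        clf = SVC(kernel="rbf", C=1.0, gamma="scale")
        print("3-fold CV accuracy:", cross_val_score(clf, X, labels, cv=3).mean())
        return clf.fit(X, labels)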

3.3 Face Tracking and Eye Close Recognition

Alongside the classified EMG patterns, two face image patterns denoting LEC and REC movements are classified with an adaptive boosting (AdaBoost) learning approach on Haar-like features. As shown in Fig. 3, two classifiers are trained, one for detecting a closed left eye and one for a closed right eye. The training is based on closed-eye images sampled from five different subjects under varying illumination conditions, face orientations and distances to the camera. In total, 1475 closed-eye images are collected for left-eye training, while the positive image set for the right eye contains 1318 closed-eye images. Based on these images, a left-closed-eye classifier with 20 cascade stages is trained, with a classification accuracy of 91.75% validated on the sample training images. For the right-closed eye, the training process stops at 19 cascade stages with a validation accuracy of 92.2%.

3.4 Decision Fusion and Control Strategy

Fig. 4 describes the decision fusion process that maps classified EMG and visual patterns to wheelchair control commands. Six control commands, namely Go Forward (GF), Turn Left (TL), Turn Right (TR), Reduce Speed (RS), Stop (ST) and Go Backwards (GB), are employed to mimic and replace an ordinary joystick control. As depicted in Fig. 4, the control logic can be described as follows: the wheelchair goes forward and accelerates when the subject makes a CJC movement, and turns left or right while the subject keeps the right or left eye closed (LEC or REC movements). The wheelchair keeps turning until the closed eye is opened, so the subject can control how far the wheelchair turns. If an SJC movement is detected, the wheelchair drops its current speed to a safe speed and keeps going forward. A DJC movement causes the wheelchair to stop, and another DJC movement while the wheelchair is stopped makes it go backwards.

Fig. 4. Diagram showing sample face control movements and the control strategy of the EMG and visual based control method
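A minimal sketch of the command mapping just described is given below. The movement labels are assumed to be the fused output of the EMG and vision classifiers; the speed values, the update interface and the eye-to-direction pairing (which the paper's wording leaves ambiguous) are illustrative assumptions.

    # Sketch of the decision-fusion control logic of Section 3.4: one fused
    # movement label per update is mapped onto the six wheelchair commands.
    from dataclasses import dataclass

    @dataclass
    class WheelchairState:
        speed: float = 0.0        # current forward speed command, assumed units
        stopped: bool = True

    SAFE_SPEED = 0.3              # assumed "safe" speed applied after an SJC
    SPEED_STEP = 0.1              # assumed acceleration per CJC update
    MAX_SPEED = 0.8               # assumed per-subject speed limit

    def fuse_and_map(movement, state):
        """Map one classified movement (CJC, LEC, REC, SJC, DJC or None)
        to a command: GF, TL, TR, RS, ST or GB."""
        if movement == "CJC":                     # gentle repetitive clenching
            state.speed = min(state.speed + SPEED_STEP, MAX_SPEED)
            state.stopped = False
            return "GF"                           # go forward and accelerate
        if movement == "LEC":                     # one eye held closed: keep turning
            return "TL"                           # until it opens (pairing assumed)
        if movement == "REC":
            return "TR"
        if movement == "SJC":                     # single strong clench
            state.speed = min(state.speed, SAFE_SPEED)
            return "RS"                           # reduce to a safe speed, keep going
        if movement == "DJC":                     # double clench
            if state.stopped:
                return "GB"                       # second DJC while stopped: reverse
            state.speed, state.stopped = 0.0, True
            return "ST"                           # first DJC: stop
        return "ST" if state.stopped else "GF"    # no command movement detected

    # Example: state = WheelchairState(); fuse_and_map("CJC", state) -> "GF"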

4 Real-World Experiment Setup and Results

4.1 Experiment Setup

To test the performance of the hybrid (EMG and visual based) control method, an indoor experiment environment was set up. Fig. 5 shows the schematic configuration of the map; the size of the map is 5600 x 4100 mm and the actual size of the wheelchair is 1200 x 800 mm. The environment is supplied with ample light, corresponding to an indoor fluorescent illumination condition. Five able-bodied subjects are involved in the experiment, and each subject is asked to follow a designed route (depicted in Fig. 5) from Docking Area A to Docking Area B using either the joystick control or the hybrid control method. There are ten tasks from A to B to be completed for each control method. For each task, the time consumption and the trajectory of the wheelchair are recorded. The position of the wheelchair is tracked with a VICON tracking system by attaching identifiable markers to the wheelchair (as shown in Fig. 1, left).

During the experiment, we found that the layout and size of the map can influence the experiment results for the different control methods. In order to apply equal rules to both control methods, the map is designed to have adequate space for the user to maneuver, and the route for each task exercises the full set of control functions, including two alternating left and right turns, following straight lines and docking into narrow places. For the hybrid control method, the wheelchair speed (going forward and turning) is tuned for each subject to an optimized condition so that the subject can perform as well as possible with the control method on the map. As the joystick control interface is mature and universal, its acceleration and its angular and linear velocities are adjustable from zero up to the maximum the wheelchair can deliver.

Fig. 5. Map designed for the indoor experiment and the route for subjects to follow during the wheelchair control tasks (docking areas A and B, wood box barrier, pitch boundary and planned routes)

4.2 Experiment Results and Analysis

Fig. 6 to Fig. 10 in the appendix show the experimental results for the five subjects, from Subject A to Subject E. Each diagram shows twenty recorded time durations corresponding to twenty tasks, of which ten were completed with the hybrid control and the other ten with the joystick control. For all five subjects completing the same tasks, the joystick control took less time than the facial movement based control. Compared with the new hybrid control method, joystick control is smoother and steadier, while the new hybrid control method costs more time and its performance varies considerably between tasks for the same subject. Fig. 6 and Fig. 10 show that for some subjects the new control method performs comparatively poorly.
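The paper does not describe how the VICON logs were post-processed into the reported task times and trajectories. As an illustration of how such metrics might be derived, the sketch below computes task duration and path length from timestamped planar positions; the (t, x, y) log format in millimetres is an assumption.

    # Sketch: task duration and path length from logged wheelchair positions.
    # Assumed log format: a time-ordered list of (t_seconds, x_mm, y_mm).
    import math

    def task_metrics(samples):
        """Return (duration in s, path length in m) for one recorded task."""
        duration = samples[-1][0] - samples[0][0]
        path_mm = sum(
            math.hypot(x2 - x1, y2 - y1)
            for (_, x1, y1), (_, x2, y2) in zip(samples, samples[1:])
        )
        return duration, path_mm / 1000.0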

The individual differences in EMG features and image features can affect performance and make the new control method unsteady and difficult to handle. A control experience survey of the five subjects also shows that the new method is more difficult to operate than the traditional joystick control, especially during persistent turning actions; subjects feel fatigued when holding an eye closing and squeezing movement for up to ten seconds.

5 Conclusion and Future Work

This paper provides a solution for a face movement based wheelchair control interface that combines features from both forehead EMG data and facial images. Five subjects participated in the test experiments in order to evaluate the performance of the new interface. From the experimental results, we conclude that: (1) computer vision and facial EMG based control can be used effectively for real-world, real-time control thanks to its fast response, but its control complexity, error rate and muscle fatigue should be improved; (2) although the trajectories resulting from the face movement based control are more scattered and irregular than those from the traditional joystick control, the two systems are comparable; (3) by combining multi-modality sensor information from both forehead EMG signals and facial images, facial motion noise such as eye blinks and random facial expressions including laughing, smiling and talking is effectively rejected during control.

Our future work will improve the current control method in two ways. One is to keep the current sensor modalities and investigate more robust and distinctive features and movement patterns in the existing EMG signal and image information. The other is to include more sensory modalities and develop a more complex control system with manifold sensor combinations.

References

[1] Felzer, T., Freisleben, B.: HaWCoS: The Hands-free Wheelchair Control System. In: Proceedings of the International ACM SIGACCESS Conference on Computers and Accessibility, pp. 127-134. ACM Press, New York (2002)
[2] Moon, I., Lee, M., Chu, J., Mun, M.: Wearable EMG-based HCI for electric-powered wheelchair users with motor disabilities. In: Proceedings of the IEEE International Conference on Robotics and Automation, pp. 2649-2654 (2005)
[3] Barea, R., Boquete, L., Mazo, M., Lopez, E.: System for assisted mobility using eye movements based on electrooculography. IEEE Transactions on Neural Systems and Rehabilitation Engineering 10, 209-218 (2002)
[4] Rebsamen, B., Burdet, E., Guan, C., Zhang, H., Teo, C.L., Zeng, Q., Laugier, C., Ang Jr., M.H.: Controlling a Wheelchair Indoors Using Thought. IEEE Intelligent Systems 22(2), 18-24 (2007)
[5] Tsui, C.S.L., Jia, P., Gan, J.Q., Hu, H., Yuan, K.: EMG-based hands-free wheelchair control with EOG attention shift detection. In: Proceedings of the IEEE International Conference on Robotics and Biomimetics, pp. 1266-1271 (December 2007)

[6] Ju, J.S., Shin, Y., Kim, E.Y.: Vision based interface system for hands free control of an intelligent wheelchair. Journal of NeuroEngineering and Rehabilitation 6 (2009)
[7] Li, X., Tan, T., Zhao, X.: Multi-modal Navigation for Interactive Wheelchair. In: Tan, T., Shi, Y., Gao, W. (eds.) ICMI 2000. LNCS, vol. 1948, pp. 590-598. Springer, Heidelberg (2000)
[8] Matsumoto, Y., Ino, T., Ogasawara, T.: Development of Intelligent Wheelchair System with Face and Gaze Based Interface. In: Proceedings of the 10th IEEE Int. Workshop on Robot and Human Communication (ROMAN 2001), pp. 262-267 (2001)
[9] Ferreira, A., Silva, R.L., Celeste, W.C., Bastos Filho, T.F., Sarcinelli Filho, M.: Human machine interface based on muscular and brain signals applied to a robotic wheelchair. Journal of Physics: Conference Series 90, 012094 (2007)
[10] Cyberlink Brainfingers, Hands-free Computer Access Solution, http://www.brainfingers.com/
[11] Jia, P., Hu, H., Lu, T., Yuan, K.: Head gesture recognition for hands-free control of an intelligent wheelchair. Industrial Robot 34(1), 60-68 (2007)

Appendix

The following figures show the trajectories recorded in the real-world experiments for the five subjects, alphabetically from Subject A to Subject E.

Fig. 6. Trajectories recorded for Subject A while completing four designed tasks: upper left and upper right (EMG and visual based control); lower left and lower right (joystick control)

Fig. 7. Trajectories recorded for Subject B while completing four designed tasks: upper left and upper right (EMG and visual based control); lower left and lower right (joystick control)

Fig. 8. Trajectories recorded for Subject C while completing four designed tasks: upper left and upper right (EMG and visual based control); lower left and lower right (joystick control)

Fig. 9. Trajectories recorded for Subject D while completing four designed tasks: upper left and upper right (EMG and visual based control); lower left and lower right (joystick control)

Fig. 10. Trajectories recorded for Subject E while completing four designed tasks: upper left and upper right (EMG and visual based control); lower left and lower right (joystick control)