Head movement and facial expression-based human-machine interface for controlling an intelligent wheelchair

Int. J. Biomechatronics and Biomedical Robotics, Vol. 3, No. 2, 2014

Head movement and facial expression-based human-machine interface for controlling an intelligent wheelchair

Ericka Janet Rechy-Ramirez* and Huosheng Hu
School of Computer Science and Electronic Engineering, University of Essex, Colchester CO4 3SQ, UK
*Corresponding author

Abstract: This paper presents a human-machine interface (HMI) for hands-free control of an electric powered wheelchair (EPW) based on head movements and facial expressions detected using the gyroscope and the Cognitiv suite of an Emotiv EPOC device, respectively. The proposed HMI provides two control modes: 1) control mode 1 uses four head movements to display in its graphical user interface the control commands that the user wants to execute and one facial expression to confirm their execution; 2) control mode 2 employs two facial expressions for turning and forward motion, and one head movement for stopping the wheelchair. Therefore, both control modes offer hands-free control of the wheelchair. Two subjects have used the two control modes to operate a wheelchair in an indoor environment. Five facial expressions have been tested in order to determine whether the users can employ different facial expressions for executing the commands. The experimental results show that the proposed HMI is reliable for operating the wheelchair safely.

Keywords: human machine interface; HMI; head movement; facial expression; Emotiv sensor; wheelchair.

Reference to this paper should be made as follows: Rechy-Ramirez, E.J. and Hu, H. (2014) 'Head movement and facial expression-based human-machine interface for controlling an intelligent wheelchair', Int. J. Biomechatronics and Biomedical Robotics, Vol. 3, No. 2, pp.80-91.

Biographical notes: Ericka Janet Rechy-Ramirez received her BSc in Administrative Computer Systems from the University of Veracruz, Mexico in 2004 and her MSc in Computer Science from the National Laboratory of Advanced Computing (LANIA) Educative Centre, Mexico. She is currently a PhD student at the University of Essex, financially supported by the Mexican Government. Her current research interests are in the areas of human-machine interfaces, myoelectric control and machine learning.

Huosheng Hu is a Professor in the School of Computer Science and Electronic Engineering, University of Essex, UK, leading the Robotics Research Group. His research interests include autonomous mobile robots, human-robot interaction, multi-robot collaboration, embedded systems, pervasive computing, sensor integration, intelligent control and networked robotics. He has published over 380 papers in journals, books and conferences, and has received a few best paper awards. He is the Editor-in-Chief of the International Journal of Automation and Computing, Founding Editor of the Robotics Journal and Executive Editor of the International Journal of Mechatronics and Automation.

1 Introduction

The majority of electric powered wheelchairs (EPWs) are controlled manually by a joystick and cannot be used by disabled and elderly people who suffer from spinal cord injuries, tetraplegia or amputation of their limbs. Hands-free control alternatives are required for these people to operate wheelchairs. Therefore, in order to fulfil this need, various human-machine interfaces (HMIs) have been developed using different types of bio-signals, e.g., the electromyography (EMG) signal, the electrooculography (EOG) signal and the electroencephalography (EEG) signal, as well as head movements.
Jia et al. (2007) developed a visual-based HMI for controlling a wheelchair by recognising head movements of the user. Gajwani and Chhabria (2010) performed eye tracking and blink detection with a camera mounted on a cap to control a wheelchair. The common limitation of these visual HMIs is that their performance is likely to be affected by

environmental noise such as illumination, brightness and the camera position. Other HMIs have used EMG signals to detect shoulder movements (Han et al., 2003; Moon et al., 2005) or facial expressions (Felzer and Freisleben, 2002; Firoozabadi et al., 2008; Tamura et al., 2010; Tsui et al., 2007) in order to achieve hands-free control of a wheelchair. Wei and Hu (2010) used eye winking and jaw clenching movements to provide the control commands to the wheelchair, integrating two types of information: EMG signals and facial images. Bergasa et al. (2000) identified head movements, lip hiding and eye winking by using a 2D face tracker and a fuzzy detector. Huo and Ghovanloo (2009) operated a wheelchair by using tongue movements, in which the movement data were obtained from a magnetic tracer on the tongue. Although this HMI is flexible in its configuration for giving the commands, it is invasive for long-term usage since the user has to receive a tongue piercing embedded with the magnetic tracer. Palankar et al. (2008) used EEG signals to operate a wheelchair through a mounted robotic arm that was controlled by a P300 brain-computer interface (BCI). The user was able to control the motion of the arm and chair by focusing attention on a specific character on the screen; however, its response time is slow. On the other hand, Gomez-Gil et al. (2011) used the Emotiv EPOC headset, an EEG sensor, to recognise four fixed trained muscular events to steer a tractor, while Carrino et al. (2011) developed a system, namely Virtual Move, which allows users to navigate through Google Street View (GSV) using head movements, facial expressions, thoughts and emotional states detected with the Emotiv sensor.

This paper proposes an HMI for hands-free control of an EPW with two control modes, based on two types of bio-signals: head movements and facial expressions. These bio-signals are detected by using the data obtained from the gyroscope and the Cognitiv suite of the Emotiv sensor, respectively. In control mode 1, the control commands are displayed in the graphical user interface (GUI) by employing the head movements of the user, and their execution is triggered by performing just one facial expression. In control mode 2, only one head movement is used for stopping the wheelchair, while the control commands for turning and for going forward and backward are executed by performing two facial expressions. Thus, in both modes the user is free to move his/her head without affecting the control of the wheelchair. Also, the proposed HMI is flexible, allowing the user to train and record his/her desired facial expressions in order to select the most comfortable one for executing the commands. Both modes provide five control commands: going forward, turning right, turning left, stopping and going backward. To verify the reliability and safety of the two control modes, healthy subjects were recruited to control the wheelchair in an indoor environment using different facial expressions.

The rest of this paper is organised as follows. Section 2 presents an overview of the HMI, including a brief description of the employed equipment, explanations of the two control modes, the training process of the facial expressions and the evaluation protocol.
Experimental results and analysis are given in Section 3 to show the feasibility and performance of the proposed HMI in a real-world setting. Finally, a brief conclusion and potential future work are given in Section 4.

2 HMI overview

2.1 Equipment

The Emotiv EPOC headset

The Emotiv EPOC headset is used in our HMI to detect head movements and facial expressions. It is a device that measures EEG activity from 14 saline electrodes (plus CMS/DRL references at the P3/P4 locations). These electrodes are arranged according to the 10/20 system, and their locations are AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8 and AF4 (Emotiv Software Development Kit (SDK) for research), as shown in Figure 1(c). The Emotiv control panel is used to configure and demonstrate its detection suites, and has a window in which users can see the contact quality of each electrode: green indicates a good signal, yellow a fair signal, orange a poor signal, red a very poor signal and black no signal.

The Emotiv Software Development Kit for research includes an application programming interface (API) to develop applications by using three different Emotiv suites: Cognitiv, Expressiv and Affectiv. Each suite has different functions as follows:

1 The Cognitiv suite recognises 14 conscious thoughts of the user (neutral, right, left, push, pull, lift, drop, rotate left, rotate right, rotate clockwise, rotate anti-clockwise, rotate forwards, rotate reverse and disappear), but only a maximum of four actions, apart from the neutral action, can be used in each session.

2 The Expressiv suite recognises facial expressions such as blink, right wink, left wink, look right/left, raise brow, furrow brow, smile, clench, right smirk, left smirk and laugh.

3 The Affectiv suite recognises the emotional states of a user, including engagement, boredom, frustration, meditation, instantaneous excitement and long-term excitement.

The EmoEngine communicates with the Emotiv headset, receives the pre-processed EEG signals, manages user-specific or application-specific settings, performs post-processing, and translates the detection results into an easy-to-use structure called an EmoState. An EmoState is an opaque data structure that contains the current state of the Emotiv detections, which in turn reflects the facial,

emotional and cognitive state of users (EmotivSDK).

Figure 1 (a) Intelligent wheelchair setup for a subject, (b) the Emotiv EPOC headset and (c) the locations of the electrodes of the Emotiv sensor according to the 10/20 system (see online version for colours)

Besides the suites, the Emotiv EPOC sensor has a gyroscope with two axes, X and Y, through which its EmoEngine gives the position data of the user's head. The X axis provides horizontal head movement data, e.g., a positive value representing an up head movement and a negative value representing a down head movement; the Y axis gives vertical head movement data, e.g., a positive value representing a right head movement and a negative value representing a left head movement.

The intelligent wheelchair

As shown in Figure 1(a), the intelligent wheelchair in this research is equipped with an industrial PC with the following features:

- Processor: Intel(R) Pentium(R), 2.20 GHz
- Installed memory (RAM): 4.00 GB
- Operating system: Windows 7
- Installed software: Microsoft Visual Studio 2010 and the Emotiv research edition SDK

A detailed description of the hardware structure of the wheelchair system can be found in Jia et al. (2007).

2.2 Methodology

With the aim of providing freedom to the user, it is important to enable the user to move the head in a safe way when he/she is controlling the wheelchair. The two control modes deployed in our HMI are based on the integration of two types of bio-signals, namely the head movements and the facial expressions of the user. Both modes offer five control commands: going forward, turning right, turning left, stopping and going backward.

Control mode 1

In this control mode, the head movements of the user are used to display the control commands in its GUI, except for the down head movement, which is used for giving the stopping command at any time for safety reasons. One facial expression of the user is employed to execute the displayed command.

Displaying the control commands

As mentioned earlier, the head movements of the user only have the function of displaying the control command that the user wants to execute. The gyroscope of the Emotiv sensor is used to detect four head movements (up, down, right and left). As can be seen in Figure 2, each head movement is associated with a control command as follows:

- an up head movement displays the going forward control command
- a right head movement shows the turning right control command
- a left head movement displays the turning left control command
- a down head movement executes the stopping command and shows the going backward control command only when the stopping command is being executed.

The X and Y axes of the gyroscope are used to distinguish each head movement, as shown in Figure 2. Furthermore, the user has the freedom of adjusting the definition of the axes for the head movements according to his/her convenience by simply modifying the sliders provided in the GUI of the mode. It is important to remark that, in order to offer security and confidence to the user, the down head movement does not need confirmation through a facial expression when it is associated with the stopping command. Therefore, the user has the guarantee that at any time he/she can stop the wheelchair immediately.
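The mapping above lends itself to a simple threshold test on the accumulated gyroscope deltas. The sketch below is a minimal illustration of this idea, not the authors' implementation: the function and class names, the threshold values and the axis naming (d_vert, d_horiz) are assumptions standing in for the EmoEngine gyroscope query described in Section 2.1 and for the sensitivity sliders of the GUI.

```python
# A minimal, hypothetical sketch of the "displaying" step of control mode 1:
# accumulated gyroscope deltas are compared against user-adjustable thresholds
# (the role played by the sliders in the GUI) to decide which of the four head
# movements occurred.  Which gyro axis feeds which delta follows the axis
# definition of Section 2.1 / Figure 2 and is not fixed here.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Thresholds:
    """Assumed sensitivity settings; units are accumulated gyro counts."""
    up: float = 300.0
    down: float = 300.0
    right: float = 300.0
    left: float = 300.0


def classify_head_movement(d_vert: float, d_horiz: float, th: Thresholds) -> Optional[str]:
    """Map accumulated (vertical, horizontal) gyro deltas to a head movement."""
    # The safety-critical 'down' movement (stop) is tested first so that it is
    # never masked by simultaneous drift on the other axis.
    if d_vert <= -th.down:
        return "down"
    if d_vert >= th.up:
        return "up"
    if d_horiz >= th.right:
        return "right"
    if d_horiz <= -th.left:
        return "left"
    return None


# Head movement -> command shown in the GUI; everything except 'down' still
# needs confirmation by the trained facial expression before it is executed.
MOVEMENT_TO_COMMAND = {
    "up": "going forward",
    "right": "turning right",
    "left": "turning left",
    "down": "stopping",  # also reveals 'going backward' while stopped
}

if __name__ == "__main__":
    move = classify_head_movement(d_vert=450.0, d_horiz=12.0, th=Thresholds())
    print(move, "->", MOVEMENT_TO_COMMAND[move])  # up -> going forward
```

Testing the safety-critical down movement first reflects the requirement that the stopping command must always remain reachable, regardless of what the other axis reports.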
Executing the control commands

The user can issue the control command displayed in the GUI of the mode by performing a facial expression. The user only has to hold the facial expression selected in the GUI for one second in order to run the control command being displayed. After that, there is no need to keep the expression to continue running the command.

The following control commands need confirmation by means of the facial expression to be executed: going forward, turning right, turning left and going backward.

With the aim of training and detecting different facial expressions, the Cognitiv suite is employed instead of the Expressiv suite, due to its good accuracy and immediate response. With the Cognitiv suite, users are able to record their most comfortable facial expressions for executing the control commands, providing flexibility in the choice of facial expressions. Thresholds and timers were used in both modes in order to avoid the execution of false positive actions.

Figure 2 Definition of the axes for the head movements and the control commands associated with them (see online version for colours)

GUI for control mode 1

The GUI for control mode 1 is shown in Figure 3. As can be seen, our GUI offers the options of selecting, adding, removing and saving users; adjusting the sensitivities of the up, down, right and left head movements of a user; choosing the speed of the wheelchair; and selecting the facial expression previously trained to execute the control commands. Once the user has chosen his/her facial expression for executing the control commands and has accepted it, the GUI displays a ready symbol indicating that it is ready for showing and executing control commands. The HMI starts at the stopping command. Furthermore, our GUI provides visual feedback of the commands in two senses:

- the symbol of the control command that is being displayed in the GUI by the head movements of the user (top of the GUI)
- the symbol of the command that is being executed at the moment (middle of the GUI).

In addition, after the user has performed the facial expression to trigger the command, a beep sound is produced in order to emphasise and inform the user of its execution.

Figure 3 GUI of control mode 1, after selecting the facial expression for executing the control commands (see online version for colours)

Control mode 2

In control mode 2, facial expressions are used to execute the control commands for going forward and backward, as well as turning, and only one head movement is used for stopping. The user can choose between the up and down head movements for giving the stopping command. As can be seen in Figure 4, each facial expression is associated with two control commands: one facial expression executes the control commands of going forward and turning left, while the other facial expression issues the control commands of turning right and going backward. Hence, a double performance of the facial expression is required to execute the control commands of turning left and going backward. As mentioned earlier, thresholds and timers were employed on the data retrieved by the Cognitiv suite to avoid the execution of false positive actions.
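The execution logic of both modes can be summarised in a short sketch. The following is again only an illustration under stated assumptions: the one-second hold corresponds to the confirmation rule of control mode 1 described above, while the power threshold and the double-performance window are hypothetical values standing in for the thresholds and timers used to suppress false positives; the exact window used by the authors is not given in the paper.

```python
# Hypothetical sketch of the execution logic of the two control modes.
# Expression detections are assumed to arrive as (power, timestamp) samples,
# e.g. from the Cognitiv suite; the names and numeric values below are
# illustrative assumptions, not parameters reported in the paper.

from typing import Optional

POWER_THRESHOLD = 0.4   # assumed minimum action power for a valid detection
HOLD_SECONDS = 1.0      # mode 1: the expression must be held for one second
DOUBLE_WINDOW = 0.8     # mode 2: assumed window for a "double performance"


class Mode1Confirmer:
    """Confirm the command currently displayed in the GUI after a 1 s hold."""

    def __init__(self) -> None:
        self._hold_start: Optional[float] = None

    def update(self, expression_active: bool, now: float) -> bool:
        # Returns True once the expression has been held long enough; after
        # confirmation the command keeps running without holding the expression.
        if not expression_active:
            self._hold_start = None
            return False
        if self._hold_start is None:
            self._hold_start = now
            return False
        return (now - self._hold_start) >= HOLD_SECONDS


class Mode2Dispatcher:
    """Map one trained expression to two commands (single vs. double performance)."""

    def __init__(self, single_cmd: str, double_cmd: str) -> None:
        self.single_cmd = single_cmd   # e.g. 'going forward' or 'turning right'
        self.double_cmd = double_cmd   # e.g. 'turning left' or 'going backward'
        self._last_onset: Optional[float] = None

    def on_expression(self, power: float, now: float) -> Optional[str]:
        if power < POWER_THRESHOLD:    # threshold suppresses weak/false detections
            return None
        if self._last_onset is not None and (now - self._last_onset) <= DOUBLE_WINDOW:
            self._last_onset = None
            return self.double_cmd     # second performance inside the window
        self._last_onset = now
        return self.single_cmd         # first (single) performance


if __name__ == "__main__":
    left_expr = Mode2Dispatcher("going forward", "turning left")
    print(left_expr.on_expression(0.7, now=0.0))  # going forward
    print(left_expr.on_expression(0.7, now=0.5))  # turning left (double performance)
```

In this reading, the second performance of the expression upgrades the command issued by the first one, which is one plausible realisation of the flow shown in Figure 4.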

GUI for control mode 2

The GUI for control mode 2 is shown in Figure 5. As can be seen, our GUI offers the options of adjusting the sensitivities of the up and down head movements of the user; choosing the speed of the wheelchair; and selecting the previously trained facial expressions for executing the commands. As in the previous mode, once the facial expressions chosen by the user for executing the commands have been accepted, the GUI displays a ready symbol indicating that it is ready for executing commands. The HMI starts at the stopping command. In this control mode, our GUI only provides visual feedback of the control command being executed at the moment, and the beep sound informs the user of its execution.

Figure 4 A flow diagram of control mode 2 (see online version for colours)

Figure 5 GUI of control mode 2 after selecting the facial expressions for executing the control commands (see online version for colours)
Note: A head band is fixed on the top of the Emotiv EPOC to achieve a better contact quality of the electrodes.

The training process of facial expressions

Before using the control modes of our HMI, the user has to train his/her facial expressions in order to be able to execute the control commands for operating the wheelchair. As can be seen in Figure 6, when a user is training an action of the Cognitiv suite (i.e., right, left, push, pull, lift, drop), a cube presented in its GUI moves according to the training action. Each Cognitiv action has an associated facial expression, e.g., the right Cognitiv action corresponds to the right smirk.

Figure 6 A user training the right smirk facial expression by employing the Cognitiv suite (see online version for colours)

The training of the facial expressions using the Cognitiv suite is as follows:

1 Firstly, the user has to train a neutral state (normal pose) in which the user does nothing. The neutral action can be trained using two modalities: one modality lasts 8 seconds and the other lasts 30 seconds. During the training of this action, the user has to keep a normal pose, avoiding performing any facial expression, so that the action data can be recorded by the Cognitiv suite. This action needs more training than the other ones in order to avoid falsely triggered expressions. For our experiments, the subjects trained the neutral state 7 times using the 8-second modality and 3 times using the 30-second modality.

2 Then, the user has to train each facial expression, during which the pose has to be kept for 8 seconds. The user has to train the same expression several times until the action is performed easily and reliably. When the user is training a facial expression, he/she has to perform it in the same way each time; otherwise it will be difficult for the Cognitiv suite to detect the action reliably. One way to see whether the user is performing the same facial expression is to check the skill rating of the training in the GUI of the suite; it should increase each time. Each facial expression has to be added and trained one at a time. During the training of each facial expression, its sensitivity was kept at the default setting of the Cognitiv suite (medium). Each facial expression was trained for six or eight trials depending on the ability of each person.

3 Finally, if necessary, the user can adjust his/her facial expressions by increasing the sensitivity of each action associated with each expression in the Cognitiv suite, at his/her convenience. The sensitivities of the facial expressions of both users were increased by two units in the Cognitiv suite to reduce the fatigue of the user during the performance of the facial expression.

2.3 Evaluation protocol

Subjects and their experiences using the HMI

Two healthy subjects have operated the wheelchair in an indoor environment by employing the two control modes of our HMI. Both subjects are 31 years old; subject A is female and subject B is male. With regard to the experience of the subjects in using both modes, subject A was familiar with the use of both control modes due to her participation in the project, while subject B had no experience with them.

Experiment

Ten experiments were carried out by each subject in each control mode. In each experiment, the subject has to follow the route shown in Figure 7 (indicated with a blue dotted line) without hitting the obstacles and without departing from the edges. The subject can stop at any time as he/she wants. Due to his lack of experience in using the HMI, subject B did the route once with each control mode one day before doing the experiments. Both subjects first did the experiments employing control mode 1 and then the experiments employing control mode 2. Each subject did a round of five trials per day. The order of the experiments can be seen at the top of each control mode in Figure 12 and Figure 13.

Facial expressions employed in the experiments

To choose and identify the most comfortable facial expressions for controlling the wheelchair, the users employed the Cognitiv suite of Emotiv. They tested the facial expressions on the HMI, but without moving the wheelchair, to evaluate their responses and comfort. Both subjects identified five facial expressions to be used in both modes; nevertheless, due to the time availability of subject B, he only used some of them. The five facial expressions employed in the experiments involve muscles of the forehead and mouth, namely:

1 FB, which was performed by furrowing the brows without closing the eyes
2 LS, which was performed by pulling the left side of the lip to the left
3 RB, which was performed by raising the brows
4 RS, which was performed by pulling the right side of the lip to the right
5 S, which was performed by doing a big smile.

During the training of the different facial expressions, both subjects noticed that the LS and RS facial expressions could produce false detections with the S expression. Also, FB and RB could be confused with each other. For that reason, it is advisable that the user avoids using S at the same time as LS or RS in order to obtain an optimal performance of the HMI. The same applies to FB and RB.

Figure 7 The route to follow in the indoor environment (see online version for colours)

3 Experimental results and analysis

Travelling times and trajectories were recorded. Means and standard deviations of the travelling times were calculated. The position of the wheelchair was obtained by a VICON motion tracking system that tracks five markers attached to the wheelchair.

Figure 8 shows the times achieved by subject A to travel the route ten times per control mode. For control mode 1, subject A employed the facial expressions FB, LS, RS and S, while (LS, RS) and (S, FB) were used for control mode 2. In four experiments, control mode 1 using S reported the fastest times, while control mode 2 using (LS, RS) and (S, FB) achieved the fastest times in three and two experiments, respectively. On the other hand, control mode 1 using FB and RS reported the slowest times in four experiments. Finally, it can be seen that in experiment ten the slowest time was achieved by three modes [control mode 1 using RS and S, and control mode 2 using (S, FB)].

Subject B decided to use S for control mode 1, and (S, RB) for control mode 2. The times achieved by subject B repeating the route ten times per control mode are presented in Figure 9. As can be seen, the user took more time to complete the route in the first two experiments of control mode 1. In experiment 1 of this mode, the user was about to collide with the obstacles as a consequence of his lack of experience. Nevertheless, he was able to stop immediately and correct the trajectory by using the going backward command. For the rest of the experiments, the times were quite similar and the differences between the modes were not significant.

Figure 10 and Table 1 show the means and standard deviations of the travelling times for both subjects in each control mode employing different facial expressions. The smallest standard deviation of the travelling times (4.7 seconds) and the smallest mean both corresponded to control mode 1 with the facial expression S operated by subject A. On the other hand, the largest mean and standard deviation were 166 and 31.5 seconds, respectively; these values were achieved by subject B using control mode 1. Also, subject A using LS in control mode 1 and subject B using (S, RB) in control mode 2 had the same mean (152.5 seconds).

Figure 11 and Table 2 show the minimum, maximum and median travelling times of both subjects in the different control modes using several facial expressions. As can be seen, although both subjects used control mode 1 with S, their results reflect different scenarios. The largest of the maximum travelling times was 240 seconds, corresponding to control mode 1 with S operated by subject B, while the smallest of the minimum travelling times was 135 seconds, achieved by subject A using control mode 2 with (LS, RS). Control mode 1 operated by subject A using FB and control mode 2 operated by subject B using (S, RB) had the same minimum travelling time (143 seconds).

Figure 8 Time in seconds for subject A following the route ten times using the two different control modes with different facial expressions (see online version for colours)
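The summary statistics reported in Figures 10 and 11 and in Tables 1 and 2 are the usual per-configuration descriptive statistics over the ten recorded travelling times. A small sketch is shown below; the trial times in the example are invented placeholders, not the measured data.

```python
# Per-configuration descriptive statistics over the ten travelling times, as
# reported in Figures 10-11 and Tables 1-2.  The sample values are placeholders.

import statistics
from typing import Dict, List


def summarise(times_s: List[float]) -> Dict[str, float]:
    return {
        "mean": statistics.mean(times_s),
        "std": statistics.stdev(times_s),     # sample standard deviation
        "median": statistics.median(times_s),
        "min": min(times_s),
        "max": max(times_s),
    }


if __name__ == "__main__":
    placeholder_trials = [150.0, 148.0, 152.0, 160.0, 145.0,
                          155.0, 149.0, 151.0, 158.0, 147.0]
    print(summarise(placeholder_trials))
```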

Figure 9 Time in seconds for subject B following the route ten times using the two different control modes (see online version for colours)

Figure 10 Means and standard deviations of the travelling times for subjects A and B using the control modes with different facial expressions (see online version for colours)

Figure 11 Minimum, maximum and median travelling times of subjects A and B at different control modes with different facial expressions (see online version for colours)

Figure 12 Trajectories of subject A employing two control modes with different facial expressions (see online version for colours)

Also, it can be noticed that control mode 1 using LS and control mode 2 using (LS, RS), both operated by subject A, had the same maximum travelling time (168 seconds). Finally, if only the medians of the travelling times are considered, there were some similarities between them: the medians of control mode 1 using FB and using LS operated by subject A were the same (150.5 seconds); likewise, the medians of control mode 2 using (LS, RS) operated by subject A and control mode 2 using (S, RB) operated by subject B were the same (151 seconds).

The trajectories achieved by both subjects for each experiment using the different control modes with different facial expressions can be seen in Figure 12 and Figure 13. Both subjects were capable of following the predefined route using control mode 1 and control mode 2 with different facial expressions, without hitting the obstacles and without departing from the edges. As can be seen in

Figure 13, subject B was about to collide with the wall during experiment 1 of control mode 1 on his first day of experiments; he managed to avoid this by using the stopping and going backward commands. Both subjects felt confident in using both modes, although they preferred control mode 2 over control mode 1. Subject A felt more comfortable using the facial expressions (S, FB) in control mode 2, while she preferred the facial expression S in control mode 1.

Apart from the use of the stopping command once in experiment 1 of subject B in control mode 1, no other stopping commands were used. Also, no false detections of the facial expressions were reported during the experiments.

Earlier approaches for operating a wheelchair (Felzer and Freisleben, 2002; Han et al., 2003; Moon et al., 2005; Jia et al., 2007; Tsui et al., 2007; Firoozabadi et al., 2008; Gajwani and Chhabria, 2010; Tamura et al., 2010; Wei and Hu, 2010) have provided only one configuration to the user for giving the commands. In contrast, the proposed HMI is flexible, allowing the user to choose between two control modes, as well as his/her most comfortable facial expressions, for operating the wheelchair. Furthermore, our HMI is not invasive because the user only needs to wear the Emotiv EPOC and it is not necessary to break the skin (e.g., Huo and Ghovanloo, 2009).

Table 1 Standard deviations and means of the travelling times in seconds of subjects A and B using control modes 1 and 2 with different configurations

Table 2 Medians, minimum and maximum values of the travelling times in seconds for subjects A and B who used control modes 1 and 2 with different configurations

Figure 13 Trajectories of subject B employing two control modes (see online version for colours)

4 Conclusions and future work

In this paper, an HMI with two control modes has been proposed to control a wheelchair through head movements and facial expressions, providing safe mobility to disabled and elderly users who can only move their heads and facial muscles. In one control mode, head movements are used to display control commands in the GUI and one facial expression is employed to confirm the execution of the command. In the second mode, one head movement is used for stopping the wheelchair and two facial expressions are used for turning and for forward and backward motion. Five control commands are provided in our HMI: going forward, turning right, turning left, stopping and going backward. In both modes, the stopping command can be executed at any time since it does not need confirmation. Furthermore, with the aim of providing comfort to the user, the facial expressions used in both modes to execute the commands are flexible, and the user can choose the facial expressions that he/she prefers. Also, the user does not need to maintain the facial expression while the control command is running. Experimental results show that our HMI is safe and reliable.

Future work will focus on the implementation of a hybrid HMI, in which users can choose the manner of providing their control commands to the wheelchair by using different facial expressions, head movements or both. Also, more subjects will be recruited for experiments in order to optimise the proposed HMI.

Acknowledgements

This research is supported by the EU Interreg IV A 2 Mers Seas Zeeën Cross-border Cooperation Programme SYSIASS: Autonomous and Intelligent Healthcare System. More details can be found from the project website and from the COALAS project, Cognitive Assisted Living Ambient System. The COALAS project has been selected in the context of the INTERREG IVA France (Channel) England European cross-border cooperation programme, which is co-financed by the ERDF. The first author has been supported by the Mexican National Council of Science and Technology (CONACYT) through the programme Becas para Estudios de Posgrado en el Extranjero. Our thanks also go to Robin Dowling for his technical support.

References

Bergasa, L.M., Mazo, M., Gardel, A., Barea, R. and Boquete, L. (2000) 'Commands generation by face movements applied to the guidance of a wheelchair for handicapped people', in Proceedings of the 15th International Conference on Pattern Recognition, Barcelona, Spain, 3-7 September, Vol. 4.

Carrino, F., Tscherrig, J., Mugellini, E., Abou Khaled, O. and Ingold, R. (2011) 'Head-computer interface: a multimodal approach to navigate through real and virtual worlds', in Proceedings of the 14th International Conference on Human-Computer Interaction, Part II: Interaction Techniques and Environments, Orlando, FL, USA, 9-14 July, Vol. 6762.

Emotiv Software Development Kit (SDK) for research [online] (accessed 5 September 2012).

EmotivSDK, User Manual for Release [online] (accessed 5 September 2012).

Felzer, T. and Freisleben, B. (2002) 'HaWCoS: the hands-free wheelchair control system', in Proceedings of the Fifth International ACM Conference on Assistive Technologies, July.

Firoozabadi, S.M.P., Oskoei, M.A. and Hu, H. (2008) 'A human-computer interface based on forehead multi-channel bio-signals to control a virtual wheelchair', in Proceedings of the 14th Iranian Conference on Biomedical Engineering (ICBME).

Gajwani, P.S. and Chhabria, S.A.
(2010) 'Eye motion tracking for wheelchair control', International Journal of Information Technology and Knowledge Management, Vol. 2, No. 2.

Gomez-Gil, J., San-Jose-Gonzalez, I., Nicolas-Alonso, L.F. and Alonso-Garcia, S. (2011) 'Steering a tractor by means of an EMG-based human machine interface', Sensors, Vol. 11, No. 7.

Han, J.S., Zenn Bien, Z., Kim, D.J., Lee, H.E. and Kim, J.S. (2003) 'Human machine interface for wheelchair control with EMG and its evaluation', in Proceedings of the 25th Annual IEEE International Conference on Engineering in Medicine and Biology Society, Cancún, México, September, Vol. 2.

Huo, X. and Ghovanloo, M. (2009) 'Using unconstrained tongue motion as an alternative control mechanism for wheeled mobility', IEEE Transactions on Biomedical Engineering, Vol. 56, No. 6.

Jia, P., Hu, H., Lu, T. and Yuan, K. (2007) 'Head gesture recognition for hands-free control of an intelligent wheelchair', Industrial Robot: An International Journal, Vol. 34, No. 1.

Moon, I., Lee, M., Chu, J. and Mun, M. (2005) 'Wearable EMG-based HCI for electric-powered wheelchair users with motor disabilities', in Proceedings of the IEEE International Conference on Robotics and Automation, Barcelona, Spain, April.

Palankar, M., De Laurentis, K.J., Alqasemi, R., Veras, E., Dubey, R., Arbel, Y. and Donchin, E. (2008) 'Control of a 9-DoF wheelchair-mounted robotic arm system using a P300 brain computer interface: initial experiments', in IEEE International Conference on Robotics and Biomimetics, Bangkok, Thailand, February.

Tamura, H., Manabe, T., Goto, T., Yamashita, Y. and Tanno, K. (2010) 'A study of the electric wheelchair hands-free safety control system using the surface electromyogram of facial muscles', in Proceedings of the Third International Conference on Intelligent Robotics and Applications, Part II, Springer, Shanghai, China, November, Vol. 6425.

Tsui, C.S.L., Jia, P., Gan, J.Q., Hu, H. and Yuan, K. (2007) 'EMG-based hands-free wheelchair control with EOG attention shift detection', in IEEE International Conference on Robotics and Biomimetics, Sanya, China, December.

Wei, L. and Hu, H. (2010) 'EMG and visual based HMI for hands-free control of an intelligent wheelchair', in 8th World Congress on Intelligent Control and Automation (WCICA), Jinan, China, 7-9 July.


More information

Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces

Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Markerless 3D Gesture-based Interaction for Handheld Augmented Reality Interfaces Huidong Bai The HIT Lab NZ, University of Canterbury, Christchurch, 8041 New Zealand huidong.bai@pg.canterbury.ac.nz Lei

More information

HUMAN-COMPUTER INTERACTION: OVERVIEW ON STATE OF THE ART TECHNOLOGY

HUMAN-COMPUTER INTERACTION: OVERVIEW ON STATE OF THE ART TECHNOLOGY HUMAN-COMPUTER INTERACTION: OVERVIEW ON STATE OF THE ART TECHNOLOGY *Ms. S. VAISHNAVI, Assistant Professor, Sri Krishna Arts And Science College, Coimbatore. TN INDIA **SWETHASRI. L., Final Year B.Com

More information

Wirelessly Controlled Wheeled Robotic Arm

Wirelessly Controlled Wheeled Robotic Arm Wirelessly Controlled Wheeled Robotic Arm Muhammmad Tufail 1, Mian Muhammad Kamal 2, Muhammad Jawad 3 1 Department of Electrical Engineering City University of science and Information Technology Peshawar

More information

Lab Design of FANUC Robot Operation for Engineering Technology Major Students

Lab Design of FANUC Robot Operation for Engineering Technology Major Students Paper ID #21185 Lab Design of FANUC Robot Operation for Engineering Technology Major Students Dr. Maged Mikhail, Purdue University Northwest Dr. Maged B.Mikhail, Assistant Professor, Mechatronics Engineering

More information

CONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM

CONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM CONTROLLING METHODS AND CHALLENGES OF ROBOTIC ARM Aniket D. Kulkarni *1, Dr.Sayyad Ajij D. *2 *1(Student of E&C Department, MIT Aurangabad, India) *2(HOD of E&C department, MIT Aurangabad, India) aniket2212@gmail.com*1,

More information

Analysis of brain waves according to their frequency

Analysis of brain waves according to their frequency Analysis of brain waves according to their frequency Z. Koudelková, M. Strmiska, R. Jašek Abstract The primary purpose of this article is to show and analyse the brain waves, which are activated during

More information

Emoto-bot Demonstration Control System

Emoto-bot Demonstration Control System Emoto-bot Demonstration Control System I am building a demonstration control system for VEX robotics that creates a human-machine interface for an assistive or companion robotic device. My control system

More information

KMUTT Kickers: Team Description Paper

KMUTT Kickers: Team Description Paper KMUTT Kickers: Team Description Paper Thavida Maneewarn, Xye, Korawit Kawinkhrue, Amnart Butsongka, Nattapong Kaewlek King Mongkut s University of Technology Thonburi, Institute of Field Robotics (FIBO)

More information

Fabrication of the kinect remote-controlled cars and planning of the motion interaction courses

Fabrication of the kinect remote-controlled cars and planning of the motion interaction courses Available online at www.sciencedirect.com ScienceDirect Procedia - Social and Behavioral Sciences 174 ( 2015 ) 3102 3107 INTE 2014 Fabrication of the kinect remote-controlled cars and planning of the motion

More information

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor

Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Haptic Camera Manipulation: Extending the Camera In Hand Metaphor Joan De Boeck, Karin Coninx Expertise Center for Digital Media Limburgs Universitair Centrum Wetenschapspark 2, B-3590 Diepenbeek, Belgium

More information

Student Attendance Monitoring System Via Face Detection and Recognition System

Student Attendance Monitoring System Via Face Detection and Recognition System IJSTE - International Journal of Science Technology & Engineering Volume 2 Issue 11 May 2016 ISSN (online): 2349-784X Student Attendance Monitoring System Via Face Detection and Recognition System Pinal

More information

Non-Invasive Brain-Actuated Control of a Mobile Robot

Non-Invasive Brain-Actuated Control of a Mobile Robot Non-Invasive Brain-Actuated Control of a Mobile Robot Jose del R. Millan, Frederic Renkens, Josep Mourino, Wulfram Gerstner 5/3/06 Josh Storz CSE 599E BCI Introduction (paper perspective) BCIs BCI = Brain

More information

Multi-robot Formation Control Based on Leader-follower Method

Multi-robot Formation Control Based on Leader-follower Method Journal of Computers Vol. 29 No. 2, 2018, pp. 233-240 doi:10.3966/199115992018042902022 Multi-robot Formation Control Based on Leader-follower Method Xibao Wu 1*, Wenbai Chen 1, Fangfang Ji 1, Jixing Ye

More information

Robust Hand Gesture Recognition for Robotic Hand Control

Robust Hand Gesture Recognition for Robotic Hand Control Robust Hand Gesture Recognition for Robotic Hand Control Ankit Chaudhary Robust Hand Gesture Recognition for Robotic Hand Control 123 Ankit Chaudhary Department of Computer Science Northwest Missouri State

More information

Review on Eye Movement Controlled Wheelchair

Review on Eye Movement Controlled Wheelchair Review on Eye Movement Controlled Wheelchair 1 Roshani Ninama, 2 Rutu Nayak 1 ME Scholar, 2 Assistant Professor, Biomedical Engineering Department, G. E. C. Gandhinagar, Gandhinagar, India 1 Roshani.ninama@gmail.com,

More information

REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL

REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL World Automation Congress 2010 TSI Press. REBO: A LIFE-LIKE UNIVERSAL REMOTE CONTROL SEIJI YAMADA *1 AND KAZUKI KOBAYASHI *2 *1 National Institute of Informatics / The Graduate University for Advanced

More information

Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with Disabilities

Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with Disabilities The 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems October 18-22, 2010, Taipei, Taiwan Laser-Assisted Telerobotic Control for Enhancing Manipulation Capabilities of Persons with

More information

Hands-free Operation of a Small Mobile Robot*

Hands-free Operation of a Small Mobile Robot* c,, Hands-free Operation of a Small Mobile Robot* Wendy A Amai, Jill C Fahrenholtz, and Chris L Leger Sandia National Laboratories PO 5800, MS 1125 Albuquerque, NM 87185 (2 23Q 3 pm ~~~ +2!J< o~~ a t Abstract

More information

VIRTUAL ASSISTIVE ROBOTS FOR PLAY, LEARNING, AND COGNITIVE DEVELOPMENT

VIRTUAL ASSISTIVE ROBOTS FOR PLAY, LEARNING, AND COGNITIVE DEVELOPMENT 3-59 Corbett Hall University of Alberta Edmonton, AB T6G 2G4 Ph: (780) 492-5422 Fx: (780) 492-1696 Email: atlab@ualberta.ca VIRTUAL ASSISTIVE ROBOTS FOR PLAY, LEARNING, AND COGNITIVE DEVELOPMENT Mengliao

More information

Implementation of Mind Control Robot

Implementation of Mind Control Robot Implementation of Mind Control Robot Adeel Butt and Milutin Stanaćević Department of Electrical and Computer Engineering Stony Brook University Stony Brook, New York, USA adeel.butt@stonybrook.edu, milutin.stanacevic@stonybrook.edu

More information

Humera Syed 1, M. S. Khatib 2 1,2

Humera Syed 1, M. S. Khatib 2 1,2 A Hand Gesture Recognition Approach towards Shoulder Wearable Computing Humera Syed 1, M. S. Khatib 2 1,2 CSE, A.C.E.T/ R.T.M.N.U, India ABSTRACT: Human Computer Interaction needs computer systems and

More information

INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY

INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY INTELLIGENT GUIDANCE IN A VIRTUAL UNIVERSITY T. Panayiotopoulos,, N. Zacharis, S. Vosinakis Department of Computer Science, University of Piraeus, 80 Karaoli & Dimitriou str. 18534 Piraeus, Greece themisp@unipi.gr,

More information

Design of Joint Controller Circuit for PA10 Robot Arm

Design of Joint Controller Circuit for PA10 Robot Arm Design of Joint Controller Circuit for PA10 Robot Arm Sereiratha Phal and Manop Wongsaisuwan Department of Electrical Engineering, Faculty of Engineering, Chulalongkorn University, Bangkok, 10330, Thailand.

More information

Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS

Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS Gesture Identification Using Sensors Future of Interaction with Smart Phones Mr. Pratik Parmar 1 1 Department of Computer engineering, CTIDS Abstract Over the years from entertainment to gaming market,

More information

Fingers Bending Motion Controlled Electrical. Wheelchair by Using Flexible Bending Sensors. with Kalman filter Algorithm

Fingers Bending Motion Controlled Electrical. Wheelchair by Using Flexible Bending Sensors. with Kalman filter Algorithm Contemporary Engineering Sciences, Vol. 7, 2014, no. 13, 637-647 HIKARI Ltd, www.m-hikari.com http://dx.doi.org/10.12988/ces.2014.4670 Fingers Bending Motion Controlled Electrical Wheelchair by Using Flexible

More information

The Virtual Reality Brain-Computer Interface System for Ubiquitous Home Control

The Virtual Reality Brain-Computer Interface System for Ubiquitous Home Control The Virtual Reality Brain-Computer Interface System for Ubiquitous Home Control Hyun-sang Cho, Jayoung Goo, Dongjun Suh, Kyoung Shin Park, and Minsoo Hahn Digital Media Laboratory, Information and Communications

More information

Toward an Augmented Reality System for Violin Learning Support

Toward an Augmented Reality System for Violin Learning Support Toward an Augmented Reality System for Violin Learning Support Hiroyuki Shiino, François de Sorbier, and Hideo Saito Graduate School of Science and Technology, Keio University, Yokohama, Japan {shiino,fdesorbi,saito}@hvrl.ics.keio.ac.jp

More information

Advancements in Gesture Recognition Technology

Advancements in Gesture Recognition Technology IOSR Journal of VLSI and Signal Processing (IOSR-JVSP) Volume 4, Issue 4, Ver. I (Jul-Aug. 2014), PP 01-07 e-issn: 2319 4200, p-issn No. : 2319 4197 Advancements in Gesture Recognition Technology 1 Poluka

More information

Sensing and Processing of EOG Signals to Control Human Machine Interface System

Sensing and Processing of EOG Signals to Control Human Machine Interface System Sensing and Processing of EOG Signals to Control Human Machine Interface System C.Kavitha 1, G. Nagappan 2 1 Department of Computer Science and Engineering/ Saveetha Engineering College/ PG Scholar, Chennai,

More information

Head-Movement Evaluation for First-Person Games

Head-Movement Evaluation for First-Person Games Head-Movement Evaluation for First-Person Games Paulo G. de Barros Computer Science Department Worcester Polytechnic Institute 100 Institute Road. Worcester, MA 01609 USA pgb@wpi.edu Robert W. Lindeman

More information

DESIGN AND DEVELOPMENT OF LIBRARY ASSISTANT ROBOT

DESIGN AND DEVELOPMENT OF LIBRARY ASSISTANT ROBOT DESIGN AND DEVELOPMENT OF LIBRARY ASSISTANT ROBOT Ranjani.R, M.Nandhini, G.Madhumitha Assistant Professor,Department of Mechatronics, SRM University,Kattankulathur,Chennai. ABSTRACT Library robot is an

More information

Assessments of Grade Crossing Warning and Signalization Devices Driving Simulator Study

Assessments of Grade Crossing Warning and Signalization Devices Driving Simulator Study Assessments of Grade Crossing Warning and Signalization Devices Driving Simulator Study Petr Bouchner, Stanislav Novotný, Roman Piekník, Ondřej Sýkora Abstract Behavior of road users on railway crossings

More information

The Study of Methodologies for Identifying the Drowsiness in Smart Traffic System: A Survey Mariya 1 Mrs. Sumana K R 2

The Study of Methodologies for Identifying the Drowsiness in Smart Traffic System: A Survey Mariya 1 Mrs. Sumana K R 2 IJSRD - International Journal for Scientific Research & Development Vol. 3, Issue 02, 2015 ISSN (online): 2321-0613 The Study of Methodologies for Identifying the Drowsiness in Smart Traffic System: A

More information

Near Infrared Face Image Quality Assessment System of Video Sequences

Near Infrared Face Image Quality Assessment System of Video Sequences 2011 Sixth International Conference on Image and Graphics Near Infrared Face Image Quality Assessment System of Video Sequences Jianfeng Long College of Electrical and Information Engineering Hunan University

More information

An Evaluation of Automatic License Plate Recognition Vikas Kotagyale, Prof.S.D.Joshi

An Evaluation of Automatic License Plate Recognition Vikas Kotagyale, Prof.S.D.Joshi An Evaluation of Automatic License Plate Recognition Vikas Kotagyale, Prof.S.D.Joshi Department of E&TC Engineering,PVPIT,Bavdhan,Pune ABSTRACT: In the last decades vehicle license plate recognition systems

More information