Brain Computer Interface Cursor Measures for Motion-impaired and Able-bodied Users

Alexandros Pino, Eleftherios Kalogeros, Elias Salemis and Georgios Kouroupetroglou
Department of Informatics and Telecommunications, National and Kapodistrian University of Athens
Panepistimioupolis, Ilissia, GR-15784, Athens, Greece
{pino, e.kalogeros, i.salemis, koupe}@di.uoa.gr

Abstract

This paper presents the results of experimental studies that aim to measure the effectiveness of a Brain Computer Interface (BCI) against a mouse on point and click tasks performed by able-bodied and upper-limb motion-impaired users. Our methodology is based on the ISO 9241-9 guidelines. We examine how well Fitts' law fits the tested input devices, and we use gross and detailed trajectory measures in order to quantify cursor movement and evaluate performance. We conclude that Fitts' law can only describe able-bodied users' performance when selecting targets with the mouse. On the other hand, the performance of both user groups with the BCI, and of motion-impaired users with the mouse, does not conform to Fitts' law. Tables and charts of results are given, showing that the BCI cannot currently compete with the mouse in terms of usability, but can be used as an alternative to motion-actuated devices when no other solution is possible.

1 Introduction

There are individuals who, because of the severity of their physical limitations, have been unable to access a computer through either direct selection or alternative interaction methods, such as combinations of scanning techniques and switches. There are also circumstances in which able-bodied users cannot use their hands during human-computer interaction. For those cases the emerging Brain Computer Interface (BCI) technology could be a candidate alternative (Barreto, Scargle & Adjouadi, 2000), (Bayliss & Auernheimer, 2001), (Ming, Dingfeng, Xiaorong & Shangkai, 2001).

We conducted a series of methodological experimental studies on the performance of able-bodied and motion-impaired users, using a Logitech Cordless Wheel Mouse and a Brain Actuated Technologies Cyberlink Brainfingers system. The subjects were four able-bodied and four disabled users. Two different experiments were set up: one to examine the application of Fitts' law in one-direction point and click tasks (MacKenzie, Sellen & Buxton, 1991), and one to extract detailed trajectory and target selection measures in multi-directional tasks (Oh & Stuerzlinger, 2002). The design of the tests was based on the guidelines provided by ISO 9241-9, Ergonomic requirements for office work with visual display terminals (VDTs) - Part 9: Requirements for non-keyboard input devices (ISO/TC 159 & CMC, 2000), (MacKenzie, 2001). Proprietary software was developed in Microsoft Visual Basic to provide the applicable user interface (UI) and to acquire, store and analyze cursor movement data; Microsoft Excel was also used for data processing. In all point and click experiments users were instructed not to stop on erroneous clicks; audio feedback was given in that case and the trial was not interrupted. Visual and audio feedback was also given for successful clicks. Each task was explained and demonstrated to the participants, and a warm-up block of trials was given. A 1 Hz sampling rate was used for trajectory and click data acquisition (x, y coordinates, millisecond precision time).
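As a rough illustration of this kind of data acquisition, the sketch below logs time-stamped cursor samples and click events to a CSV file. It is a minimal Python example rather than the Visual Basic software actually used in the study, and the names (CursorSample, TrialLogger) are ours, not the authors'.

```python
import csv
import time
from dataclasses import dataclass
from typing import List

@dataclass
class CursorSample:
    t_ms: int    # milliseconds since the trial started
    x: int       # cursor x coordinate in pixels
    y: int       # cursor y coordinate in pixels
    click: bool  # True if a click event occurred at this sample

class TrialLogger:
    """Accumulates cursor samples for one trial and writes them to CSV."""
    def __init__(self) -> None:
        self.samples: List[CursorSample] = []
        self._t0 = time.monotonic()

    def record(self, x: int, y: int, click: bool = False) -> None:
        t_ms = int((time.monotonic() - self._t0) * 1000)
        self.samples.append(CursorSample(t_ms, x, y, click))

    def save(self, path: str) -> None:
        with open(path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["t_ms", "x", "y", "click"])
            for s in self.samples:
                writer.writerow([s.t_ms, s.x, s.y, int(s.click)])

# Example: log three (fabricated) samples and save them.
log = TrialLogger()
log.record(100, 200)
log.record(105, 203)
log.record(110, 205, click=True)
log.save("trial01.csv")
```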

2 Experiments

In the one-direction test (first test), two rectangular targets of the same width W, separated by a distance D, appear on the screen. The task is to point at and click each rectangle 10 times per block of trials, performing back and forth movements between the two targets. Each trial block commences with the cursor locked on the left rectangle; the user must click to unlock the cursor and let the data acquisition begin, and this initial click is not taken into consideration in the measures. The rectangle that must be clicked next is always highlighted. On every successful selection the cursor moves to the centre of the selected target, so that the user can continue uninterrupted and D remains the same at all times. Nine test blocks with different Fitts indexes of difficulty (i.e. the 3 D x 3 W combinations), varying from 1.24 to 4 bits, are run, yielding a total of 180 trials for each user and each device (20 trials per block x [3 W x 3 D] blocks). The applied distances were 150, 300 and 450 pixels, and the widths 30, 70 and 110 pixels. We used Equation 1 to examine the application of Fitts' law to our data and to calculate the indexes of difficulty (ID); this variation of the equation has proven the most appropriate for our purpose and is the most widely used (Accot & Zhai, 1997), (MacKenzie, 1991). According to Fitts' law, movement time (MT) must be linearly related to ID.

Equation 1: MT = a + b * ID, where ID = log2(D/W + 1)
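To make Equation 1 concrete, the following sketch (ours, not the authors' analysis code) computes the index of difficulty for the nine D/W combinations listed above and fits the linear model MT = a + b * ID by ordinary least squares, returning the intercept, slope and R² reported in Section 3.

```python
import math
from statistics import mean

def index_of_difficulty(d: float, w: float) -> float:
    """Shannon formulation used in the paper: ID = log2(D/W + 1)."""
    return math.log2(d / w + 1)

# The nine D/W combinations of the one-direction test (pixels).
distances = [150, 300, 450]
widths = [30, 70, 110]
ids = sorted(index_of_difficulty(d, w) for d in distances for w in widths)
print([round(i, 2) for i in ids])  # ranges from 1.24 to 4.0 bits

def fit_fitts(id_values, mt_values):
    """Least-squares fit of MT = a + b*ID; returns (a, b, r_squared)."""
    mx, my = mean(id_values), mean(mt_values)
    sxx = sum((x - mx) ** 2 for x in id_values)
    sxy = sum((x - mx) * (y - my) for x, y in zip(id_values, mt_values))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(id_values, mt_values))
    ss_tot = sum((y - my) ** 2 for y in mt_values)
    return a, b, 1 - ss_res / ss_tot
```
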
In the multi-directional test (second test), 16 square targets are arranged in an equidistant circular layout. The task begins with a click on the topmost target; the subject must then move the cursor directly to the opposite target and click on it, and so on clockwise round the circle (a sketch of this layout and selection order follows below). Every time a target is selected, the cursor moves automatically to its centre and the next target is highlighted. Each trial block is completed when all targets have been selected (17 trials, as the topmost target is also the end point), and 9 blocks are run for the combinations of 3 radii (D/2: 160, 230 and 300 pixels) and 3 target widths (W: 30, 40 and 50 pixels) for each user and each device. The derived indexes of difficulty varied from 2.89 to 4.39 bits, but only cursor measures taken with ID = 3.64 bits (corresponding to D = 460 and W = 40) are presented in this paper.
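The sketch below shows how such a circular layout and selection sequence can be generated; it follows the description above, but the screen-coordinate conventions and function names are our assumptions rather than details taken from the paper.

```python
import math

def circular_targets(cx: float, cy: float, radius: float, n: int = 16):
    """Centres of n targets evenly spaced on a circle, target 0 at the top."""
    centers = []
    for k in range(n):
        angle = -math.pi / 2 + 2 * math.pi * k / n  # start at 12 o'clock
        centers.append((cx + radius * math.cos(angle),
                        cy + radius * math.sin(angle)))
    return centers

def selection_order(n: int = 16):
    """ISO 9241-9 style order: 0, n/2, 1, n/2+1, ..., then back to target 0."""
    half = n // 2
    order = []
    for k in range(half):
        order.append(k)          # move to the next target clockwise from the top
        order.append(k + half)   # then jump to the (nearly) opposite target
    order.append(0)              # the block ends back on the topmost target (17th click)
    return order

targets = circular_targets(cx=512, cy=384, radius=300, n=16)
print([round(c) for c in targets[0]])  # centre of the topmost target
print(selection_order())               # [0, 8, 1, 9, 2, 10, ...] ending back at 0
```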

3 Results

All our motion-impaired subjects were quadriplegic, with severe disabilities in their upper limbs, and two of them could not finish the tests with either the mouse or the BCI, for reasons such as lack of interest, mental problems, inability to use motor skills, and spasm (Langdon, Keates, Clarkson & Robinson, 2001). Those users' data were excluded. Four able-bodied and two motion-impaired users completed all the tests successfully with both devices. The operation of the BCI device is based on EEG, EMG and EOG signals acquired by three electrodes mounted on a headband (Penny & Roberts, 1999). Nevertheless, mainly EMG and EOG signals were used to control the cursor, because we could not manipulate the EEG signals (after a two-month period of trying). Facial muscles and eye movements controlled the cursor in the following manner: clenching the teeth resulted in a click; a rapid eye movement or blink moved the cursor to the left; an unwavering stare moved the cursor to the right; putting tension on the forehead moved the cursor upwards; and relaxing the forehead muscles moved the cursor downwards.

At the end of the experiment, subjects were interviewed and asked to complete a questionnaire, so as to evaluate perceived performance and comfort from the participants' perspective. Motion-impaired users were tired from both devices, while able-bodied users were only tired from using the BCI. No one thought that the BCI was an easy-to-use device, and everybody said that its use was not very pleasant and that its accuracy and speed were quite low.

Our first goal was to examine how Fitts' law applies to mouse and BCI cursor movements, and to compare our measurements and results between subjects, devices, task primitives and data found in the literature (Gillan, Holden, Adam, Rudisill & Magee, 1990), (Oel, Schmidt & Schmidt, 2001), (Douglas, Kirkpatrick & MacKenzie, 1999).

[Figure 1: Scatter plots of the Movement Time (MT) - Index of Difficulty (ID) relationship for the mouse and the BCI, with linear fits for able-bodied and motion-impaired users.]

Figure 1 summarizes the results of the one-direction tapping test and illustrates the application of Fitts' law. In the first graph (mouse) the linearity between MT and ID is quite obvious for able-bodied users, the fitting line giving R² = .934. On the other hand, the estimated values of the trend line do not correspond so closely to the actual data for motion-impaired users, giving R² = .094, which is very low; Fitts' law cannot describe the rough or spastic movements of motion-impaired users' hands. As far as the BCI is concerned, the second graph of Figure 1 shows that Fitts' law fits neither of the two user groups, giving R² = .681 for able-bodied and R² = .362 for motion-impaired users.

[Figure 2: Average Movement Time (MT) by trial number for each user group (able-bodied, motion-impaired), shown separately for the mouse and the BCI.]

Figure 2 illustrates the learning effect by user group and by trial in the one-direction test. It must be noted that biofeedback was an important factor in the learning process. Furthermore, three of the four able-bodied users had previously been trained in BCI use, while the motion-impaired users had their first contact with the device just before the experiments and carried out a learning and warm-up procedure (2 hours). All users had experience in using mice or trackballs. In other studies these graphs are presented with the block number on the x-axis; our data are presented as movement times averaged over blocks of different index of difficulty, in relation to the trial number (each block consists of 20 trials), showing the learning effect within an average block. We see a considerable performance improvement for both user groups and devices, which is much larger for the BCI than for the mouse, and quite impressive in the case of motion-impaired users using the BCI, showing their big potential to learn to use the device much better and surpass able-bodied users' performance (something that cannot happen for the mouse). The fitting line slopes were: (able-bodied, mouse) = -13.4, (motion-impaired, mouse) = -3.6, (able-bodied, BCI) = -431.4 and (motion-impaired, BCI) = -2,076.

Our second objective was to quantify cursor movement effectiveness and obtain results comparable with other studies. We used a number of cursor control measures, which can be grouped into two sets: gross measures for performance evaluation, namely Movement Time (MT), Effective Target Width (We), Throughput (TP = IDe/MT) and Missed Clicks (MCL); and detailed trajectory measures, namely Target Re-entry (TRE), Task Axis Crossing (TAC), Movement Direction Change (MDC), Movement Variability (MV), Movement Error (ME) and Movement Offset (MO) (MacKenzie, Kauppinen & Silfverberg, 2001), (Keates, Hwang, Langdon, Clarkson & Robinson, 2002).
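As an illustration of how some of these measures can be computed from a sampled cursor path, the sketch below derives ME, MO, MV and TAC from the signed perpendicular deviation of each sample from the task axis, and computes throughput from the effective width. It is based on the published definitions (MacKenzie, Kauppinen & Silfverberg, 2001), not on the authors' Visual Basic implementation, and the function names are ours.

```python
import math
from statistics import mean, stdev

def perpendicular_deviations(path, start, target):
    """Signed distance of each cursor sample from the task axis (the straight
    line from the starting point to the target centre)."""
    (x0, y0), (x1, y1) = start, target
    length = math.hypot(x1 - x0, y1 - y0)
    devs = []
    for x, y in path:
        # The 2D cross product gives the signed perpendicular distance.
        devs.append(((x1 - x0) * (y - y0) - (y1 - y0) * (x - x0)) / length)
    return devs

def trajectory_measures(path, start, target):
    """ME, MO, MV and TAC following MacKenzie, Kauppinen & Silfverberg (2001)."""
    d = perpendicular_deviations(path, start, target)
    me = mean(abs(v) for v in d)                         # Movement Error
    mo = mean(d)                                         # Movement Offset
    mv = stdev(d) if len(d) > 1 else 0.0                 # Movement Variability
    tac = sum(1 for a, b in zip(d, d[1:]) if a * b < 0)  # Task Axis Crossings
    return me, mo, mv, tac

def throughput(distance, selection_offsets, movement_time_s):
    """TP = IDe / MT, with We = 4.133 * SD of the selection end points
    projected on the task axis (offsets collected over a series of trials)
    and IDe = log2(D / We + 1)."""
    we = 4.133 * stdev(selection_offsets)
    ide = math.log2(distance / we + 1)
    return ide / movement_time_s
```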

The results of the multi-directional tasks are summarized in Table 1. The data for the two user groups using the mouse are comparable to the numbers reported in other studies (see Keates et al., 2002) and clearly show the difficulties that motion-impaired users have. For the BCI the results seem rather disappointing, but they were anticipated; with further user training and experience on the device we expect these numbers to improve in future studies.

Table 1: Means and standard deviations of the measures for the two user groups for each device

Mouse
Cursor Measure | Able-bodied users | Motion-impaired users
MCL            | .3 (.6)           | .17 (.25)
TRE            | .9 (.11)          | .9 (.12)
TAC            | 1.47 (.21)        | 1.85 (.21)
MDC            | 19.47 (2.54)      | 65.47 (12.72)
ME             | 14.86 (2.7)       | 24.27 (11.83)
MO             | -.98 (4.68)       | -16.82 (15.41)
MV             | 21.58 (4.23)      | 44.8 (17.31)
MT             | .762 (.45)        | 4.18 (.77)
We             | 22.74 (3.42)      | 19.75 (1.45)
TP             | 5.81 (.57)        | 1.12 (.18)

BCI
Cursor Measure | Able-bodied users | Motion-impaired users
MCL            | .75 (.79)         | 3.5 (2.25)
TRE            | .39 (.15)         | 1.4 (.66)
TAC            | 5.87 (.41)        | 14.63 (1.28)
MDC            | 422.38 (44.77)    | 1,161.44 (995.89)
ME             | 13.55 (16.87)     | 21.82 (112.36)
MO             | 58.82 (8.14)      | 122.85 (73.31)
MV             | 167.15 (22.32)    | 263.27 (15.8)
MT             | 25.564 (.71)      | 14.69 (16.65)
We             | 23.16 (1.98)      | 3.53 (3.19)
TP             | .182 (.7)         | .81 (.84)

4 Conclusions

For BCI devices there are no data in the literature to compare against, and the results we acquired indicate that the BCI does not yet seem competitive with hand-actuated devices; that remains a distant prospect. On the other hand, the results show that motion-impaired users do have an alternative solution for interacting with a computer, even if they have no remaining ability to move or control any part of their body except the facial muscles. Furthermore, we were able to establish conclusions about how increasing user familiarization and training affect the measures, which is especially interesting in the BCI case.

References

Accot, J., & Zhai, S. (1997). Beyond Fitts' law: Models for trajectory-based HCI tasks. Proceedings of the ACM Conference on Human Factors in Computing Systems - CHI '97, 295-302. New York: ACM.

Barreto, A. B., Scargle, S. D., & Adjouadi, M. (2000). A practical EMG-based human-computer interface for users with motor disabilities. Journal of Rehabilitation Research and Development, 37(1). Retrieved December 2, 2002 from http://www.vard.org/jour//37/1/barre371.htm

Bayliss, J. D., & Auernheimer, B. (2001). Using a brain-computer interface in virtual and real worlds. Proceedings of HCI International 2001 - International Conference on Human-Computer Interaction, 312-316. Mahwah, NJ: Lawrence Erlbaum Associates.

Douglas, S. A., Kirkpatrick, A. E., & MacKenzie, I. S. (1999). Testing pointing device performance and user assessment with the ISO 9241, Part 9 standard. Proceedings of the ACM Conference on Human Factors in Computing Systems - CHI '99, 215-222. New York: ACM.

Gillan, D. J., Holden, K., Adam, S., Rudisill, M., & Magee, L. (1990). How does Fitts' law fit pointing and dragging? Proceedings of the ACM Conference on Human Factors in Computing Systems - CHI '90, 227-234. New York: ACM.

ISO/TC 159 & CMC (2000). ISO 9241-9:2000(E), Ergonomic requirements for office work with visual display terminals (VDTs) - Part 9: Requirements for non-keyboard input devices.

Keates, S., Hwang, F., Langdon, P., Clarkson, P. J., & Robinson, P. (2002). Cursor measures for motion-impaired computer users. Proceedings of the ACM SIGCAPH Conference on Assistive Technologies - ASSETS 2002, 135-142. New York: ACM.

Langdon, P., Keates, S., Clarkson, J., & Robinson, P. (2001). Investigating the cursor movement parameters for haptic assistance of motion-impaired users. Integration of Assistive Technology in the Information Age, 9, 237-242.

MacKenzie, I. S. (1991). Fitts' law as a performance model in human-computer interaction. Doctoral dissertation. Toronto, Ontario, Canada: University of Toronto. Retrieved November 4, 2002 from York University Web site: http://www.yorku.ca/mack/phd.html

MacKenzie, I. S. (2001). A note on ISO testing of computer pointing devices. Retrieved November 11, 2002, from York University Web site: http://www.yorku.ca/mack/rn-iso.html

MacKenzie, I. S., Kauppinen, T., & Silfverberg, M. (2001). Accuracy measures for evaluating computer pointing devices. Proceedings of the ACM Conference on Human Factors in Computing Systems - CHI 2001, 3(1), 9-19. New York: ACM.

MacKenzie, I. S., Sellen, A., & Buxton, W. (1991). A comparison of input devices in elemental pointing and dragging tasks. Proceedings of the ACM Conference on Human Factors in Computing Systems - CHI '91, 161-166. New York: ACM.

Ming, C., Dingfeng, X., Xiaorong, G., & Shangkai, G. (2001). Brain-computer interface with high transfer rates. Proceedings of the 8th International Conference on Neural Information Processing - ICONIP 2001. Retrieved December 15, 2002, from conference Web site: http://www.cse.cuhk.edu.hk/~apnna/proceedings/iconip21/papers/114a.pdf

Oel, P., Schmidt, P., & Schmidt, A. (2001). Time prediction of mouse-based cursor movements. Proceedings of the Joint AFIHM-BCS Conference on Human-Computer Interaction IHM-HCI 2001, II, 37-40. Toulouse, France: Cépaduès-Éditions.

Oh, J.-Y., & Stuerzlinger, W. (2002). Laser pointers as collaborative pointing devices. Proceedings of Graphics Interface 2002, 141-149. CHCCS and A K Peters.

Penny, W. D., & Roberts, J. S. (1999). Experiments with an EEG-based computer interface. Technical report, Workshop, Albany, USA.