Unconstrained pupil detection technique using two light sources and the image difference method

Yoshinobu Ebisawa
Faculty of Engineering, Shizuoka University, Johoku 3-5-1, Hamamatsu, Shizuoka, 432 Japan

Abstract

For developing a human-computer interface driven by eye gaze, we have already proposed a noncontact, unconstrained, video-based pupil detection technique using two light sources and the image difference method. The pupil position detected in the difference image is used together with the position of the glint (the corneal reflection of an infrared light source) to determine the eye-gaze position. In this paper, hardware for real-time image differentiation was developed. This image differentiator made real-time pupil detection possible when combined with the previously developed pupil detector, which includes a noise reducer. For stable detection of the glint and the pupil, it was clarified that the pupil brightness is influenced by the pupil area and by the power of the infrared light irradiating the eye. A method exploiting this characteristic was proposed for stabilizing the pupil brightness in the difference images, and it made pupil and glint center detection more stable.

1 Introduction

A pointing device driven by the eye-gaze position is desirable as an interface between humans and computer systems, especially for quick menu selection. Such a pointing device requires a noncontact and unconstrained eye-gaze detection method. For implementation, video-based methods are probably the most appropriate. In these methods, the eye is irradiated with infrared light and eye images are taken by a video camera. The eye-gaze position is detected from the relative positions of two feature points: the corneal reflection image and the pupil image. Hutchinson et al. [1], for example, irradiated infrared light from the camera axis and obtained a pupil image brighter than its surroundings. However, it was difficult to stably detect the pupil area under bad lighting conditions. To solve this problem, we have already proposed the pupil detection technique using two light sources and the image difference method [2]; its usefulness for stable pupil detection was demonstrated on still images. In this study, an image differentiator was developed to apply the proposed method in real time, and it was used together with the other devices we had already developed for pupil detection. In addition, a method for stabilizing pupil brightness was implemented for more stable pupil and glint detection.

2 The pupil detection technique

If the eye is illuminated by a near-infrared light source coaxial with the camera, the light enters the pupil, is reflected off the retina, and comes back out through the pupil. The pupil image then appears as a bright disc against a darker background, called the bright eye [1, 3], as shown in Fig. 1a. On the other hand, if the eye is illuminated by a light source uncoaxial with the camera, the pupil image appears as a darker area against the background, called the dark eye [4], as shown in Fig. 1b. Fig. 1d and e show the brightness distributions along horizontal lines in Fig. 1a and b, respectively. Under both bright and dark eye conditions, a fraction of the infrared light is reflected off the corneal surface and appears as a small intense area, called the glint [1], as shown in Fig. 1a, b, d and e.

Figure 1: Conceptual scheme of the pupil detection technique using two light sources and the image difference method. (a) Odd field image; (b) even field image; (c) difference image; (d) brightness distribution of the bright eye, with glint; (e) brightness distribution of the dark eye, with glint; (f) brightness distribution of the difference image, with the detection threshold.

The glint position remains almost fixed in the image field as long as the user's head remains stationary. In contrast, the pupil position moves in the image field, following the rotation of the eyeball. So, under both bright and dark eye conditions, the direction of gaze can be determined from the relative positions of the pupil and glint centers. Under these conditions, however, it is difficult to set a threshold for pupil detection because of the small difference in brightness between the pupil area and its surroundings.

In the pupil detection technique using two light sources and the image difference method [2], one light source, coaxial with the camera, is switched on during the odd fields of the video signal, and another, uncoaxial with the camera, is switched on during the even fields. Difference images are then obtained by subtracting the even field images from the consecutive odd field images. As a result, the background almost vanishes, as shown in Fig. 1c and f. Here, it is necessary to equalize the background brightness levels of the odd and even field images by controlling the power of the infrared light sources. The pupil area is detected by converting the difference image into a binary image with an appropriate threshold. The positional difference between the two light sources generates a small hole-like image on the pupil; this hole-like image will be described later.

3 What influences pupil brightness?

Pupil brightness in the difference images fluctuates considerably when the user shifts his line of sight among illuminated targets or menus. Moreover, the pupil brightness is generally low in a bright room (during daytime), and it becomes difficult to detect the pupil with a weak infrared light source. With a strong light source, however, the pupil image sometimes saturates the video signal in a dark room (at night). This saturation disturbs glint detection because the glint usually lies within the pupil image.
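In software terms, the subtraction and thresholding can be sketched as follows (a minimal NumPy illustration of the principle, not the real-time hardware described later; the image size, brightness values, and threshold are invented):

```python
import numpy as np

def difference_pupil_mask(odd_field, even_field, threshold):
    """Subtract the even (dark-eye) field from the odd (bright-eye) field,
    clamp negative values to zero, and binarize with a threshold.
    Returns (difference image, binary pupil mask)."""
    diff = odd_field.astype(np.int16) - even_field.astype(np.int16)
    diff = np.clip(diff, 0, 255).astype(np.uint8)  # negatives become completely dark
    mask = diff > threshold
    return diff, mask

# Tiny synthetic eye region: the pupil (central 3x3 block) is bright in the
# odd field and dark in the even field, while the background has the same
# brightness in both fields and therefore cancels out.
odd = np.full((5, 5), 80, np.uint8)
even = np.full((5, 5), 80, np.uint8)
odd[1:4, 1:4] = 180   # bright eye: pupil brighter than surroundings
even[1:4, 1:4] = 40   # dark eye: pupil darker than surroundings

diff, mask = difference_pupil_mask(odd, even, threshold=50)
print(int(mask.sum()))   # prints 9: only the pupil pixels survive
```

Because the equalized background subtracts to near zero, the threshold only has to separate the pupil from residual noise rather than from the full scene.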
In this section, the relationship between pupil brightness and pupil area is investigated. A standard interlaced-scanning CCD camera (NTSC, 30 frames per second) with near-infrared sensitivity was used. A zoom lens, a close-up lens, an extender, and an infrared pass filter (800 nm) were attached to the camera. A protecting filter was attached to the front of the zoom lens. Two holes were made in this filter and two infrared LEDs (TLN201, 880 nm) were inserted: one at the center of the filter and the other 23 mm away. The two LEDs were alternately switched by an even/odd signal derived from an image input board. The center LED (coaxial with the camera) was switched on during the odd fields, and the off-center LED (uncoaxial with the camera) during the even fields. The center LED current was set at 25-150 mA (9 levels), and the off-center LED current was then adjusted so that the brightness of the surroundings of the eye in the even field images matched that in the odd field images.

The subject's head and chin were restrained, and images of his right eye were obtained. The distance between the eye and the center LED was 62.5 cm. The zoom level was adjusted so that 1 cm in actual scale corresponded to 28% of the image width (aspect ratio 1:1). The infrared light power was measured near the eye with the sensor directed toward the center LED. A green LED (emitting no infrared light) was located 53 mm above the center of the front of the zoom lens. The subject was asked to fixate on this LED, and the pupil area was made to fluctuate deliberately by blinking it. The images were obtained under fluorescent lighting during daytime (about 950 lx) and nighttime (about 700 lx). In each trial, 10 consecutive odd and even field images were taken at a rate of one image per second using the image input board (aspect ratio 1:1), which had a resolution of 512 (H) x 256 (V) pixels in each field. A trial was conducted during daytime and nighttime at each infrared LED current. The images were stored on a hard disk and analyzed later by a personal computer (NEC PC9801DA2) through the image input board. The difference images were obtained by subtracting the even images from the odd images; where the difference was negative, the result was set to zero (completely dark). Furthermore, the high-brightness parts corresponding to the glint and the low-brightness parts corresponding to the background were eliminated using two appropriate thresholds.
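This analysis step (removing the glint and the background with two thresholds, then measuring pupil area and mean brightness) can be sketched as follows; the image contents and threshold values are synthetic, chosen only to illustrate the procedure:

```python
import numpy as np

def measure_pupil(diff, low_thr, high_thr):
    """Measure the pupil in a difference image.

    Pixels at or below low_thr (background) and at or above high_thr
    (glint) are excluded; the remaining pixels are treated as pupil.
    Returns (pupil area in pixels, mean brightness over the pupil area).
    """
    pupil = (diff > low_thr) & (diff < high_thr)
    area = int(pupil.sum())
    brightness = float(diff[pupil].mean()) if area > 0 else 0.0
    return area, brightness

# Synthetic difference image: background 0, pupil 45, saturated glint 250.
diff = np.zeros((8, 8), np.uint8)
diff[2:6, 2:6] = 45       # pupil region (16 pixels)
diff[3, 3] = 250          # glint inside the pupil
area, brightness = measure_pupil(diff, low_thr=10, high_thr=200)
print(area, brightness)   # prints: 15 45.0
```

The glint pixel is excluded from the average, so a saturated glint inside the pupil does not bias the brightness measurement.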
Then the pupil area (in pixels) and the pupil brightness level (the average over the pupil area) were calculated. Fig. 2 shows the relationship between pupil brightness and pupil area in the difference images for each infrared LED current; the infrared light power measured near the eye in darkness is indicated in parentheses. The data obtained from the daytime and nighttime experiments are plotted together. It can be seen that pupil brightness is almost proportional to pupil area, independently of the lighting conditions in the room. Moreover, the pupil becomes brighter as the LED current increases. These findings indicate that the pupil brightness in the difference image strongly depends on the size of the pupil area. The room light does not directly influence pupil brightness, although it influences it indirectly by changing the pupil area.

Pupil brightness variation makes it difficult to detect the pupil as well as the glint. The glints in the odd and even fields almost cancel each other out, so it is difficult to stably detect the glint in the difference image because of its small size and low brightness. The glint in the even fields is not much brighter than its surroundings, as shown in Fig. 1f. Moreover, in our current system, the off-center LED for the even field moves with the zoom level. For these reasons, we do not want to use the even fields for glint detection. The use of the odd fields, on the other hand, is advantageous for glint detection in the next stages of our development: under perfectly free head movement and with eyeglasses. To detect the glint in the odd fields, pupil brightness should be stabilized at as low a level as possible.

Figure 2: Relationship between pupil area and pupil brightness, for LED currents from 25 to 150 mA (the infrared light power near the eye is given in parentheses).

4 Real-time pupil detection

To differentiate two consecutive fields, we applied first-in-first-out (FIFO) memories. The block diagram of the image differentiator is shown in Fig. 3a. The analog video signal from the camera is converted to a 256-level digital brightness signal. An even/odd signal, generated by the synchronous separator, is used as a control signal for the FIFOs, which are prepared exclusively for the odd and even fields.

Now, assume the first odd field (O1) enters the image differentiator at time t1 (Fig. 3b). The odd FIFO receives O1 while the even FIFO does not. Next, when the first even field (E1) enters the device at time t2, the even FIFO receives E1 while the odd FIFO does not; at this time, O1 is output from the odd FIFO. When the second odd field (O2) is input into the odd FIFO (t3), O1 is output, and simultaneously the even FIFO outputs E1. At this time (t3), the first difference image (O1-E1) is output by the subtracter that follows. When the second even field (E2) enters (t4), E1 is output from the even FIFO and the odd FIFO outputs O2; the second difference image (O2-E1) is output. The two FIFOs alternately repeat these processes under the control of the even/odd signal, always outputting consecutive even and odd fields. As a result, a difference image (odd field minus even field) is obtained every 1/60 s. The delay from the input of the first field to the output of the first difference image is 1/30 s. The resolution of the output images is 640 (H) x 256 (V).

Figure 3: (a) Device and (b) algorithm for image differentiation. ADC: analog-to-digital converter; DAC: digital-to-analog converter; E/O: even/odd signal; FIFO: first-in-first-out memory; SUB: subtracter; SYNC SEP.: synchronous separator.

The video signal from the camera was input into the image differentiator, and its output was fed to the pupil detector. The pupil detector binarized the difference image signal with an appropriate threshold, controlled manually by the computer through a parallel input/output (PIO) board. The binarized image had a resolution of 256 (H) x 256 (V) pixels (aspect ratio 1:1).
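The field-pairing behaviour of the two FIFOs can be checked with a small simulation (a Python sketch in which field labels stand in for image data; the hardware, of course, subtracts digitized pixel streams):

```python
def differentiate(fields):
    """Simulate the double-FIFO image differentiator of Fig. 3 on a stream
    of fields alternating odd, even, odd, even, ...  Each FIFO delays its
    own parity by one frame, so at every field period the subtracter pairs
    the most recently completed odd field with the most recently completed
    even field: one difference per field (1/60 s), after a two-field
    (1/30 s) start-up delay."""
    last_odd = last_even = None
    out = []
    for i, field in enumerate(fields):
        if last_odd is not None and last_even is not None:
            out.append(f"{last_odd}-{last_even}")  # subtracter output
        else:
            out.append("invalid")                  # pipeline still filling
        # the arriving field is latched into its FIFO for the next period
        if i % 2 == 0:
            last_odd = field
        else:
            last_even = field
    return out

print(differentiate(["O1", "E1", "O2", "E2", "O3", "E3"]))
# prints: ['invalid', 'invalid', 'O1-E1', 'O2-E1', 'O2-E2', 'O3-E2']
```

The simulated sequence reproduces the timeline of Fig. 3b: two invalid fields, then O1-E1 at t3 and O2-E1 at t4.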
The binarized image was then compensated by the noise reducer, which applies mathematical morphology processing [5], whose basic operations are dilation and erosion. First, the binary pupil image was dilated by the structuring element shown in Fig. 4a (each small circle corresponds to an image pixel). The shape of this element approximates a circle (broken line). This dilation eliminates the hole-like image on the pupil (see Fig. 1f) and compensates for partially shadowed pupil images under bad conditions, although it also expands background noise. Next, the dilated image was eroded by the structuring element shown in Fig. 4b. This process eliminates all background noise because the structuring element for erosion is larger than that for dilation. This noise reducer has already been described [6]; its delay time is 1/30 s.

From the output image (256 (H) x 256 (V)), the x coordinates of the right and left edges (xr and xl) and the y coordinates of the top and bottom edges (yt and yb) of the pupil image were calculated by the pupil detector. These four coordinates were sent to the computer through the PIO board, and the pupil center coordinates (px, py) were determined as px = (xr + xl)/2 and py = (yt + yb)/2 every 1/60 s. The pupil detector also counted the number of pupil pixels and output it to the computer every 1/60 s. In parallel, the video signal from the camera was sent to the glint detector, which binarized this signal and output four coordinates analogous to those of the pupil detector. The glint center coordinates of the odd fields were calculated every 1/30 s. The threshold for glint detection was controlled manually by the computer through the PIO board.

Figure 4: Structuring elements for (a) dilation and (b) erosion.

To stabilize pupil brightness in the difference images, the relationship between the pupil area and a given LED current must be identified. Here, a brightness level of 45 was chosen as the target. Fig. 5 shows the identified relationship: for each current in Fig. 2, the pupil area at which the pupil brightness reached the level of 45 was calculated, the relationship between pupil area and LED current was plotted, and its regression curve was computed as shown in Fig. 5.
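Such a brightness-stabilizing controller can be sketched in software as follows (hypothetical Python: the curve coefficients below are placeholders, since the actual regression values belong to the calibration of Fig. 5, and the current limits simply reflect the 25-150 mA range used in Section 3):

```python
from collections import deque

def make_current_controller(curve, n_avg=10, i_min=25.0, i_max=150.0):
    """Return a controller that maps the running average of the pupil pixel
    count over the last n_avg difference images to an LED current (mA),
    using a calibrated area-to-current curve and clamping the result to the
    LED's operating range."""
    history = deque(maxlen=n_avg)

    def update(pupil_pixels):
        history.append(pupil_pixels)
        avg_area = sum(history) / len(history)
        return min(i_max, max(i_min, curve(avg_area)))

    return update

# Placeholder area-to-current curve: a larger (hence brighter) pupil gets
# less drive.  The real coefficients come from the regression in Fig. 5
# and are not reproduced here.
curve = lambda area: 1.0e4 / (area + 100.0) + 20.0

control = make_current_controller(curve)
currents = [control(area) for area in (500, 800, 1200, 2000)]
# the commanded current falls monotonically as the pupil dilates
```

Averaging over the last 10 difference images, as in the hardware, smooths out field-to-field measurement noise before the current is updated.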
The LED current was then controlled according to this curve, using the real-time pupil pixel number (averaged over 10 consecutive difference images) obtained from the pupil detector, through a D-A converter and two V-I converters. At the present stage of this study, the same current was always supplied to both LEDs. The two infrared LEDs were alternately switched by the personal computer, using the even/odd signal derived from the image differentiator. The subject was asked to fixate on the green LED while slowly moving his head horizontally or vertically. The experiments were conducted in a dark room (about 0.1 lx), because the pupil area fluctuates largely in response to the blinking green LED. While the pupil and glint centers were detected using the hardware described above, the video images from the camera were recorded on a video tape recorder. The other experimental conditions were the same as those described in Section 3.

Figure 5: Estimated curve for pupil brightness stabilization (regression of LED current against pupil area, for a desired pupil brightness level of 45).

In the first experiment, the LED current was not controlled. The recorded video tape was replayed every 10 frames, and these still images were input into the image input board for analysis. The pupil pixel number and the pupil brightness level were calculated after carefully determining an appropriate threshold for pupil detection. As shown in Fig. 6a, the pupil repeatedly constricted and dilated about three times over 10 s because of the blinking green LED. Accordingly, pupil brightness fluctuated (Fig. 6b). Under this condition, it was difficult to choose an appropriate threshold for stable pupil and glint detection.

In the second experiment, the LED current was controlled to stabilize the pupil brightness level at 45. Although the pupil area fluctuated largely, as shown in Fig. 7c, pupil brightness was almost stabilized at the desired level (Fig. 7d). Here, the pupil pixel number was obtained from the pupil detector. Simultaneously, the glint and pupil center positions were easily detected, as shown in Fig. 7a and b. The head was moved slowly, about 1.6 cm horizontally (X) during the first seven seconds and about 1.2 cm vertically (Y) during the remaining time. Since the subject was fixating on one point (the green LED), the relative positions of the glint and pupil centers almost never changed.

Figure 6: Pupil brightness depending on pupil area: (a) pupil area and (b) pupil brightness level over 10 s without current control.

Figure 7: Pupil brightness stabilization and real-time glint and pupil center detection: (a) glint position, (b) pupil position, (c) pupil area, and (d) pupil brightness level over 10 s with current control.
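For reference, the dilate-then-erode noise reduction and the edge-based center calculation performed in these experiments by the pupil detector hardware can be mimicked in software as follows (a plain-NumPy sketch; the square structuring elements and image size are illustrative, not the circular elements of Fig. 4):

```python
import numpy as np

def dilate(img, se):
    """Binary dilation: a pixel is set if the structuring element,
    centred there, overlaps any set pixel of the input."""
    kh, kw = se.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img)
    for dy in range(kh):
        for dx in range(kw):
            if se[dy, dx]:
                out |= padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def erode(img, se):
    """Binary erosion: a pixel survives only if the structuring element,
    centred there, fits entirely inside the set pixels of the input."""
    kh, kw = se.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.ones_like(img)
    for dy in range(kh):
        for dx in range(kw):
            if se[dy, dx]:
                out &= padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def pupil_center(mask):
    """Pupil centre from the bounding-box edges, as in the pupil detector:
    px = (xr + xl)/2, py = (yt + yb)/2."""
    ys, xs = np.nonzero(mask)
    return (xs.min() + xs.max()) / 2, (ys.min() + ys.max()) / 2

# Binary pupil image with a hole (from the light-source offset) and one
# isolated background-noise pixel.
img = np.zeros((12, 12), dtype=np.uint8)
img[3:9, 3:9] = 1        # pupil
img[5, 5] = 0            # hole-like image on the pupil
img[1, 10] = 1           # background noise
se_dilate = np.ones((3, 3), dtype=np.uint8)
se_erode = np.ones((5, 5), dtype=np.uint8)   # larger, so noise is removed
cleaned = erode(dilate(img, se_dilate), se_erode)
px, py = pupil_center(cleaned)   # recovers the centre of the original pupil
```

Dilation fills the hole (and enlarges the noise), while the larger erosion element then removes the noise entirely; the bounding-box centre of the cleaned mask matches that of the original pupil.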

The image differentiator output the difference images to the pupil detector every 1/60 s. This made it possible for the pupil detector to output the pupil coordinates every 1/60 s, although the glint detector output its coordinates only every 1/30 s. In the next stage of our study, we will develop the system for users wearing eyeglasses. In that system, a window masking the pupil's surroundings, moving with the pupil position, will be applied to eliminate reflections from the eyeglass lenses. The speed of the image differentiator will be advantageous for adjusting the window to the pupil position in real time.

Pupil brightness stabilization made glint and pupil detection more reliable. It also has other advantages. First, at present there is no LED that can produce adequately strong infrared light; stabilization at a low brightness level prevents deterioration of the LEDs. Second, the safety of infrared light has not been fully ascertained; minimizing the infrared light may help assure safety. Third, this method also minimizes the hole-like image on the pupil (see Fig. 1f), which in turn reduces noise.

5 Conclusions

An unconstrained, noncontact, video-based eye-gaze detection method was studied for use as a human-computer interface. At the present stage of this study, we concentrated on detecting the centers of the two feature points, the glint and pupil images, which are necessary for eye-gaze determination. In conventional methods, the pupil image in particular was difficult to detect. In this paper, hardware (the image differentiator) was developed to apply the pupil detection technique using two light sources and the image difference method, which we had already proposed. The combination of the image differentiator and the pupil detector made it possible to output the pupil coordinates every 1/60 s. In addition, the pupil brightness stabilization method was applied to the pupil detection technique.
The method made glint detection as well as pupil detection more reliable.

Acknowledgments

This research was partially supported by the Tateisi Science and Technology Foundation.

References

1. Hutchinson, T.E., White, Jr., K.P., Reichert, K.C. & Frey, L.A. Human-computer interaction using eye-gaze input, IEEE Transactions on Systems, Man, and Cybernetics, 1989, 19, 1527-1533.

2. Ebisawa, Y. & Satoh, S. Effectiveness of pupil area detection technique using two light sources and image difference method, Proceedings of the 15th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (eds A.Y.J. Szeto & R.M. Rangayyan), pp. 1268-1269, San Diego, U.S.A., 1993.

3. Charlier, J.R. & Hache, J.C. New instrument for monitoring eye fixation and pupil size during the visual field examination, Medical & Biological Engineering & Computing, 1982, 20, 23-28.

4. Meyers, A.M., Sherman, K.R. & Stark, L. Eye monitor: microcomputer-based instrument uses an internal model to track the eye, Computer, March 1991, 14-21.

5. Haralick, R.M., Sternberg, S.R. & Zhuang, X. Image analysis using mathematical morphology, IEEE Transactions on Pattern Analysis and Machine Intelligence, 1987, 9, 532-550.

6. Kojima, S., Ebisawa, Y. & Miyakawa, T. Fast morphology hardware using large size structuring element, Systems and Computers in Japan, 1994, 25, 6, 41-49.