Improvement of Accuracy in Remote Gaze Detection for User Wearing Eyeglasses Using Relative Position Between Centers of Pupil and Corneal Sphere

Kiyotaka Fukumoto, Takumi Tsuzuki, and Yoshinobu Ebisawa
Graduate School of Engineering, Shizuoka University, Shizuoka 432-8561, Japan
{fukumoto.kiyotaka,ebisawa.yoshinobu}@shizuoka.ac.jp

Springer International Publishing Switzerland 2015. M. Kurosu (Ed.): Human-Computer Interaction, Part II, HCII 2015, LNCS 9170, pp. 13-23, 2015. DOI: 10.1007/978-3-319-20916-6_2

Abstract. A general problem of pupil-corneal reflection-based gaze detection systems is that the frames and lenses of eyeglasses produce reflection images of the light sources in the camera image when a user wears eyeglasses. These glass reflections tend to be misdetected as the pupil and corneal reflections. In the present paper, we propose a novel geometrical methodology, based on the optical structure of the eyeball, for detecting a true pair of the pupil and corneal reflection. The experimental results show that the proposed method improved the precision of gaze detection when the subjects wore glasses or when disturbance light sources were present.

Keywords: Pupil · Corneal reflection · Corneal sphere center · Gaze detection

1 Introduction

Video-based gaze detection systems are beginning to be used in various fields such as entertainment [1], medicine [2], and driving safety support [3]. In a previous study, we developed a robust and precise pupil-corneal reflection-based gaze detection system using two light sources and the image difference method, which allows large head movements and easy user calibration [4, 5]. In this system, the optical system for detecting the pupils and corneal reflections consists of a camera and a light source attached to it, composed of two concentric near-infrared LED rings (an inner and an outer ring). The inner and outer LED rings generate bright and dark pupil images, respectively. The pupils are detected from a difference image created by subtracting the dark pupil image from the bright pupil image. In the difference image, a threshold for binarization to detect the pupils is easily determined automatically, because the pupils stand out from the relatively flat background. However, when the user moves their head, the pupil position differs between the bright and dark pupil images because of the acquisition time difference between the two images. As a result, the pupil position is not detected accurately. Therefore, in our system, the image difference processing is performed after shifting a small area (small window) containing each pupil in the dark pupil image, so that the corneal reflection in the dark pupil image coincides with that in the bright pupil image. We call this method the image difference method with positional compensation based on the corneal reflection, or the positionally compensated image difference (PCID) method [6].
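To make the image difference idea concrete, the following Python/OpenCV sketch performs the subtraction-and-binarization step on 8-bit grayscale frames captured under the inner (850 nm) and outer (940 nm) rings. The paper only states that the threshold is determined automatically, so the use of Otsu's method here, like the function name, is an illustrative assumption.

import cv2

def pupil_mask(bright, dark):
    """Emboss the pupils: the pupil is bright under the coaxial 850 nm
    ring and dark under the 940 nm ring, while the rest of the face looks
    nearly the same in both frames, so the difference is almost flat
    everywhere except at the pupils."""
    diff = cv2.subtract(bright, dark)  # 8-bit saturating subtraction
    # Automatic threshold (assumed: Otsu); feasible because the
    # background of the difference image is nearly flat.
    _, mask = cv2.threshold(diff, 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    return mask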

In addition, we proposed easy gaze calibration methods: the automatic, one-point, and two-point calibration methods [4]. In the one-point calibration method, the user only has to fixate on one target with known coordinates presented at the center of the PC screen. By this procedure, gaze points on the whole PC screen can be detected almost exactly.

However, when the user wears eyeglasses, the frames and lenses produce areas of various sizes, shapes, and intensities in the camera image, the so-called glass reflections. These reflections often show image features similar to the pupil and the corneal reflection and tend to be misdetected as one or the other. Reflections of tears and of disturbance light sources also cause misdetection. In the present paper, we propose a novel geometrical methodology based on the optical structure of the eyeball to detect a true pair of the pupil and corneal reflection, for accurate gaze detection even when the user wears glasses.

2 Our Gaze Detection System

2.1 System Configuration

Figure 1(a) shows an overview of the gaze detection system we developed. The system has two optical systems (Fig. 1(b)), each of which consists of a digital video camera (1/3-inch CMOS) with near-infrared sensitivity, a 16-mm lens, an infrared filter (IR80), and a light source. Each optical system was placed under a 19-in. liquid crystal display (screen size: 376.3 × 301.1 mm, 1,280 × 1,024 pixels). The light source, consisting of 3ϕ near-infrared LEDs arranged in two concentric rings, is attached to the camera. The wavelengths of the inner and outer rings were 850 and 940 nm, respectively. The pupil appears brighter under the 850 nm ring than under the 940 nm ring because the transmissivity of the eyeball media differs between the two wavelengths. The distance between the LEDs and the camera aperture also affects pupil brightness; the light source design exploits the combined effects of these transmissivity and distance differences. To reduce the effect of ambient light, the LED irradiation power on the user's face should be as strong as possible compared with the ambient light. Therefore, the LEDs were flashed while the camera shutter was open (shutter speed 500 μs); the current was approximately one ampere during LED flashing. The two cameras were driven with a slight synchronization difference (670 μs) to avoid mutual light interference between the optical systems. As a result, basically only one corneal reflection appears per eye in each image. An 8-bit grayscale image (640 × 480 pixels) of the user's face was input into a personal computer (PC; Intel Core i7, 3.20 GHz, 12 GB RAM) at 60 fps.

Fig. 1. (a) Our gaze detection system. (b) Optical system for detecting the pupil and corneal reflection.

2.2 Detection of Centers of Pupils and Corneal Reflections

Our Conventional Method for Detection of Pupils and Corneal Reflections. First, the pupils are searched for and detected in the difference image generated from the bright and dark pupil images.

The image is processed in the following order: binarization, removal of isolated pixels, noise reduction using mathematical morphology operations, and labeling. The largest and second-largest labeled regions are detected as the two pupils. When a pupil is undetected in the previous difference image (e.g., when the pupil is covered by a glass reflection), the pupils are searched for again in the whole of the current difference image. After two consecutive pupil detections, in order to perform the PCID method, the pupil positions in the current images are predicted using a linear Kalman filter, and a small window (70 × 70 pixels) is applied around each predicted pupil position. The image within the small window is transformed into a double-resolution (DR) image (140 × 140 pixels). An intense, tiny region closest to the center of the DR image is extracted, and its center of gravity, weighted by the pixel values in the region, is determined as the center of the corneal reflection. As described above, when the user's head is moving, the pupils cannot be obtained correctly because the pupil position differs between the bright and dark pupil images. Therefore, the DR difference image is generated after shifting the DR dark pupil image so that its corneal reflection coincides with that of the DR bright pupil image (the PCID method). When the corneal reflection is not detected in either the bright or the dark pupil image, the difference image is generated without positional compensation. After the areas whose image features resemble the pupil are labeled in the binarized DR difference image, the area nearest to the predicted pupil position is determined as the pupil in the current DR difference image. In this image, ellipse fitting of the pupil contour is performed, and the center of the ellipse is determined as the pupil center.
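The following sketch outlines the three steps just described: pupil labeling on the difference image, intensity-weighted localization of the corneal reflection in a double-resolution window, and the PCID shift-and-subtract. The morphology kernel, the near-maximum margin for the reflection, and the interpolation for upscaling are assumptions not specified in the paper, and selecting the region nearest the window center is omitted for brevity.

import cv2
import numpy as np

def detect_pupils(diff):
    """Binarize the difference image, clean it with morphology (opening
    removes isolated pixels, closing fills small holes), label connected
    regions, and return the centroids of the two largest as the pupils."""
    _, binary = cv2.threshold(diff, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = np.ones((3, 3), np.uint8)
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
    # Sort foreground labels (1..n-1) by area; keep the two largest.
    order = np.argsort(stats[1:, cv2.CC_STAT_AREA])[::-1] + 1
    return [centroids[i] for i in order[:2]]

def corneal_reflection_center(window):
    """Upsample a small window around the predicted pupil to double
    resolution and return the intensity-weighted center of gravity of
    the near-maximum pixels as the corneal reflection center."""
    dr = cv2.resize(window, None, fx=2, fy=2,
                    interpolation=cv2.INTER_LINEAR)
    thr = int(dr.max()) - 10               # assumed brightness margin
    ys, xs = np.nonzero(dr >= thr)
    w = dr[ys, xs].astype(np.float64)
    return np.array([np.dot(xs, w), np.dot(ys, w)]) / w.sum()

def pcid_difference(bright_win, dark_win, cr_bright, cr_dark):
    """PCID: shift the dark-pupil window so that its corneal reflection
    lands on the bright-pupil window's reflection, then subtract."""
    dx, dy = np.round(np.asarray(cr_bright) - np.asarray(cr_dark)).astype(int)
    m = np.float32([[1, 0, dx], [0, 1, dy]])
    h, w = dark_win.shape
    shifted = cv2.warpAffine(dark_win, m, (w, h))
    return cv2.subtract(bright_win, shifted)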

Proposed Geometrical Method for Determining a True Pair of the Pupil and the Corneal Reflection. When the user wears glasses, the glass reflections tend to be misdetected as the pupils and corneal reflections. In addition, the false images of disturbance light sources may be misdetected as the true corneal reflections of the system's light sources. We therefore propose a geometrical method for detecting a true pair of the pupil and corneal reflection. Assuming the corneal surface to be a sphere, the corneal sphere center is determined as shown in Fig. 2(a). We use the pinhole camera model and assume that the light source is located at the same position as the pinhole. Therefore, the corneal sphere center lies on the line connecting the pinhole and the corneal reflection on the image sensor. The 3D position of the corneal sphere center can be determined by stereo-matching the corneal reflections obtained from the two cameras. However, when one or both of the corneal reflections obtained from the two cameras are misdetected due to a glass reflection or a disturbance light source (Fig. 2(b) and (c)), the corneal sphere center is detected at a wrong position.

Fig. 2. (a) The two cameras each detect the true corneal reflection of the light source attached to the corresponding camera. (b) The left camera detects a true corneal reflection while the right camera detects a false reflection from an eyeglass lens. (c) The left camera detects the true corneal reflection produced by the light source attached to the left camera while the right camera detects a false reflection of a disturbance light source.

In the proposed method, m and n corneal reflection candidates per camera and per eye are extracted from the bright and dark pupil images, respectively; these include both true and false corneal reflections. The PCID method is performed for all m × n combinations of candidates. When the pupil is not detected, it is judged that at least one of the two paired candidates was not a true corneal reflection. When the pupil is detected, the pupil and corneal reflection pair used in that combination is retained as a pair candidate. The 3D positions of the pupil and corneal sphere centers are then detected by stereo-matching the pupils and the corneal reflections, respectively, of the remaining pair candidates obtained from the two cameras. One true pair of the pupil and corneal sphere centers is chosen by the following two conditions:

Condition I: the angle between the vector from the corneal sphere center to the pupil center and the vector from the pupil center to the middle point between the two cameras is within 40°. Since the gaze detection system only needs to cover the PC screen area, and considering the unknown angle difference between the visual and optical axes, we chose 40° so as to include the screen area.

Condition II: the distance d between the corneal sphere center and the pupil center satisfies D_C-P − 1.5 mm < d < D_C-P + 1.5 mm, where D_C-P is the distance obtained beforehand for the individual user.

Based on the chosen pair, the 3D pupil position and the coordinates of the pupil and corneal reflection in the camera image are obtained and used for gaze detection [5].
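Conditions I and II reduce to two scalar tests on each stereo-matched candidate pair. A minimal sketch, with illustrative names and the 40° and ±1.5 mm bounds taken from the text:

import numpy as np

def is_true_pair(cornea_center, pupil_center, camera_midpoint, d_cp,
                 max_angle_deg=40.0, tolerance_mm=1.5):
    """Return True if a stereo-matched (corneal sphere center, pupil
    center) candidate pair is geometrically plausible.
    cornea_center, pupil_center : 3D positions [mm] from stereo matching
    camera_midpoint             : midpoint of the two camera pinholes [mm]
    d_cp                        : the user's pre-measured D_C-P [mm]
    """
    cornea_center = np.asarray(cornea_center, float)
    pupil_center = np.asarray(pupil_center, float)
    camera_midpoint = np.asarray(camera_midpoint, float)
    v1 = pupil_center - cornea_center     # corneal sphere center -> pupil
    v2 = camera_midpoint - pupil_center   # pupil -> midpoint of cameras
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    d = np.linalg.norm(v1)                # corneal sphere-pupil distance
    # Condition I: the eye is oriented roughly toward the screen region.
    # Condition II: D_C-P - 1.5 mm < d < D_C-P + 1.5 mm.
    return angle <= max_angle_deg and abs(d - d_cp) < tolerance_mm

Pairs failing either test are discarded; the survivors provide the 3D pupil position and the image coordinates used for the gaze detection of Sect. 2.3.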

2.3 Gaze Detection Theory and Calibration Method [5]

In Fig. 3, O1 and O2 indicate the pinholes of the two calibrated cameras. The 3D pupil position P is obtained by stereo matching. As mentioned before, we assume that the light source attached to each camera is located at the same position as the corresponding camera. The line of sight (the visual axis of the eyeball) passes through the fovea on the retina, the pupil center P, and the gaze point Q on the screen plane of the PC display. We now define the virtual gaze planes H1 and H2 of the cameras for one eyeball. These planes are perpendicular to the line passing through P and O1 and to the line passing through P and O2, respectively, and they contain O1 and O2, respectively.

The X-axes (X1 and X2) of planes H1 and H2 are the intersections between the corresponding plane and the horizontal plane of the global coordinate system (x-y-z). H1 and H2 rotate according to the displacements of the pupil position. Next, we define the virtual gaze sphere S, whose center is P and whose radius is arbitrary. The visual axis PQ intersects sphere S and planes H1 and H2; the intersection points are denoted G, G1, and G2, respectively. Here, we define the angular vectors $\boldsymbol{\theta}_1$ and $\boldsymbol{\theta}_2$ on sphere S as the projections $\overrightarrow{O_1'G_1'}$ and $\overrightarrow{O_2'G_2'}$ of the ordinary vectors $\overrightarrow{O_1G_1}$ and $\overrightarrow{O_2G_2}$ on planes H1 and H2 onto sphere S. By projecting the horizontal axes X1 and X2 of planes H1 and H2 onto sphere S, the orientations $\phi_1$ and $\phi_2$ of the vectors $\overrightarrow{O_1'G_1'}$ and $\overrightarrow{O_2'G_2'}$ can also be defined on sphere S. From these projections, the angular vectors $\boldsymbol{\theta}_1$ and $\boldsymbol{\theta}_2$ can be determined using $\phi_1$ and $\angle O_1PG$, and using $\phi_2$ and $\angle O_2PG$, respectively. The angular vector $\overrightarrow{O_1O_2}$ (whose magnitude is $\angle O_1PO_2$) is then expressed as

$\overrightarrow{O_1O_2} = \boldsymbol{\theta}_1 - \boldsymbol{\theta}_2$   (1)

We assume a linear relationship between the actual-size vector $\mathbf{r}$ from the corneal reflection to the pupil center and the angle $\theta$ between the visual axis of the eyeball and the line connecting the pupil and the camera:

$\boldsymbol{\theta} = k\mathbf{r}$   (2)

where $\mathbf{r}$ is converted, using the pinhole model, from the vector from the corneal reflection center to the pupil center obtained from the camera image, and $k$ is a constant. In general, there is a difference between the optical and visual axes of the eyeball, so $\mathbf{r}$ is calculated by compensating the measured vector $\mathbf{r}'$ with an offset vector $\mathbf{r}_0$:

$\mathbf{r} = \mathbf{r}' - \mathbf{r}_0$   (3)

From Eqs. (2) and (3), the following equations hold for cameras 1 and 2:

$\boldsymbol{\theta}_1 = k\mathbf{r}_1 = k(\mathbf{r}_1' - \mathbf{r}_0)$   (4)

$\boldsymbol{\theta}_2 = k\mathbf{r}_2 = k(\mathbf{r}_2' - \mathbf{r}_0)$   (5)

From the above equations, $k$ is calculated as

$k = \dfrac{|\boldsymbol{\theta}_1 - \boldsymbol{\theta}_2|}{|\mathbf{r}_1' - \mathbf{r}_2'|} = \dfrac{\angle O_1PO_2}{|\mathbf{r}_1' - \mathbf{r}_2'|}$   (6)

Using the value of $k$, $\mathbf{r}_0$ is determined from Eqs. (4) and (5). Determining $k$ and $\mathbf{r}_0$ constitutes the user calibration.
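As a worked sketch of Eqs. (2)-(6), assuming the angular vectors are represented as 2D vectors on the gaze sphere and that $\boldsymbol{\theta}_1$ for the fixated calibration target is computable from the known geometry of P, O1, and the target; all names are illustrative:

import numpy as np

def calibrate(r1_meas, r2_meas, angle_o1_p_o2, theta1_known):
    """One-point calibration: solve Eqs. (4)-(6) for k and r0.
    r1_meas, r2_meas : measured pupil-corneal reflection vectors r'_1, r'_2
                       (actual size, via the pinhole model)
    angle_o1_p_o2    : angle O1-P-O2 [rad], from the stereo-measured pupil
                       position P and the calibrated camera pinholes
    theta1_known     : angular vector theta_1 for the known fixation target
    """
    k = angle_o1_p_o2 / np.linalg.norm(r1_meas - r2_meas)  # Eq. (6)
    r0 = r1_meas - theta1_known / k                        # from Eq. (4)
    return k, r0

def gaze_angle(r_meas, k, r0):
    """Eqs. (2)-(5): angular vector of the visual axis relative to the
    camera-pupil line, from a measured vector r'."""
    return k * (r_meas - r0)

def screen_gaze_point(pupil, axis_dir, screen_point, screen_normal):
    """Gaze point Q: intersection of the visual axis (origin at the pupil
    center P, direction axis_dir) with the screen plane."""
    t = (np.dot(screen_point - pupil, screen_normal)
         / np.dot(axis_dir, screen_normal))
    return pupil + t * axis_dir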

In the gaze detection procedure, the pupil-corneal reflection vectors $\mathbf{r}_1'$ and $\mathbf{r}_2'$ are first obtained from the images of the two cameras. Using Eqs. (4) and (5), $\boldsymbol{\theta}_1$ and $\boldsymbol{\theta}_2$ are calculated. Next, the visual axis is determined for each eye from the pupil position P, $\boldsymbol{\theta}_1$, and $\boldsymbol{\theta}_2$. Finally, the gaze point on the screen is estimated as the intersection point between the screen plane and the visual axis.

Fig. 3. Gaze detection theory using the virtual gaze sphere.

3 Experiments

3.1 Experiment 1: Measurement of Distance Between Corneal Sphere Center and Pupil Center

Method. In order to examine and determine the distance D_C-P used in Condition II, the 3D corneal sphere and pupil center positions of three university students who did not wear glasses were measured. In the calibration procedure, the subjects were asked to fixate on a calibration target presented at the center of the PC screen (the one-point calibration method). The distance between the eyes and the screen was approximately 80 cm. After the calibration procedure, the subjects fixated on a stationary target presented at the center of the screen and on a target moving slowly between the left and right edges of the screen. Using a chinrest, the subjects' heads were positioned at five positions: approximately 75, 80, and 85 cm from the PC screen, and 5 cm to the left and 5 cm to the right at 80 cm. In addition, subject A wore glasses and participated in the same experiment again.

Results. Figure 4(a) and (b) show the averages and SDs of the distance d between the pupil and corneal sphere centers at the five head positions when the subjects fixated on the stationary target and on the moving target, respectively. Although the distance d differed among the subjects, it did not depend on the head position or the gaze direction for any subject.

Figure 5 shows the results when subject A wore glasses and when he did not; almost the same values were obtained in both cases.

Fig. 4. Averages and SDs of the distance d between the pupil and corneal sphere centers [mm] at the five head positions (80 cm, left 5 cm, right 5 cm, 75 cm, 85 cm) and their average, for each subject (A, B, C): (a) when the subjects fixated on the stationary target; (b) when the subjects fixated on the moving target.

Fig. 5. Averages and SDs of the distance d between the pupil and corneal sphere centers at the five head positions when subject A wore glasses and when he did not.

3.2 Experiment 2: Gaze Detection When Subjects Wear Glasses

Method. This experiment was conducted to compare the precision of gaze detection between the proposed method and our previous method when subjects wore glasses. In the previous method, the corneal reflection nearest to the predicted pupil center was chosen for the PCID method. The values of m and n were both three. The subjects were three university students. In the one-point calibration procedure, the head direction of each subject was adjusted so that the lens reflections did not appear in the camera image.

After the procedure, the subjects, wearing glasses, fixated one by one on 25 (5 × 5) visual targets arranged evenly on the PC screen. The values of D_C-P had been obtained from each subject without glasses before this experiment.

Results and Discussion. Figure 6(a) and (b) show the gaze point distributions of the left eye for subject A in the proposed and the previous method, respectively. In the previous method, the dispersion of the gaze points was large compared with the proposed method, especially when the subject fixated on the lower targets. This was caused by misdetection of the pupil and/or the corneal reflection due to the glass reflections. In particular, when the subject fixated on targets 17 and 22, a glass reflection covered the left pupil; in the proposed method, no gaze point was detected for these targets. These results mean that the pupil and/or the corneal reflection were misdetected in the previous method, whereas the proposed method prevented these misdetections. Furthermore, 1.0 % of the gaze points fell outside the region shown in Fig. 6 in the previous method, compared with 0 % in the proposed method. The average and SD of the gaze error in visual angle for subject A were 1.24 ± 1.61 deg in the previous method and 1.08 ± 1.23 deg in the proposed method. The other two subjects showed similar results. The average gaze error over all three subjects was 1.26 ± 1.62 deg in the previous method and 1.14 ± 1.99 deg in the proposed method. These results indicate that the proposed method prevented the misdetection of the pupil and corneal reflection and selected a true pair of the pupil and corneal reflection.

Fig. 6. Detected gaze point distributions in the previous and proposed methods when subject A wore glasses. Dots and intersections of dotted lines indicate the gaze points and the visual target positions, respectively. The rectangular area enclosed by the broken lines indicates the PC screen.

3.3 Experiment 3: Gaze Detection When Disturbance Light Sources Generated False Corneal Reflections

Method. Four small disturbance light sources were installed at the four corners of the PC screen, generating false corneal reflections. The subjects were two university students who wore glasses. The calibration and gaze detection procedures were the same as in Experiment 2, and the distance between the eyes and the screen was approximately 80 cm.

Results and Discussion. Figure 7(a) and (b) compare the averaged gaze points of the left and right eyes for subject A between the previous and proposed methods. In the previous method, the gaze point dispersions were large for many of the 25 targets, whereas the proposed method showed smaller dispersions for almost all targets. The gaze error in the previous method for subject A was 3.08 ± 5.62 deg, whereas that of the proposed method was 1.23 ± 2.55 deg. Subject B showed errors of 5.31 ± 10.61 deg in the previous method and 2.43 ± 5.52 deg in the proposed method. These results indicate that the proposed method prevented the misdetection of the false corneal reflections produced by the disturbance light sources.

Fig. 7. Detected gaze points (average of the right and left eyes) in the previous and proposed methods when four disturbance light sources installed at the four corners of the PC screen generated false corneal reflections.

4 Conclusions

In our remote gaze detection system, in order to prevent the misdetection of the pupil and corneal reflection when a user wears glasses and/or when disturbance light sources exist, we proposed a novel geometrical method based on the optical structure of the eyeball. The experimental results showed that the proposed method detects a true pair of the pupil and corneal reflection and improves the accuracy of gaze detection when glass reflections or false corneal reflections from disturbance light sources appear in the camera image. The proposed method should also function well in other pupil-corneal reflection-based gaze detection systems.

References

1. Tobii Technology. http://www.tobii.com/
2. Bakker, N.M., Lenseigne, B.A.J., Schutte, S., Geukers, E.B.M., Jonker, P.P., van der Helm, F.C.T., Simonsz, H.J.: Accurate gaze direction measurements with free head movement for strabismus angle estimation. IEEE Trans. Biomed. Eng. 60(11), 3028-3035 (2013)
3. Tawari, A., Martin, S., Trivedi, M.M.: Continuous head movement estimator for driver assistance: issues, algorithms, and on-road evaluations. IEEE Trans. Intell. Transp. Syst. 15(2), 818-830 (2014)
4. Ebisawa, Y., Fukumoto, K.: Head-free remote eye-gaze detection system based on pupil-corneal reflection method with easy calibration using two stereo-calibrated video cameras. IEEE Trans. Biomed. Eng. 60(10), 2952-2960 (2013)
5. Ebisawa, Y., Fukumoto, K.: Head-free, remote gaze detection system based on pupil-corneal reflection method with using two video cameras - one-point and nonlinear calibrations. In: Kurosu, M. (ed.) HCII/HCI 2013, Part IV. LNCS, vol. 8007, pp. 205-214. Springer, Heidelberg (2013)
6. Nakashima, A., Ebisawa, Y., Nurikabe, Y.: Pupil detection using light sources of different wavelengths. J. Inst. Image Inf. Telev. Eng. 60(12), 2019-2025 (2006)
