BIOMEDICAL ENGINEERING - APPLICATIONS, BASIS & COMMUNICATIONS

POLAR COORDINATE MAPPING METHOD FOR AN IMPROVED INFRARED EYE-TRACKING SYSTEM

CHERN-SHENG LIN 1, HSIEN-TSE CHEN 1, CHIA-HAU LIN 1, MAU-SHIUN YEH 2, SHYAN-LUNG LIN 1
1 Department of Automatic Control Engineering, Feng Chia University, Taichung, Taiwan
2 Chung-Shan Institute of Science and Technology, Tao-Yuan, Taiwan

ABSTRACT

In this paper a polar coordinate mapping method for an improved infrared eye-tracking system is described. In the proposed control system for an eye-tracking device, users do not need to wear any equipment. Instead, an infrared light source and an infrared CCD camera extract the eye images for the computer to analyze, record eye-movement traces and pupil diameters, control the mouse cursor, and operate many kinds of application programs. The advantage of this system lies in satisfying the non-contact requirement of a measurement system that needs whole-eye monitoring and a quick response. We also improve the feasibility and safety of this eye-tracking device by using infrared rays and a new coordinate mapping method. Biomed Eng Appl Basis Comm, 2005(June); 17: 141-146.

Keywords: polar coordinates, eye-tracking, infrared light

1. INTRODUCTION

Information technology and networks are developing rapidly, and personal and notebook computers are commonly used by people of all ages for personal data processing, internet games, and exchanging information. Computer operation is quite convenient and easy, and many kinds of input interfaces in addition to the traditional keyboard and mouse can assist operators with different demands [1-3]. Recently, human interface systems have been researched and developed for handicapped persons as well as the general public.
The eye-controlled system uses eye-tracking technology with special software and hardware so that users can operate a computer by controlling the cursor with their eyes. In addition, there are numerous other kinds of auxiliary devices to assist handicapped people in communicating, entertainment, or work [4-5]. Since eye-controlled systems are still very expensive, most handicapped persons cannot afford them, so we wish to reduce the cost for them. Moreover, we found that previous pupil trackers require the user to wear a spectacle-style headset and irradiate the eyes with visible light, which can make the user feel uncomfortable and can easily damage the eyes. Therefore, this research uses a head-worn infrared light (the short-distance, HMD-wearing eye-controlled system) or infrared light irradiating from a long distance (the remote eye-controlled system, which does not require the user to wear any device to use the pupil tracker), and reduces the possible harm from these two infrared light sources.

Received: Aug 7, 2004; Accepted: April 28, 2005
Correspondence: Chern-Sheng Lin, Professor, Department of Automatic Control Engineering, Feng Chia University, Taichung, Taiwan
E-mail: cslin@auto.fcu.edu.tw

2. POLAR COORDINATES MAPPING METHOD

First, the user gazes at the center point of the screen, so we can obtain the corresponding point C in
the eye image, as shown in Fig. 1. When the user gazes at point L_a, we obtain pupil center point L1 in the eye image. When the user gazes at point L_b, we obtain pupil center point L2; gazing at point L_c gives L3, and gazing at point L_d gives L4. The gaze point L0 corresponding to a measured pupil center point Lx in the eye image can then be determined from these five calibration points. Working in polar coordinates about C, each calibration point Li defines an angle θi, and the gaze point is computed sector by sector with equations (1)-(8): if θ1 ≤ θx ≤ θ2, the gaze point L0 is calculated from the calibration points bounding that sector; the same method applies when θ2 ≤ θx ≤ θ3 and when θ3 ≤ θx ≤ θ4. When θx < θ1 or θx > θ4, the sector wraps around, so we let θ5 = θ1 + 2π, and if the resulting angle satisfies θ0 ≥ 2π, we let θ0 = θ0 − 2π.

To obtain the positions of the 5 points, the user gazes at the upper, left, center, right and lower parts of the screen while the CCD camera captures the images. The status of the eyeballs is shown in Fig. 2. Using these 5 eyeball statuses with formulas (1) to (8) determines which point of the screen the eyeballs are gazing at, and compiling the related programs then enables the user to control the cursor with his/her eyeballs. This method can be used in the remote eye-controlled system and also in the short-distance (HMD-wearing) eye-controlled system.

4. SOFTWARE AND HARDWARE STRUCTURE OF THE EYE-CONTROLLED SYSTEM

First, the procedure of the dynamic-image-searching algorithm for the remote eye-controlled system is shown in Fig. 3. The specifications of the infrared LED used, the JY-IR940-A, are as follows:

Light wavelength: 940 nm
Half light output power: 100 mW
Half power angle: 20°
Package: 5 mm diameter
Working voltage: 5 V

This research uses infrared light to determine the position of the user's eyeballs, so the degree to which the light can harm the human body must be considered.
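The sector-wise polar mapping described in Section 2 can be illustrated with a short sketch. The interpolation below (linear in angle and in radial scale between adjacent calibration points) is an assumption standing in for equations (1)-(8), which are not reproduced here; all function and variable names are illustrative.

```python
import math

def to_polar(p, c):
    """Return (r, theta) of point p relative to center c, theta in [0, 2*pi)."""
    dx, dy = p[0] - c[0], p[1] - c[1]
    return math.hypot(dx, dy), math.atan2(dy, dx) % (2 * math.pi)

def map_gaze(px, eye_center, eye_cal, scr_center, scr_cal):
    """Map a pupil center px (eye-image coordinates) to a screen point.

    eye_cal / scr_cal: the four calibration points in increasing angular
    order around the center, in eye-image / screen coordinates.  Linear
    interpolation within the angular sector containing px is assumed.
    """
    r, th = to_polar(px, eye_center)
    cal = [to_polar(p, eye_center) for p in eye_cal]
    n = len(cal)
    for i in range(n):
        j = (i + 1) % n
        th1, th2 = cal[i][1], cal[j][1]
        if th2 < th1:                 # sector wrapping past 2*pi
            th2 += 2 * math.pi
            if th < th1:
                th += 2 * math.pi
        if th1 <= th <= th2:
            t = (th - th1) / (th2 - th1)          # angular fraction in sector
            # radial scale (screen radius / eye radius) at each bounding ray
            s1 = to_polar(scr_cal[i], scr_center)[0] / cal[i][0]
            s2 = to_polar(scr_cal[j], scr_center)[0] / cal[j][0]
            scale = (1 - t) * s1 + t * s2
            # same angular fraction between the screen calibration rays
            sth1 = to_polar(scr_cal[i], scr_center)[1]
            sth2 = to_polar(scr_cal[j], scr_center)[1]
            if sth2 < sth1:
                sth2 += 2 * math.pi
            sth = (1 - t) * sth1 + t * sth2
            sr = r * scale
            return (scr_center[0] + sr * math.cos(sth),
                    scr_center[1] + sr * math.sin(sth))
    return scr_center
```

With unit calibration offsets in the eye image and 100-pixel offsets on an 800x600 screen, a pupil center halfway between two calibration rays maps to the corresponding halfway direction on the monitor.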
Generally, the strictest standard for harm due to light is that of medical laser systems [6]. Because a laser beam is highly condensed and produces a high optical power density, it can seriously damage the human body. Therefore, this system relies on the medical laser safety standard to determine the maximum limitation of the infrared LED light intensity for the eyes. For an infrared wavelength of λ = 940 nm, the wavelength correction factor is

C_A = 10^(0.002(λ - 700)) = 10^(0.002(940 - 700)) = 10^0.48 ≈ 3.02

With an irradiating time of t = 1000 s (about 16.7 min) and the infrared light irradiating the user's eyes directly from a long distance, the exposure limits are

Energy density limitation = 1.8 · C_A · t^(3/4) mJ/cm²  (9)
Power density limitation = Energy density limitation / time = 1.8 · C_A · t^(-1/4) mW/cm²  (10)

It would be overly conservative to apply the laser standard to infrared rays directly. For general specifications, the spectral width of the infrared LED beam is 50 nm while that of a single-mode laser ray is 5 nm, so the power density of the laser ray is much greater than that of the infrared LED; the laser ray's width is 1/10 that of the infrared LED. Therefore, setting the light power limitation for the infrared LED at 10 times the class-1 laser power limitation of Z136.1 prevents damage to the user's eyes from the infrared light.
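The exposure-limit arithmetic above, read as the ANSI Z136.1-style maximum permissible exposure rule for 700-1050 nm, can be checked with a short script. This is a sketch under that reading of equations (9)-(10), not the paper's own code:

```python
def correction_factor(wavelength_nm):
    """Wavelength correction factor C_A = 10**(0.002*(lambda - 700))
    for 700-1050 nm, as used in the text for lambda = 940 nm."""
    return 10 ** (0.002 * (wavelength_nm - 700))

def power_density_limit(wavelength_nm, t_s):
    """Class-1 power density limit in mW/cm^2: 1.8 * C_A * t**(-1/4),
    i.e. the energy density limit 1.8 * C_A * t**(3/4) mJ/cm^2
    divided by the exposure time t in seconds."""
    return 1.8 * correction_factor(wavelength_nm) * t_s ** -0.25
```

For λ = 940 nm and t = 1000 s this reproduces C_A ≈ 3.02 and the 0.967 mW/cm² class-1 limit quoted in the next paragraph, so the 10× LED allowance becomes 9.67 mW/cm².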
By the calculation above, the power density limitation of the infrared LED is 9.67 mW/cm², because the power density limitation of class-1 laser rays at λ = 940 nm is 0.967 mW/cm². The distance between the user's eyes and the infrared light is 300 mm, and the irradiated area and distance of the infrared rays are as shown in Fig. 4(A). With the 0.25 cm radius of the 5 mm package of the JY-IR940-A infrared LED and its 20° half-power angle, the irradiated area at the eyeballs' surface is calculated as

A = π (0.25 + 30 · tan 20°)² = π (11.17)² ≈ 391.91 cm²  (11)

The maximum output power of the JY-IR940-A is 100 mW, so the infrared power density on the eyeballs' surface is

Power density = Output power / irradiated area = 100 / 391.91 = 0.255 mW/cm²  (12)

This power density (0.255 mW/cm²) is smaller than the safety-standard power density (9.67 mW/cm²), which indicates that the infrared LED (JY-IR940-A) conforms to the output power safety limitation.

The eye-controlled system with reflected light is discussed next, as shown in Fig. 4(B). The infrared light irradiates the user's eyes via a reflector, so the energy density limitation and the corresponding power density limitation are obtained in the same way (equation (13)), giving a class-1 power density limitation of 2.446 mW/cm² at λ = 940 nm. Again defining the power limitation of the infrared LED as 10 times the class-1 laser power limitation of Z136.1, the power density limitation of the infrared LED is calculated to be 24.46 mW/cm², which prevents the user's eyes from being damaged by the infrared light. The distance from the infrared light through the reflector to the user's eyes is 60 mm, and the area of the eyeball's surface is calculated from the irradiating area and distance of the infrared LED.
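The beam-spot arithmetic for both the long-distance (300 mm) and the short-distance (60 mm) cases can be reproduced as follows. The cone-spreading formula π(r_led + d·tan θ)² is an assumption, but it matches the 391.91 cm² and 0.255 mW/cm² figures quoted above and the 5.37 mW/cm² figure that follows:

```python
import math

def irradiated_area_cm2(distance_cm, half_angle_deg=20.0, led_radius_cm=0.25):
    """Beam-spot area at the eyes: pi * (r_led + d * tan(theta))**2.
    The 20 degree half-power angle and 0.25 cm package radius come from
    the JY-IR940-A specification quoted in the text."""
    r = led_radius_cm + distance_cm * math.tan(math.radians(half_angle_deg))
    return math.pi * r * r

def power_density_mw_cm2(output_mw, distance_cm):
    """Power density at the eyes for a given LED output power."""
    return output_mw / irradiated_area_cm2(distance_cm)
```

At 300 mm (30 cm) the 100 mW output spreads over about 391.9 cm², giving 0.255 mW/cm²; at 60 mm (6 cm) the spot shrinks to about 18.6 cm², giving 5.37 mW/cm².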
The 0.25 cm radius of the 5 mm JY-IR940-A package again spreads over part of the eyeballs' area, and the infrared energy density irradiated onto the eyeballs' surface by the JY-IR940-A is 5.37 mW/cm². Therefore, the calculated power density, 5.37 mW/cm², is much smaller than the safety standard, 24.46 mW/cm², which means that the output power of the infrared LED (JY-IR940-A) is within the safe zone.

5. EXPERIMENT RESULT

The eye-tracking system is then improved into the remote eye-tracking system. An infrared CCD built under the screen replaces the CCD originally set under the headwear spectacles, so that the performance of acquiring images at long distance can be evaluated. In addition, not only is the long-distance eye-tracking device developed, but the short-distance device is also improved by abandoning the visible light and fitting it with an infrared light. For image acquisition, the infrared auxiliary light irradiates the user's eyeballs and the infrared CCD camera acquires the images. An AV-USB adapter transforms the camera signals to the USB 2.0 interface of the computer, and finally the computer program controls the mouse.

Fig. 1. Polar coordinate mapping relationship between eye polar coordinates (left) and monitor polar coordinates (right)

Fig. 2. Positions of the eyeball (up, left, middle, right, down)
Fig. 3. The procedure of the dynamic-image-searching algorithm

Fig. 4(A) The area of the eyeball surface irradiated by the infrared rays, and the long-distance eye-tracking device

Fig. 4(B) The area of the eyeball surface illuminated via the reflector, and the short-distance eye-tracking device

In this eye-tracking system, the track of eyeball movement can be recorded. When the user gazes at a specific area of the monitor, more eye-track coordinates are located in that area. To test the accuracy and the achievable recording interval of the eye-tracking coordinates, the user gazes at the four corners of the monitor. Taking Fig. 5 as an example, with an 800×600 monitor resolution, the user gazes at the four corners of the monitor while the system takes samples for 30 seconds. The number of samples per second is then increased in order to find the shortest sampling interval.
Fig. 5. Sample chart of eyeball tracking

Table 1. Testing results of the eye-tracking coordinates

In the first sampling, the time interval is set to one second between two samplings. The system takes 30 samples, and the eye-tracking coordinates of each sampling are transformed into an Excel file. These coordinates are then shown in an X-Y chart; the lines between the coordinates are the track of eye movement, and the results are shown in Fig. 6(A). In the second example, the time interval is 0.5 second and 300 data samples are collected, as shown in Fig. 6(B). In the last sampling, the time interval is 0.03 second and 1000 data samples are collected, as shown in Fig. 6(C).

Fig. 6(A) 30-coordinate data chart of eyeball tracking

Fig. 6(B) 300-coordinate data chart of eyeball tracking

Fig. 6(C) 1000-coordinate data chart of eyeball tracking

From the statistics in Table 1, it can be found that the system records the coordinates of the eye track precisely from the beginning, within the time of the user's operation. As the time interval between samplings is decreased, the system still works properly down to an interval of 0.04 second (sampling 25 times per second). At a 0.03-second interval with 1000 data collected, postponements appear around the tenth sampling; the system is clearly delayed, and in total it takes 50 seconds to finish this sampling. Thus, the shortest time interval in this eye-tracking system is 0.04 second, indicating that the system can take 25 samples per second.
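The sampling-and-export loop described above could be sketched as follows. The names record_track and get_gaze are hypothetical, and a CSV file stands in for the Excel file used in the paper:

```python
import csv
import time

def record_track(get_gaze, interval_s=0.04, duration_s=30.0, path="track.csv"):
    """Record gaze coordinates at a fixed sampling interval and write
    them to a CSV file for plotting as an X-Y track.

    get_gaze() is a hypothetical callback returning the current (x, y)
    gaze point on the monitor (e.g. within an 800x600 resolution).
    An interval of 0.04 s corresponds to 25 samples per second, the
    shortest interval the paper found the system could sustain.
    """
    samples = []
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        samples.append(get_gaze())
        time.sleep(interval_s)
    with open(path, "w", newline="") as f:
        csv.writer(f).writerows(samples)
    return samples
```

Running with a shorter duration and a fixed gaze point shows the basic behavior; in the real system the delays observed at 0.03 s would come from the image-processing pipeline, which this sketch does not model.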
6. DISCUSSION

The projected light for the eye-controlled system is infrared light instead of visible light, which makes the user feel more comfortable. Moreover, the proposed polar coordinate mapping method of this study enables users to choose the eye-controlled system according to their needs for controlling the mouse cursor, recording eyeball tracks, or operating any programs. This proposed infrared eye-controlled human interface is designed for persons with whole-body paralysis, because they can only express themselves and communicate by moving their eyes. This eye-tracking system enables these persons to express themselves through a computer and even use the Internet to access the latest technology and information.

ACKNOWLEDGEMENTS

This work was sponsored by the National Science Council, Taiwan, Republic of China under grant number NSC 92-2515-S-035-002 and the Chung-Shan Institute of Science and Technology under grant number BV91U14P.

REFERENCES

1. Yu-Luen Chen, Fuk-Tan Tang, Walter H. Chang, May-Kuen Wong, Ying-Ying Shih, and Te-Son Kuo, "The New Design of an Infrared-Controlled Human Computer Interface for the Disabled", IEEE Transactions on Rehabilitation Engineering, 1999, 7(4): 471-481.
2. Siegmund Pastoor, Jin Liu, and Sylvain Renault, "Experimental Multimedia System Allowing 3-D Visualization and Eye-Controlled Interaction Without User-Worn Devices", IEEE Transactions on Multimedia, 1999, 1(1): 42-52.
3. D. Gareth Evans, Roger Drew, and Paul Blenkhorn, "Controlling Mouse Pointer Position Using an Infrared Head-Operated Joystick", IEEE Transactions on Rehabilitation Engineering, 2000, 8(1): 107-117.
4. Margrit Betke, James Gips, and Peter Fleming, "The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People With Severe Disabilities", IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2002, 10(1): 1-10.
5.
Yu-Luen Chen, "Application of Tilt Sensors in Human-Computer Mouse Interface for People With Disabilities", IEEE Transactions on Neural Systems and Rehabilitation Engineering, 2001, 9(3): 289-294.
6. David Sliney and Myron Wolbarsht, "Safety with Lasers and Other Optical Sources", Plenum Press, New York and London, 1980, 65-151.