PupilMouse: Cursor Control by Head Rotation Using Pupil Detection Technique


Yoshinobu Ebisawa, Daisuke Ishima, Shintaro Inoue, Yasuko Murayama
Faculty of Engineering, Shizuoka University, Hamamatsu, Japan

ABSTRACT

We propose PupilMouse, a system that controls the cursor on a PC screen through the movements of one or two pupils in a camera image, caused by head movements. The system judges whether each eye is open or closed by detecting the presence of the corresponding pupil, and these states are mapped to the button states of an ordinary two-button mouse. A pair of infrared illuminators makes it easy to detect the two pupils. The image processing is executed using only a general-purpose image input board and a PC. One advantage of this system is that no marker or sensor needs to be attached to the head. Another is that clicking and dragging are possible without any additional device. This human interface should be especially useful for physically handicapped people who can rotate their heads to control a PC.

Keywords: pupil detection, handicapped, human interface, cursor control, button control

1 INTRODUCTION

Even the severely handicapped can control electric home appliances and communicate with the outside world via the internet if they can use a PC. Many severely handicapped people can still move their heads, and in many cases they operate a PC by pushing keys with a mouthstick. In the GUI environment, however, mouse operation is as important as key operation. Conversely, if such users can operate a mouse emulator, repeated cursor movements and clicks also enable character input in any application software through a software keyboard presented on the PC screen. Therefore, a good device that moves the cursor with head movement should let handicapped people operate a PC easily.
So far, head movement-based cursor control devices using ultrasonic sensors [1], an infrared LED [2], a tilt sensor [3], and a magnetic sensor have been proposed. In all of these, a sensor must be attached to the head, which becomes a burden for users. Among such devices, HeadMouse [4] and NaturalPoint [5] have come onto the market. In these, a special marker (a seal) is usually attached to the forehead; a camera detects the marker position, and its motion is reflected in the cursor motion. Although attaching the marker is itself somewhat problematic, the biggest problem is that the device provides no way to click. To work around this, a click is executed when the cursor dwells for a short period, e.g., 1 s, within the area the user wants to click. However, when the user carelessly stops the head elsewhere, this method produces a click in an unintended area. The system developed by Betke et al. has been commercialized as CameraMouse [6]. It tracks a predetermined portion of the face by template matching, with the advantage that nothing need be attached to the head. However, the system itself still has no way to click. Betke et al. addressed this by defining areas where no click is executed even if the cursor stops [7], but since the cursor must be moved to those areas intentionally, this may not be user-friendly. We have been developing a non-contact, video-based eye-gaze detection system [8]. Highly precise eye-gaze detection requires precise detection of the 3D eye positions. We therefore focused on the fact that pupils are relatively easy to detect in a face image when the face is irradiated with near-infrared light, detected the 3D positions of the two pupils by the stereo camera method, and obtained good results.
The good results came from the image processing method applied to each camera image and the near-infrared irradiation method, which together made it possible to detect the pupil centers in the face image precisely and stably [9]. With this pupil detection technique, the right and left pupils can be distinguished, and both their center coordinates and their presence in the image can be detected accurately. Since opening and closing of the eyes can also be detected with this technique, clicks and drags can be implemented through eye opening and closing. In this paper, we propose PupilMouse, in which the motion of the pupil(s) in the image is reflected in the motion of the mouse cursor, and the opening and closing of the eyes are reflected in the button states of the mouse.

2 METHODS

Pupil Center Detection Technique

If the near-infrared LEDs arranged around the aperture of the camera, as shown in Fig. 1, are switched on, an image with bright pupils (the bright pupil image) is obtained, as shown in Fig. 2(a). If the same kind of LEDs arranged apart from the aperture of the camera are switched on, an image with dark pupils (the dark pupil image) is obtained (Fig. 2(b)). Subtracting the dark pupil image from the bright pupil image makes it easy to detect the pupils, because almost all of the background disappears, as shown in Fig. 2(c). Similar pupil detection techniques have been described elsewhere [8][10]-[12]. In the present system, the two sets of LEDs were switched on alternately, synchronously with the video odd/even field signal. Using an image grabber board, the image from the camera was digitized and transferred into the memory of a PC. The PC performed the image processing, including the image difference, and detected the center coordinates and the presence of the two pupils. However, the face contour also appears in the difference image, and both the contour and the glass reflection (the LED light reflected off corrective eyeglasses) can be misidentified as pupils. To avoid this: (1) The face area was detected from the bright pupil image, utilizing the fact that the face area is brighter than the background. (2) Once the pupils were detected, small windows were placed around them (see Fig. 2(d)-(f)). The initial pupil detection procedure was repeated every frame while the pupils were not detected. In this procedure, by subtracting the odd image from four frames earlier from the current odd image, the eye areas were extracted within the face area as a binary image. From the difference image (the current bright pupil image minus the current dark pupil image) within the eye areas, the small area containing the highest-intensity pixel was detected as the primary pupil. After masking this area, the secondary pupil was detected in the same way. Thus, initial detection of the pupils exploited a blink. (3) In subsequent frames, the pupil positions were predicted using a Kalman filter, and the small windows were applied at the predicted positions.
The pupil was then searched for within each window. The pupil center was determined at high resolution using a pupil parameter obtained from a combination of the separability [13] and the image intensity. (4) Eye closing was judged from the presence of the pupil within the window, which was identified using the pupil parameter and a predetermined constant threshold. The detailed image processing method is described in [9]. The average calculation time for detecting the coordinates of the two pupils was approximately 10 ms on a PC (Pentium 4, 2.5 GHz). To diminish the influence of ambient light, a band-pass filter (half-intensity width: 17 nm) with the same central wavelength as the LEDs was attached to the front of the camera lens.

Fig. 1. Arrangement of LEDs.
Fig. 2. Images and detection of pupils.

Scheme of PupilMouse

As shown in Fig. 3(a), movements of the centers of the two pupils in the camera image, caused by shifting or rotating the head, shift the cursor on the PC screen. Whether the eyes are open or closed is determined from the presence of the pupils. A wink of the right or left eye executes the corresponding right or left click. Closing one eye and moving the head starts a drag, and subsequently opening the closed eye ends the drag (Fig. 3(b)). Cursor movement during dragging follows the movement of the center of the pupil of the open eye. During a blink, both pupils disappear; a blink therefore produces neither cursor movement nor a button push.

Fig. 3. Correspondence between pupil center shifts in the camera image and cursor shifts on the PC screen.
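The bright/dark image-difference step at the heart of the pupil detection above can be sketched as follows. This is a minimal illustration, not the implementation of [9]: the function name, the window size, and the threshold value are assumptions, and the centroid step stands in for the separability-based center estimation.

```python
import numpy as np

def detect_pupil_candidates(bright, dark, threshold=40, max_candidates=2):
    """Sketch of image-difference pupil detection.

    `bright` and `dark` are grayscale frames (uint8 arrays) captured with
    the on-axis and off-axis LED sets, respectively. Subtracting the dark
    pupil image from the bright pupil image cancels most of the background,
    leaving the retroreflective pupils as bright blobs.
    """
    diff = bright.astype(np.int16) - dark.astype(np.int16)
    diff = np.clip(diff, 0, 255).astype(np.uint8)

    centers = []
    work = diff.copy()
    for _ in range(max_candidates):
        # Seed on the brightest remaining pixel (the paper similarly
        # seeds on the highest-intensity pixel of the difference image).
        y, x = np.unravel_index(np.argmax(work), work.shape)
        if work[y, x] < threshold:
            break  # nothing bright enough left: eye probably closed
        # Intensity-weighted centroid in a small window around the seed
        y0, y1 = max(0, y - 8), min(work.shape[0], y + 9)
        x0, x1 = max(0, x - 8), min(work.shape[1], x + 9)
        win = work[y0:y1, x0:x1].astype(np.float64)
        ys, xs = np.mgrid[y0:y1, x0:x1]
        total = win.sum()
        centers.append((float((xs * win).sum() / total),
                        float((ys * win).sum() / total)))
        work[y0:y1, x0:x1] = 0  # mask this pupil, search for the second
    return centers
```

Masking the first detected blob before searching for the second mirrors the primary/secondary pupil procedure described above.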

Correspondence of Pupil Detection States to Button Push States

As shown in Table 1, there are four combinations of detection states for the right and left pupils: (1) both pupils detected, (2) right pupil detected and left pupil undetected, (3) left pupil detected and right pupil undetected, and (4) neither pupil detected. These four pupil detection states are mapped to the button push states of a general-purpose two-button mouse: (1) neither button pushed (up, up), (2) only the left button pushed (up, down), and (3) only the right button pushed (down, up). State (4) occurs when the user blinks. Fortunately, the condition in which both buttons of a two-button mouse are pushed (down, down) is not used, so in case (4) neither button is treated as pushed, as in case (1). The difference is that the cursor can move in case (1) but does not move in case (4). In addition, the following processing handles blinks. When one of the buttons was judged as pushed in the previous frame and neither pupil is detected in the current frame, a blink may have occurred; in this case the system does not immediately judge that neither button is pushed. Only if the same pupil detection state persists for a while is the button push state judged to have changed. Moreover, even in an ordinary blink the two pupils may not disappear or reappear simultaneously; a time difference of one or two frames can arise. If the pupil detection states were mapped directly to the button push states, the system would act against the user's intention. Therefore, for any transition of the pupil detection state, the corresponding new button push state is adopted only after the new pupil detection state has persisted for three frames (0.1 s).

Table 1. Correspondence of pupil detection states to button push states.
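The state mapping and the three-frame confirmation described above can be sketched as a small debouncing state machine. The class and method names are illustrative assumptions; the mapping itself follows Table 1 (a closed left eye corresponds to a pushed left button, and a full blink to no button).

```python
# (right_pupil_detected, left_pupil_detected) -> (left_down, right_down)
STATE_TO_BUTTONS = {
    (True, True):   (False, False),  # both eyes open: no button pushed
    (True, False):  (True, False),   # left eye closed (wink): left button
    (False, True):  (False, True),   # right eye closed: right button
    (False, False): (False, False),  # blink: treated as no button pushed
}

class ButtonDebouncer:
    """A new pupil detection state must persist for `hold` consecutive
    frames (three frames, about 0.1 s, in the paper) before the button
    push state is allowed to change."""

    def __init__(self, hold=3):
        self.hold = hold
        self.buttons = (False, False)
        self._pending = None
        self._count = 0

    def update(self, right_detected, left_detected):
        target = STATE_TO_BUTTONS[(right_detected, left_detected)]
        if target == self.buttons:
            # Same as current output: discard any pending change.
            self._pending, self._count = None, 0
        elif target == self._pending:
            self._count += 1
            if self._count >= self.hold:
                self.buttons = target
                self._pending, self._count = None, 0
        else:
            # A different candidate state: start counting afresh.
            self._pending, self._count = target, 1
        return self.buttons
```

Because a one- or two-frame glitch never reaches the hold count, the asynchronous disappearance of the two pupils during an ordinary blink does not toggle the buttons, as the text requires.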
Correspondence of Pupil Movement to Cursor Movement

When both eyes are open, as shown in Fig. 3(a), the change between frames of the average position of the two pupil centers is reflected in the cursor movement between frames. Two pupil centers are used instead of one to make the cursor movement more stable. Since one eye is closed during a drag, the movement of the pupil of the open eye is then reflected in the cursor movement. To relate pupil center movement to cursor movement, the following linear functions can be used:

    Cx = Kx * Px
    Cy = Ky * Py        (1)

where Px and Py are the horizontal and vertical pupil center movements between frames, and Cx and Cy are the horizontal and vertical cursor movements between frames. Kx and Ky are constants that can be freely adjusted by the user for ease of use. Now, suppose the user moves the cursor to a target position and then closes one eye. Although it is easy to bring the cursor near the target, making it coincide exactly with the target takes some strain, and the user's head may vibrate slightly, which in turn makes the cursor vibrate. So we also tried the following nonlinear functions:

    Cx = Kx * Px^b
    Cy = Ky * Py^b      (2)

With b > 1, the cursor moves greatly for a quick pupil motion but shows almost no reaction to a slow pupil motion. Eq. (2) is written this way for clarity; in practice, the following sign-preserving forms were used:

    Cx = Kx * |Px|^(b-1) * Px
    Cy = Ky * |Py|^(b-1) * Py        (3)

In the experiments, b = 2 (a quadratic function) was used.

Experiments

All processing described below was executed on a PC (Pentium 4, 2.5 GHz). The camera and light sources can be placed anywhere, e.g., above or beside the display; in this experiment they were placed below the 17-inch PC display, as shown in Fig. 4. Three students with normal vision served as subjects.
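The linear and nonlinear pupil-to-cursor mappings of Eqs. (1)-(3) can be sketched as one function. The function name is an illustrative assumption; the default gains echo the experimental values quoted in this paper, with b = 1 reducing to Eq. (1) and b = 2 giving the quadratic form actually used.

```python
def cursor_delta(px, py, kx=6.0, ky=8.0, b=2.0):
    """Pupil-to-cursor mapping of Eq. (3):
    Cx = Kx * |Px|**(b-1) * Px, and likewise for y.

    Writing the exponent as |P|**(b-1) * P keeps the sign of the pupil
    movement for any b, which is why Eq. (3) is used in place of the
    plainer Eq. (2).
    """
    cx = kx * (abs(px) ** (b - 1.0)) * px
    cy = ky * (abs(py) ** (b - 1.0)) * py
    return cx, cy
```

With b > 1, small (slow) pupil movements are attenuated while large (quick) ones are amplified, which is the damping behavior the text relies on to keep the cursor steady near a target.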
They were seated approximately 70 cm from the display. Face images such as those in Fig. 5 were observable on the display screen, and the states of pupil detection were checked by observing them. During the experiments, however, a window as shown in Figs. 6 and 7 was extended over the whole screen. The range over which the cursor moved within the window (the white portion) was 330 x 240 mm, corresponding to 1276 x 935 pixels. By the image processing of [9], the center coordinates of the two pupils in each frame were determined with sub-pixel resolution in both the horizontal and vertical directions. The following two experiments, a click experiment and a drag experiment, were conducted to show the usefulness of PupilMouse.

Click experiment: In every trial, the cursor and Target 1 appeared at random positions in the window (Fig. 6). The subject was asked to move the cursor to the target by head movement and then to close his or her left eye (to click). The sizes of the target and the cursor were 12 x 12 mm (45 x 45 pixels) and 5 x 5 mm (18 x 18 pixels), respectively. When the upper left corner of the cursor entered the area of the target, the color of the target was reversed. This reversal let the subject know

that the system was ready for a click. If the click occurred while the target was reversed, the click succeeded, the trial was completed, and the next trial started immediately. When a click failed, the trial continued until a click succeeded. Three experimental sessions were conducted for each subject, each consisting of 10 trials. Subjects A and B participated in this type of experiment for the first time; subject C was one of the developers of the system and was familiar with the experiment. Experiments were conducted both with the linear function (Kx = 12, Ky = 14 in Eq. (1)) and with the quadratic function (b = 2, Kx = 6, Ky = 8 in Eq. (2)). These values were chosen commonly for the three subjects so that they could move the cursor comfortably to the edges of the screen. For each of the linear and quadratic conditions, the first session started after approximately two minutes of training. The same click experiment was also conducted using an ordinary mouse.

Drag experiment: In every trial, the cursor and Targets 1 and 2 appeared at random positions (Fig. 7). The cursor and Target 1 were the same as in the click experiment. The size of Target 2 was 14 x 12 mm (52 x 45 pixels). The subject was asked to move the cursor to Target 1 and then to close the left eye. By this procedure (the same as in the click experiment), Target 1 was attached to the cursor and the drag started. Next, the subject was asked to move the cursor, with the target, to Target 2 by head movement, keeping the left eye closed. When the upper left corner of Target 1 entered the area of Target 2, the color of Target 2 was reversed, and the subject was asked to open the closed left eye. If the eye was judged open while the target was reversed, the drag succeeded, the trial was completed, and the next trial started immediately.
When a drag failed, the trial was repeated until it succeeded. Three sessions were conducted for each subject, each consisting of five trials. For the relationship between the pupil and cursor motions, the quadratic function was used with Kx = 6 and Ky = 8 in Eq. (2).

Fig. 4. PupilMouse system.
Fig. 5. Face images for checking pupil detection.
Fig. 6. PC screen in the click experiment.
Fig. 7. PC screen in the drag experiment.

3 RESULTS

Click Experiment

Table 2 shows the time from the appearance of Target 1 until click completion. Averaged over the three subjects, PupilMouse took approximately 3 s, while the ordinary mouse took 1.0 s. However,

subject C achieved 2.2 s with PupilMouse, which can be considered quite fast. The required time hardly differed between the linear and quadratic functions. However, averaged over the three subjects, the number of misclicks for the quadratic function was half that for the linear function. In actual PC operation, a misclick means that the user clicked the wrong position and needs extra time to correct the error, so reducing the number of misclicks is important. Figs. 8 and 9 show samples of the cursor trajectories of subject C for the linear and quadratic functions, respectively. In these figures, the small dots depict the cursor positions in every frame, the squares indicate the positions where Target 1 appeared, and the numbers indicate the trial order. During large cursor movements, the distances between adjacent dots were longer for the quadratic function (Fig. 9) than for the linear function (Fig. 8). Within and around the target, the cursor positions were widely dispersed for the linear function but tightly clustered for the quadratic function. This difference is one reason the number of misclicks decreased for the quadratic function compared with the linear function.

Drag Experiment

Table 3 shows the average time required from the start of a drag (capturing Target 1) until its end (releasing the target), including the time taken when the target was released and recaptured partway through a drag (a drag failure). As in the click experiment, subject C was faster than the other two subjects. For all subjects, the required time in the drag experiment was longer than in the click experiment. Fig. 10 shows the cursor trajectories from the start to the end of the drags; the types of squares indicate the positions where Targets 1 and 2 appeared.

Fig. 8. Sample of cursor trajectories in the click experiment (linear function).
Fig. 9. Sample of cursor trajectories in the click experiment (quadratic function).

Table 2. Results of click experiment (N=30).

              Time required for one click       Number of misclicks
  Subject   Linear   Quadratic   Ordinary     Linear   Quadratic   Ordinary
  A         3.8 s    3.6 s       0.9 s
  B         2.8 s    3.3 s       1.0 s
  C         2.2 s    2.2 s       1.1 s
  Mean      2.9 s    3.0 s       1.0 s

Table 3. Results of drag experiment (N=15).

  Subject   PupilMouse (Quadratic)   Ordinary Mouse
  A         4.8 s                    0.9 s
  B         4.7 s                    1.3 s
  C         2.4 s                    1.1 s
  Mean      4.0 s                    1.1 s

Fig. 10. Sample of cursor trajectory in the drag experiment (quadratic function).

4 DISCUSSION

The results for subject C were good enough to use PupilMouse in both the click and drag experiments. The other two subjects were naïve to these experiments;

for them, it was difficult to rotate the head obliquely. Accordingly, they tended to rotate the head horizontally first and then vertically, or vice versa. Such two-step head movements seem to be one reason these two subjects were slower than subject C. The cursor trajectory in Fig. 8 corresponds to the frame-by-frame shifts of the two pupils; this indicates that the pupil centers were detected precisely and stably. Precise and stable detection of the pupil centers is the most important requirement for PupilMouse, and it demands appropriate face illumination and image processing methods; the details of the pupil detection method used in the present study are described in [9]. Since the pupil center signal was not smoothed by a moving average, the subjects perceived almost no delay of the cursor movement relative to the head movement. There were cases in which a click failed even though the subject closed or opened the eye after confirming that Target 1 or 2 had reversed in color. One reason is the use of the cursor position at the moment the pupil was judged to have just disappeared or reappeared. That is, several frames elapse between the moment the user decides to close or open one eye and the moment the corresponding pupil is judged to have disappeared or reappeared, and during this time the eye or the head may move, causing the cursor to shift after the user intended to click. This problem was mitigated to some extent by using the quadratic function for the relationship between the pupil and cursor motions. It could be solved almost completely by using the cursor position from several frames before the moment the pupil was judged to have disappeared or reappeared.

5 CONCLUSION

At the present stage of this study, PupilMouse has the problem that many misclicks occur. This problem should be solvable by the simple means described above. PupilMouse has several advantages as a pointing device.
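The fix suggested in the discussion, using the cursor position from a few frames before the pupil-state judgment, could be implemented with a short position history. This is a sketch under assumptions: the class name and the four-frame lag are illustrative, not values from the paper.

```python
from collections import deque

class CursorHistory:
    """Keep the last few cursor positions so that, when a pupil
    disappearance or reappearance is finally judged, the click can use
    the cursor position from `lag` frames earlier, before the eyelid or
    head motion disturbed it."""

    def __init__(self, lag=4):
        self.lag = lag
        self._buf = deque(maxlen=lag + 1)  # current frame + `lag` older ones

    def push(self, pos):
        # Called once per frame with the current cursor position (x, y).
        self._buf.append(pos)

    def position_at_decision(self):
        # Oldest buffered position, roughly the cursor `lag` frames ago
        # (or the earliest available position while the buffer fills).
        return self._buf[0]
```

Using the buffered position for the click target while still drawing the live cursor would decouple the judged click point from any late head movement.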
First, it can be used with no sensor or marker attached to the face. Second, no calibration is necessary, unlike line-of-sight pointing devices. Third, anyone can use the system immediately just by placing his or her face in the camera frame, because no initial setting for the image processing (e.g., threshold setting) is needed. Fourth, owing to the infrared light sources, PupilMouse can be used even in complete darkness. In the present report, the gains (Kx, Ky) between pupil motion and cursor motion were not changed among subjects; however, it is possible to choose better gains adjusted to each subject. Moreover, for use by the severely handicapped, mapping the pupil position to the cursor position may be better than mapping the pupil movement to the cursor movement; we want to examine this. For the experiments in the present study, we developed special programs, which cannot operate the cursor in other application software. Making PupilMouse emulate an ordinary mouse is one of the future subjects. Furthermore, we want to enable character input in application software by combining a software keyboard with PupilMouse, to have handicapped people use it, and to confirm its usefulness.

ACKNOWLEDGMENT

The present research was supported by the Cooperative Link of Unique Science and Technology for Economy Revitalization (CLUSTER) in the Hamamatsu area (Hamamatsu Optronics Cluster).

REFERENCES

[1] M. Nunoshita, Y. Ebisawa, Head pointer based on ultrasonic position measurement, Proc. of the Second Joint EMBS/BMES Conference, 2002.
[2] D.G. Evans, R. Drew, and P. Blenkhorn, Controlling mouse pointer position using an infrared head-operated joystick, IEEE Trans. Rehab. Eng., Vol. 8, No. 1, 2000.
[3] Y.-L. Chen, Application of tilt sensors in human-computer mouse interface for people with disabilities, IEEE Trans. Neural Syst. & Rehab. Eng., Vol.
9, No. 3, 2001.
[4] HeadMouse.
[5] NaturalPoint.
[6] CameraMouse.
[7] M. Betke, J. Gips, and P. Fleming, The Camera Mouse: Visual tracking of body features to provide computer access for people with severe disabilities, IEEE Trans. on Neural Syst. and Rehab. Eng., Vol. 10, No. 1, 2002.
[8] Y. Ebisawa, Improved video-based eye-gaze detection method, IEEE Trans. Instr. and Meas., Vol. 47, No. 4, 1998.
[9] Y. Ebisawa, Realtime 3D position detection of human pupil, Proc. of the IEEE International Conference on Virtual Environments, Human-Computer Interfaces, and Measurement Systems (VECIMS 2004), Boston, MA, USA, July 2004.
[10] A. Tomono, M. Iida, Y. Kobayashi, A TV camera system which extracts feature points for non-contact eye movement detection, Proc. SPIE Optics, Illumination, and Image Sensing for Machine Vision IV, Vol. 1194, 1989.
[11] C.H. Morimoto, D. Koons, A. Amir, M. Flickner, Pupil detection and tracking using multiple light sources, Image and Vision Computing, Vol. 18, 2000.
[12] Q. Ji, X. Yang, Real-time eye, gaze, and face pose tracking for monitoring driver vigilance, Real-Time Imaging, Vol. 8, 2002.
[13] K. Fukui, Edge extraction method based on separability of image features, IEICE Trans. Inf. & Syst., Vol. E78-D, No. 12, 1995.


More information

Spring 2005 Group 6 Final Report EZ Park

Spring 2005 Group 6 Final Report EZ Park 18-551 Spring 2005 Group 6 Final Report EZ Park Paul Li cpli@andrew.cmu.edu Ivan Ng civan@andrew.cmu.edu Victoria Chen vchen@andrew.cmu.edu -1- Table of Content INTRODUCTION... 3 PROBLEM... 3 SOLUTION...

More information

, ARNON AMIR, MYRON FLICKNER

, ARNON AMIR, MYRON FLICKNER SIBGRAPI 99, XII BRAZILIAN SYMPOSIUM IN COMPUTER GRAPHICS AND IMAGE PROC., CAMPINAS, BRAZIL, OCTOBER, 1999 171 CARLOS MORIMOTO, DAVID KOONS Keeping an Eye for HCI, ARNON AMIR, MYRON FLICKNER, SHUMIN ZHAI

More information

Eye Gaze Tracking With a Web Camera in a Desktop Environment

Eye Gaze Tracking With a Web Camera in a Desktop Environment Eye Gaze Tracking With a Web Camera in a Desktop Environment Mr. K.Raju Ms. P.Haripriya ABSTRACT: This paper addresses the eye gaze tracking problem using a lowcost andmore convenient web camera in a desktop

More information

Swept-Field User Guide

Swept-Field User Guide Swept-Field User Guide Note: for more details see the Prairie user manual at http://www.prairietechnologies.com/resources/software/prairieview.html Please report any problems to Julie Last (jalast@wisc.edu)

More information

Seishi IKAMI* Takashi KOBAYASHI** Yasutake TANAKA* and Akira YAMAGUCHI* Abstract. 2. System configuration. 1. Introduction

Seishi IKAMI* Takashi KOBAYASHI** Yasutake TANAKA* and Akira YAMAGUCHI* Abstract. 2. System configuration. 1. Introduction Development of a Next-generation CCD Imager for Life Sciences Research Seishi IKAMI* Takashi KOBAYASHI** Yasutake TANAKA* and Akira YAMAGUCHI* Abstract We have developed a next-generation CCD-based imager

More information

Eye Contact Camera System for VIDEO Conference

Eye Contact Camera System for VIDEO Conference Eye Contact Camera System for VIDEO Conference Takuma Funahashi, Takayuki Fujiwara and Hiroyasu Koshimizu School of Information Science and Technology, Chukyo University e-mail: takuma@koshi-lab.sist.chukyo-u.ac.jp,

More information

Embroidery process - EC on PC

Embroidery process - EC on PC 66 Software EC on PC Embroidery process - EC on PC PC display Motif the motif is displayed in color the smallest possible hoop for the selected motif is displayed the needle position (marked by a fine

More information

The introduction and background in the previous chapters provided context in

The introduction and background in the previous chapters provided context in Chapter 3 3. Eye Tracking Instrumentation 3.1 Overview The introduction and background in the previous chapters provided context in which eye tracking systems have been used to study how people look at

More information

Table of Contents 1. Image processing Measurements System Tools...10

Table of Contents 1. Image processing Measurements System Tools...10 Introduction Table of Contents 1 An Overview of ScopeImage Advanced...2 Features:...2 Function introduction...3 1. Image processing...3 1.1 Image Import and Export...3 1.1.1 Open image file...3 1.1.2 Import

More information

PC Eyebot. Tutorial PC-Eyebot Console Explained

PC Eyebot. Tutorial PC-Eyebot Console Explained Sightech Vision Systems, Inc. PC Eyebot Tutorial PC-Eyebot Console Explained Published 2005 Sightech Vision Systems, Inc. 6580 Via del Oro San Jose, CA 95126 Tel: 408.282.3770 Fax: 408.413-2600 Email:

More information

Patents of eye tracking system- a survey

Patents of eye tracking system- a survey Patents of eye tracking system- a survey Feng Li Center for Imaging Science Rochester Institute of Technology, Rochester, NY 14623 Email: Fxl5575@cis.rit.edu Vision is perhaps the most important of the

More information

Development of an Automatic Measurement System of Diameter of Pupil

Development of an Automatic Measurement System of Diameter of Pupil Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 22 (2013 ) 772 779 17 th International Conference in Knowledge Based and Intelligent Information and Engineering Systems

More information

An Embedded Pointing System for Lecture Rooms Installing Multiple Screen

An Embedded Pointing System for Lecture Rooms Installing Multiple Screen An Embedded Pointing System for Lecture Rooms Installing Multiple Screen Toshiaki Ukai, Takuro Kamamoto, Shinji Fukuma, Hideaki Okada, Shin-ichiro Mori University of FUKUI, Faculty of Engineering, Department

More information

The original image. Let s get started! The final result.

The original image. Let s get started! The final result. Miniature Effect With Tilt-Shift In Photoshop CS6 In this tutorial, we ll learn how to create a miniature effect in Photoshop CS6 using its brand new Tilt-Shift blur filter. Tilt-shift camera lenses are

More information

Hohner Harmonica Tuner V5.0 Copyright Dirk's Projects, User Manual. Page 1

Hohner Harmonica Tuner V5.0 Copyright Dirk's Projects, User Manual.  Page 1 User Manual www.hohner.de Page 1 1. Preface The Hohner Harmonica Tuner was developed by Dirk's Projects in collaboration with Hohner Musical Instruments and is designed to enable harmonica owners to tune

More information

Demonstration of a Frequency-Demodulation CMOS Image Sensor

Demonstration of a Frequency-Demodulation CMOS Image Sensor Demonstration of a Frequency-Demodulation CMOS Image Sensor Koji Yamamoto, Keiichiro Kagawa, Jun Ohta, Masahiro Nunoshita Graduate School of Materials Science, Nara Institute of Science and Technology

More information

Direct gaze based environmental controls

Direct gaze based environmental controls Loughborough University Institutional Repository Direct gaze based environmental controls This item was submitted to Loughborough University's Institutional Repository by the/an author. Citation: SHI,

More information

Be aware that there is no universal notation for the various quantities.

Be aware that there is no universal notation for the various quantities. Fourier Optics v2.4 Ray tracing is limited in its ability to describe optics because it ignores the wave properties of light. Diffraction is needed to explain image spatial resolution and contrast and

More information

AgilEye Manual Version 2.0 February 28, 2007

AgilEye Manual Version 2.0 February 28, 2007 AgilEye Manual Version 2.0 February 28, 2007 1717 Louisiana NE Suite 202 Albuquerque, NM 87110 (505) 268-4742 support@agiloptics.com 2 (505) 268-4742 v. 2.0 February 07, 2007 3 Introduction AgilEye Wavefront

More information

CREATING A COMPOSITE

CREATING A COMPOSITE CREATING A COMPOSITE In a digital image, the amount of detail that a digital camera or scanner captures is frequently called image resolution, however, this should be referred to as pixel dimensions. This

More information

Near Infrared Face Image Quality Assessment System of Video Sequences

Near Infrared Face Image Quality Assessment System of Video Sequences 2011 Sixth International Conference on Image and Graphics Near Infrared Face Image Quality Assessment System of Video Sequences Jianfeng Long College of Electrical and Information Engineering Hunan University

More information

INTRODUCTION TO CCD IMAGING

INTRODUCTION TO CCD IMAGING ASTR 1030 Astronomy Lab 85 Intro to CCD Imaging INTRODUCTION TO CCD IMAGING SYNOPSIS: In this lab we will learn about some of the advantages of CCD cameras for use in astronomy and how to process an image.

More information

CHAPTER 9 POSITION SENSITIVE PHOTOMULTIPLIER TUBES

CHAPTER 9 POSITION SENSITIVE PHOTOMULTIPLIER TUBES CHAPTER 9 POSITION SENSITIVE PHOTOMULTIPLIER TUBES The current multiplication mechanism offered by dynodes makes photomultiplier tubes ideal for low-light-level measurement. As explained earlier, there

More information

SCATT Biathlon shooting trainer User s Manual

SCATT Biathlon shooting trainer User s Manual SCATT Biathlon shooting trainer User s Manual Russia, Moscow, ZAO SCATT Internet: www.scatt.com E-mail: info@scatt.com Tel/Fax: +7 (499) 70 0667 Please read the User s Manual before installation, operation,

More information

Development of Hybrid Image Sensor for Pedestrian Detection

Development of Hybrid Image Sensor for Pedestrian Detection AUTOMOTIVE Development of Hybrid Image Sensor for Pedestrian Detection Hiroaki Saito*, Kenichi HatanaKa and toshikatsu HayaSaKi To reduce traffic accidents and serious injuries at intersections, development

More information

Spectroradiometer CS-2000/2000A. The world's top-level capability spectroradiometers make further advances with addition of second model to lineup.

Spectroradiometer CS-2000/2000A. The world's top-level capability spectroradiometers make further advances with addition of second model to lineup. Spectroradiometer CS-000/000A The world's top-level capability spectroradiometers make further advances with addition of second model to lineup. 15 World's top level capability to detect extremely low

More information

How Many Pixels Do We Need to See Things?

How Many Pixels Do We Need to See Things? How Many Pixels Do We Need to See Things? Yang Cai Human-Computer Interaction Institute, School of Computer Science, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA ycai@cmu.edu

More information

FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION

FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION Revised November 15, 2017 INTRODUCTION The simplest and most commonly described examples of diffraction and interference from two-dimensional apertures

More information

FEATURE. Adaptive Temporal Aperture Control for Improving Motion Image Quality of OLED Display

FEATURE. Adaptive Temporal Aperture Control for Improving Motion Image Quality of OLED Display Adaptive Temporal Aperture Control for Improving Motion Image Quality of OLED Display Takenobu Usui, Yoshimichi Takano *1 and Toshihiro Yamamoto *2 * 1 Retired May 217, * 2 NHK Engineering System, Inc

More information

Spectroradiometer CS-2000/2000A. The world's top-level capability spectroradiometers make further advances with addition of second model to lineup.

Spectroradiometer CS-2000/2000A. The world's top-level capability spectroradiometers make further advances with addition of second model to lineup. Spectroradiometer /000A The world's top-level capability spectroradiometers make further advances with addition of second model to lineup. World's top level capability to detect extremely low luminance

More information

The student will: download an image from the Internet; and use Photoshop to straighten, crop, enhance, and resize a digital image.

The student will: download an image from the Internet; and use Photoshop to straighten, crop, enhance, and resize a digital image. Basic Photoshop Overview: Photoshop is one of the most common computer programs used to work with digital images. In this lesson, students use Photoshop to enhance a photo of Brevig Mission School, so

More information

The ideal K-12 science microscope solution. User Guide. for use with the Nova5000

The ideal K-12 science microscope solution. User Guide. for use with the Nova5000 The ideal K-12 science microscope solution User Guide for use with the Nova5000 NovaScope User Guide Information in this document is subject to change without notice. 2009 Fourier Systems Ltd. All rights

More information

Fourier Transformation Hologram Experiment using Liquid Crystal Display. Kenji MISUMI, Yoshikiyo KASHII, Mikio MIMURA (Received September 30, 1999)

Fourier Transformation Hologram Experiment using Liquid Crystal Display. Kenji MISUMI, Yoshikiyo KASHII, Mikio MIMURA (Received September 30, 1999) Mem. Fac. Eng., Osaka City Univ., Vol. 40, pp. 85-91 (1999) Fourier Transformation Hologram Experiment using Liquid Crystal Display Kenji MISUMI, Yoshikiyo KASHII, Mikio MIMURA (Received September 30,

More information

Quick Operation Guide

Quick Operation Guide Quick Operation Guide Power ON Mounting specimens Set the specimen on the sample holder, and install the sample holder to the holder frame. Attach the holder frame to the XY stage. Type of holder Main

More information

CMOS Image Sensor for High Speed and Low Latency Eye Tracking

CMOS Image Sensor for High Speed and Low Latency Eye Tracking This article has been accepted and published on J-STAGE in advance of copyediting. ntent is final as presented. IEICE Electronics Express, Vol.*, No.*, 1 10 CMOS Image Sensor for High Speed and Low Latency

More information

Development of an Education System for Surface Mount Work of a Printed Circuit Board

Development of an Education System for Surface Mount Work of a Printed Circuit Board Development of an Education System for Surface Mount Work of a Printed Circuit Board H. Ishii, T. Kobayashi, H. Fujino, Y. Nishimura, H. Shimoda, H. Yoshikawa Kyoto University Gokasho, Uji, Kyoto, 611-0011,

More information

OPERATION MANUAL MIMAKI ENGINEERING CO., LTD.

OPERATION MANUAL MIMAKI ENGINEERING CO., LTD. OPERATION MANUAL MIMAKI ENGINEERING CO., LTD. http://www.mimaki.co.jp/ E-mail:traiding@mimaki.co.jp D200674 About FineCut for CorelDRAW Thank you very much for purchasing a product of Mimaki. FineCut,

More information

Motion Lab : Relative Speed. Determine the Speed of Each Car - Gathering information

Motion Lab : Relative Speed. Determine the Speed of Each Car - Gathering information Motion Lab : Introduction Certain objects can seem to be moving faster or slower based on how you see them moving. Does a car seem to be moving faster when it moves towards you or when it moves to you

More information

Efficient Construction of SIFT Multi-Scale Image Pyramids for Embedded Robot Vision

Efficient Construction of SIFT Multi-Scale Image Pyramids for Embedded Robot Vision Efficient Construction of SIFT Multi-Scale Image Pyramids for Embedded Robot Vision Peter Andreas Entschev and Hugo Vieira Neto Graduate School of Electrical Engineering and Applied Computer Science Federal

More information

Cover Story SOUMYA MAITRA. photographer, photoshop, or, even the model...it s all about The Light.

Cover Story SOUMYA MAITRA. photographer, photoshop, or, even the model...it s all about The Light. Cover Story SOUMYA MAITRA IIt s t nott th the camera, iit s t nott th the llens, it it s nott th the photographer, photoshop, or, even the model...it s all about The Light. I N today s digital world, most

More information

Cooperative Transportation by Humanoid Robots Learning to Correct Positioning

Cooperative Transportation by Humanoid Robots Learning to Correct Positioning Cooperative Transportation by Humanoid Robots Learning to Correct Positioning Yutaka Inoue, Takahiro Tohge, Hitoshi Iba Department of Frontier Informatics, Graduate School of Frontier Sciences, The University

More information

Adaptive Optimum Notch Filter for Periodic Noise Reduction in Digital Images

Adaptive Optimum Notch Filter for Periodic Noise Reduction in Digital Images Adaptive Optimum Notch Filter for Periodic Noise Reduction in Digital Images Payman Moallem i * and Majid Behnampour ii ABSTRACT Periodic noises are unwished and spurious signals that create repetitive

More information

Instruction Manual for HyperScan Spectrometer

Instruction Manual for HyperScan Spectrometer August 2006 Version 1.1 Table of Contents Section Page 1 Hardware... 1 2 Mounting Procedure... 2 3 CCD Alignment... 6 4 Software... 7 5 Wiring Diagram... 19 1 HARDWARE While it is not necessary to have

More information

Development of Gaze Detection Technology toward Driver's State Estimation

Development of Gaze Detection Technology toward Driver's State Estimation Development of Gaze Detection Technology toward Driver's State Estimation Naoyuki OKADA Akira SUGIE Itsuki HAMAUE Minoru FUJIOKA Susumu YAMAMOTO Abstract In recent years, the development of advanced safety

More information

Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot

Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Autonomous Stair Climbing Algorithm for a Small Four-Tracked Robot Quy-Hung Vu, Byeong-Sang Kim, Jae-Bok Song Korea University 1 Anam-dong, Seongbuk-gu, Seoul, Korea vuquyhungbk@yahoo.com, lovidia@korea.ac.kr,

More information

MEASUREMENT OF ROUGHNESS USING IMAGE PROCESSING. J. Ondra Department of Mechanical Technology Military Academy Brno, Brno, Czech Republic

MEASUREMENT OF ROUGHNESS USING IMAGE PROCESSING. J. Ondra Department of Mechanical Technology Military Academy Brno, Brno, Czech Republic MEASUREMENT OF ROUGHNESS USING IMAGE PROCESSING J. Ondra Department of Mechanical Technology Military Academy Brno, 612 00 Brno, Czech Republic Abstract: A surface roughness measurement technique, based

More information

Miniature Effect With Tilt-Shift In Photoshop CS6

Miniature Effect With Tilt-Shift In Photoshop CS6 Miniature Effect With Tilt-Shift In Photoshop CS6 This effect works best with a photo taken from high overhead and looking down on your subject at an angle. You ll also want a photo where everything is

More information

CS-2000/2000A. Spectroradiometer NEW

CS-2000/2000A. Spectroradiometer NEW Spectroradiometer NEW CS-000/000A The world's top-level capability spectroradiometers make further advances with addition of second model to lineup. World's top level capability to detect extremely low

More information

Haptic control in a virtual environment

Haptic control in a virtual environment Haptic control in a virtual environment Gerard de Ruig (0555781) Lourens Visscher (0554498) Lydia van Well (0566644) September 10, 2010 Introduction With modern technological advancements it is entirely

More information

Eye Tracking Computer Control-A Review

Eye Tracking Computer Control-A Review Eye Tracking Computer Control-A Review NAGESH R 1 UG Student, Department of ECE, RV COLLEGE OF ENGINEERING,BANGALORE, Karnataka, India -------------------------------------------------------------------

More information

RESNA Gaze Tracking System for Enhanced Human-Computer Interaction

RESNA Gaze Tracking System for Enhanced Human-Computer Interaction RESNA Gaze Tracking System for Enhanced Human-Computer Interaction Journal: Manuscript ID: Submission Type: Topic Area: RESNA 2008 Annual Conference RESNA-SDC-063-2008 Student Design Competition Computer

More information

A novel click-free interaction technique for large-screen interfaces

A novel click-free interaction technique for large-screen interfaces A novel click-free interaction technique for large-screen interfaces Takaomi Hisamatsu, Buntarou Shizuki, Shin Takahashi, Jiro Tanaka Department of Computer Science Graduate School of Systems and Information

More information

Audacity 5EBI Manual

Audacity 5EBI Manual Audacity 5EBI Manual (February 2018 How to use this manual? This manual is designed to be used following a hands-on practice procedure. However, you must read it at least once through in its entirety before

More information

Utilize Eye Tracking Technique to Control Devices for ALS Patients

Utilize Eye Tracking Technique to Control Devices for ALS Patients Utilize Eye Tracking Technique to Control Devices for ALS Patients Eng. Sh. Hasan Al Saeed 1, Eng. Hasan Nooh 2, Eng. Mohamed Adel 3, Dr. Abdulla Rabeea 4, Mohamed Sadiq 5 Mr. University of Bahrain, Bahrain

More information

Contents Technical background II. RUMBA technical specifications III. Hardware connection IV. Set-up of the instrument Laboratory set-up

Contents Technical background II. RUMBA technical specifications III. Hardware connection IV. Set-up of the instrument Laboratory set-up RUMBA User Manual Contents I. Technical background... 3 II. RUMBA technical specifications... 3 III. Hardware connection... 3 IV. Set-up of the instrument... 4 1. Laboratory set-up... 4 2. In-vivo set-up...

More information

Image Processing Based Vehicle Detection And Tracking System

Image Processing Based Vehicle Detection And Tracking System Image Processing Based Vehicle Detection And Tracking System Poonam A. Kandalkar 1, Gajanan P. Dhok 2 ME, Scholar, Electronics and Telecommunication Engineering, Sipna College of Engineering and Technology,

More information

Noise Characteristics of a High Dynamic Range Camera with Four-Chip Optical System

Noise Characteristics of a High Dynamic Range Camera with Four-Chip Optical System Journal of Electrical Engineering 6 (2018) 61-69 doi: 10.17265/2328-2223/2018.02.001 D DAVID PUBLISHING Noise Characteristics of a High Dynamic Range Camera with Four-Chip Optical System Takayuki YAMASHITA

More information

An EOG based Human Computer Interface System for Online Control. Carlos A. Vinhais, Fábio A. Santos, Joaquim F. Oliveira

An EOG based Human Computer Interface System for Online Control. Carlos A. Vinhais, Fábio A. Santos, Joaquim F. Oliveira An EOG based Human Computer Interface System for Online Control Carlos A. Vinhais, Fábio A. Santos, Joaquim F. Oliveira Departamento de Física, ISEP Instituto Superior de Engenharia do Porto Rua Dr. António

More information

Characterization Microscope Nikon LV150

Characterization Microscope Nikon LV150 Characterization Microscope Nikon LV150 Figure 1: Microscope Nikon LV150 Introduction This upright optical microscope is designed for investigating up to 150 mm (6 inch) semiconductor wafers but can also

More information

Automatic Iris Segmentation Using Active Near Infra Red Lighting

Automatic Iris Segmentation Using Active Near Infra Red Lighting Automatic Iris Segmentation Using Active Near Infra Red Lighting Carlos H. Morimoto Thiago T. Santos Adriano S. Muniz Departamento de Ciência da Computação - IME/USP Rua do Matão, 1010, São Paulo, SP,

More information

A SURVEY ON GESTURE RECOGNITION TECHNOLOGY

A SURVEY ON GESTURE RECOGNITION TECHNOLOGY A SURVEY ON GESTURE RECOGNITION TECHNOLOGY Deeba Kazim 1, Mohd Faisal 2 1 MCA Student, Integral University, Lucknow (India) 2 Assistant Professor, Integral University, Lucknow (india) ABSTRACT Gesture

More information

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc.

Human Vision and Human-Computer Interaction. Much content from Jeff Johnson, UI Wizards, Inc. Human Vision and Human-Computer Interaction Much content from Jeff Johnson, UI Wizards, Inc. are these guidelines grounded in perceptual psychology and how can we apply them intelligently? Mach bands:

More information

Use of Photogrammetry for Sensor Location and Orientation

Use of Photogrammetry for Sensor Location and Orientation Use of Photogrammetry for Sensor Location and Orientation Michael J. Dillon and Richard W. Bono, The Modal Shop, Inc., Cincinnati, Ohio David L. Brown, University of Cincinnati, Cincinnati, Ohio In this

More information

An Autonomous Vehicle Navigation System using Panoramic Machine Vision Techniques

An Autonomous Vehicle Navigation System using Panoramic Machine Vision Techniques An Autonomous Vehicle Navigation System using Panoramic Machine Vision Techniques Kevin Rushant, Department of Computer Science, University of Sheffield, GB. email: krusha@dcs.shef.ac.uk Libor Spacek,

More information

Colour correction for panoramic imaging

Colour correction for panoramic imaging Colour correction for panoramic imaging Gui Yun Tian Duke Gledhill Dave Taylor The University of Huddersfield David Clarke Rotography Ltd Abstract: This paper reports the problem of colour distortion in

More information

FOCAL LENGTH CHANGE COMPENSATION FOR MONOCULAR SLAM

FOCAL LENGTH CHANGE COMPENSATION FOR MONOCULAR SLAM FOCAL LENGTH CHANGE COMPENSATION FOR MONOCULAR SLAM Takafumi Taketomi Nara Institute of Science and Technology, Japan Janne Heikkilä University of Oulu, Finland ABSTRACT In this paper, we propose a method

More information

Pixel v POTUS. 1

Pixel v POTUS. 1 Pixel v POTUS Of all the unusual and contentious artifacts in the online document published by the White House, claimed to be an image of the President Obama s birth certificate 1, perhaps the simplest

More information

MOBAJES: Multi-user Gesture Interaction System with Wearable Mobile Device

MOBAJES: Multi-user Gesture Interaction System with Wearable Mobile Device MOBAJES: Multi-user Gesture Interaction System with Wearable Mobile Device Enkhbat Davaasuren and Jiro Tanaka 1-1-1 Tennodai, Tsukuba, Ibaraki 305-8577 Japan {enkhee,jiro}@iplab.cs.tsukuba.ac.jp Abstract.

More information

In the following sections, if you are using a Mac, then in the instructions below, replace the words Ctrl Key with the Command (Cmd) Key.

In the following sections, if you are using a Mac, then in the instructions below, replace the words Ctrl Key with the Command (Cmd) Key. Mac Vs PC In the following sections, if you are using a Mac, then in the instructions below, replace the words Ctrl Key with the Command (Cmd) Key. Zoom in, Zoom Out and Pan You can use the magnifying

More information

CB Database: A change blindness database for objects in natural indoor scenes

CB Database: A change blindness database for objects in natural indoor scenes DOI 10.3758/s13428-015-0640-x CB Database: A change blindness database for objects in natural indoor scenes Preeti Sareen 1,2 & Krista A. Ehinger 1 & Jeremy M. Wolfe 1 # Psychonomic Society, Inc. 2015

More information

Operation Manual Full-HD Miniature POV Camera

Operation Manual Full-HD Miniature POV Camera Operation Manual Full-HD Miniature POV Camera CV502-WPM/WPMB CV502-M/MB CV225-M/MB CV505-M/MB, CV565-MGB CV343-CS/CSB, CV345-CS/CSB, CV365-CGB STRUCTURE SETUP WB CTROL SUB DC IRIS ATW PUSH SUB BRIGHTNESS

More information

ScienceDirect. Improvement of the Measurement Accuracy and Speed of Pupil Dilation as an Indicator of Comprehension

ScienceDirect. Improvement of the Measurement Accuracy and Speed of Pupil Dilation as an Indicator of Comprehension Available online at www.sciencedirect.com ScienceDirect Procedia Computer Science 35 (2014 ) 1202 1209 18th International Conference in Knowledge Based and Intelligent Information and Engineering Systems

More information

522 Int'l Conf. Artificial Intelligence ICAI'15

522 Int'l Conf. Artificial Intelligence ICAI'15 522 Int'l Conf. Artificial Intelligence ICAI'15 Verification of a Seat Occupancy/Vacancy Detection Method Using High-Resolution Infrared Sensors and the Application to the Intelligent Lighting System Daichi

More information

Superfast phase-shifting method for 3-D shape measurement

Superfast phase-shifting method for 3-D shape measurement Superfast phase-shifting method for 3-D shape measurement Song Zhang 1,, Daniel Van Der Weide 2, and James Oliver 1 1 Department of Mechanical Engineering, Iowa State University, Ames, IA 50011, USA 2

More information

Kigamo Scanback which fits in your view camera in place of conventional film.

Kigamo Scanback which fits in your view camera in place of conventional film. What's included Kigamo Scanback which fits in your view camera in place of conventional film. SCSI Cable to connect your Scanback to the host computer. A 3-meter SCSI cable is standard. Kigamo also has

More information