Frame-Rate Pupil Detector and Gaze Tracker


C.H. Morimoto†, D. Koons, A. Amir, M. Flickner
†Dept. Ciência da Computação, IME/USP, Rua do Matão 1010, São Paulo, SP 05508, Brazil (hitoshi@ime.usp.br)
IBM Almaden Research Center, 650 Harry Road K57, San Jose, CA 95120, USA

Abstract

We present a robust, frame-rate pupil detection technique, based on an active illumination scheme, that is used for gaze estimation. The pupil detector uses two light sources synchronized with the even and odd fields of the video signal (interlaced frames) to create bright and dark pupil images. The retro-reflectivity of the eye is exploited by placing an infra-red (IR) light source close to the camera's optical axis, which results in an image with bright pupils. A similar off-axis IR source generates an image with dark pupils. Pupils are detected from the thresholded difference of the bright and dark pupil images. After a calibration procedure, the vector from the pupil center to the center of the corneal glints generated by the light sources is used to estimate the gaze position. The frame-rate gaze estimator prototype is currently being demonstrated on a docked 300 MHz IBM Thinkpad with a PCI frame grabber, using interlaced frames of resolution bits.

1 Introduction

Robust face detection and tracking will be fundamental to future human-computer interaction, and any reliable technique for detecting eyes would greatly simplify this task. The requirement for interaction imposes severe constraints on the response time of these image processing tasks, which are also known to have high computational demands. In this paper we describe a novel, robust, frame-rate pupil detection technique that is suitable for desktop and kiosk applications.

Current research on real-time face detection and tracking is model-based, i.e., it uses information about skin color [5, 7] or face geometry [1], for example. The technique described in this paper exploits a physical property of eyes (their retro-reflectivity) to segment them, using the active illumination scheme described in Section 2. Eye properties have been used before in commercial eye gaze trackers such as those available from ISCAN Incorporated, Applied Science Laboratories (ASL), and LC Technologies, but those systems use only bright or only dark pupil images for tracking.

Due to the retro-reflectivity of the eye, a bright pupil is seen by the camera when a light source is placed very close to its optical axis (Figure 1). This effect is well known as the red-eye effect in flash photographs [8]. Under regular illumination (when the light source is not on the camera's optical axis), a dark pupil is seen. The trick for robust pupil detection is to combine dark and bright pupil images: pupil candidates are detected from the thresholded difference of the dark from the bright pupil image, as seen in Figure 2.

Figure 1: Retro-reflectivity of the eye (off-axis vs. on-axis lighting of the cornea, iris, and retina). Observe that when the light source is placed off-axis (top), the camera does not capture the light returning from the eye.

The pupil detection systems presented in [6, 2] are also based on a differential lighting with thresholding scheme. These systems are used to detect and track the pupil and estimate the point of gaze, which also requires the detection of the corneal reflections created by the light sources. The corneal reflection from the light sources can be easily seen as the bright spot close to the pupils in Figures 2a and 2b.
Our system differs from these due to its simplicity and the constraint to use off-the-shelf hardware.

Figure 2: (a) Bright and (b) dark pupil images. (c) Difference of the dark from the bright pupil image after thresholding.

In a previous paper [3] we described a real-time eye and face detection system based on the differential lighting with thresholding scheme. This paper introduces several enhancements we have made to build the frame-rate (30 frames/second) pupil tracker and gaze estimator. The next section describes several issues related to the implementation of the pupil detector based on the active illumination scheme, and Section 3 presents the eye gaze tracker built on top of the pupil detector. Experimental results for both the pupil detector and the eye gaze tracker are given in Section 4. Section 5 concludes the paper and discusses future work.

2 Implementation Issues

Figure 3 shows the pan-tilt camera with the illuminators used in the eye tracking system. The pan-tilt camera is a Sony EVI-D30, and the illuminators consist of two sets of light sources. For convenience, near infra-red (IR) light with wavelength 875 nm is used, which is invisible to the human eye. The IR illumination also makes the system insensitive to changes in indoor ambient illumination, i.e., the room lights can be turned on and off without affecting the operation of the system. We slightly modified the original camera optics to operate in near IR, by removing its IR blocking filter and introducing a pair of extra lenses to increase its optical magnification.

The light sources LIGHT1 and LIGHT2 in Figure 3 are composed of sets of 7 IR LEDs each. LIGHT2 is composed of two sets of LEDs, symmetrically placed on the left and right sides of the optical axis. Symmetry around the optical axis is desired because it reduces shadow artifacts by producing more uniform illumination, but asymmetrical configurations also perform adequately. LIGHT1 is placed near the camera's optical axis, so it generates the bright pupil image (Figure 2a) when it is on, and LIGHT2 is placed off-axis to generate a dark pupil image (Figure 2b), adjusted for similar brightness in the rest of the scene.

Figure 3: Pan-tilt camera and IR illumination setup (LIGHT1, LIGHT2, and synchronization circuitry).

The video signal from the camera is composed of interlaced frames, where one frame can be decomposed into an even field and an odd field; thus a field has half the vertical resolution of a frame. Let I_t be an image frame taken at time instant t, with resolution c columns (width) by r rows (height), or c x r. I_t can be de-interlaced into E_t and O_t, where E_t is the even field composed of the even rows of I_t, and O_t is the odd field composed of the odd rows of I_t. When LIGHT1 is synchronized with the even fields and LIGHT2 with the odd fields, i.e., each illuminator stays on for just half the frame period, one interlaced frame will contain both a bright and a dark pupil image.

Figure 4 shows a block diagram of the pupil detection process. Once an interlaced frame I_t is captured, it is de-interlaced and the odd field O_t is subtracted from the even field E_t (the dark from the bright pupil image). Thresholding the difference image then creates a binary image, which is the input of a connected component labeling algorithm. Each connected component (blob) is checked for particular geometric properties, such as size and aspect ratio, and those that satisfy these constraints are output as pupils.
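As an illustration of this pipeline, the sketch below de-interlaces a grayscale frame, subtracts the dark field from the bright field, thresholds the result, and filters connected components by size and aspect ratio (Python with NumPy and OpenCV). The field-to-illuminator assignment, threshold, and geometric limits are illustrative assumptions, not the values used in the prototype.

```python
import cv2
import numpy as np

def detect_pupils(frame, thresh=40, min_area=30, max_area=1500, max_aspect=2.0):
    """Detect pupil candidates in one interlaced grayscale frame (H x W, uint8).

    Assumes LIGHT1 (bright pupil) was on during the even rows and LIGHT2
    (dark pupil) during the odd rows; all numeric limits are illustrative.
    """
    even = frame[0::2, :].astype(np.int16)      # bright-pupil field E_t
    odd = frame[1::2, :].astype(np.int16)       # dark-pupil field O_t
    rows = min(even.shape[0], odd.shape[0])
    diff = np.clip(even[:rows] - odd[:rows], 0, 255).astype(np.uint8)

    _, binary = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    n, _, stats, centroids = cv2.connectedComponentsWithStats(binary)

    pupils = []
    for i in range(1, n):                       # label 0 is the background
        x, y, w, h, area = stats[i]
        aspect = max(w, h) / max(min(w, h), 1)
        if min_area <= area <= max_area and aspect <= max_aspect:
            pupils.append(tuple(centroids[i]))  # (col, row) in field coordinates
    return pupils, diff, binary
```

Candidates returned by such a routine would then feed the glint search and gaze estimation stages described in Section 3.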

Observe that it is also possible to detect pupils between frames, by differencing the odd field of one frame with the even field of the adjacent frame (e.g., O_t and E_{t+1}), thus increasing the detection rate to 60 fields per second.

Figure 4: Pupil detection block diagram (interlaced image, even/odd field buffers, difference, binary image, connected components, geometric constraints, pupils).

Figure 5: Synchronization device block diagram (video signal, video decoder, even/odd field signals, amplifying buffers, LIGHT1 and LIGHT2).

The only piece of hardware built for the system was a very simple device that keeps the even and odd fields synchronized with LIGHT1 and LIGHT2, respectively. Figure 5 shows a block diagram of the light synchronization device. The video signal from the camera is received by a video decoder module that separates the even and odd field signals. The video decoder is a National LM1881 chip, mounted on the same board that supports the IR LEDs (see Figure 3). The field signals are fed to amplifying buffers that provide power for the IR LEDs.

3 Eye Gaze Tracking

The purpose of an eye gaze tracker is to estimate the position on the screen at which the user is fixating her/his gaze. This is accomplished by tracking the user's pupil and the corneal glint, after a brief calibration procedure that determines the mapping from pupil-tracker coordinates to user screen coordinates. Assuming a static head, an eye can only rotate in its socket, and the surface of the eye can be approximated by a sphere. Since the light sources are also fixed, the glint on the cornea can be taken as a reference point, so the vector from the glint to the center of the pupil describes the gaze direction. To estimate the screen coordinates at which the user is looking, a simple second-order polynomial transformation is used. After the calibration procedure, one simple application is to control the mouse using eye gaze, which provides an estimate of the accuracy of the system. We have obtained an accuracy of about 1 degree, which corresponds to about 1 cm on a screen viewed from 50 cm.

3.1 Calibration Procedure

The calibration procedure is very simple and brief. Nine points are arranged in a grid on the screen, and the user is asked to fixate his/her gaze on a target point, press a key, and move to the next target, until all the points have been fixated. On each fixation, the vector from the center of the glint to the center of the pupil is saved, so that 9 corresponding points are obtained. The transformation from a glint-pupil vector (x, y) to a screen coordinate (s_x, s_y) is given by:

  s_x = a_0 + a_1 x + a_2 y + a_3 x y + a_4 x^2 + a_5 y^2        (1)
  s_y = a_6 + a_7 x + a_8 y + a_9 x y + a_10 x^2 + a_11 y^2

where a_0, ..., a_11 are the coefficients of this second-order polynomial. Each corresponding point gives 2 equations from (1), thus 18 equations are produced and an over-determined linear system is obtained. The polynomial coefficients for s_x and s_y can be obtained independently, so that two simpler over-determined systems are solved, using a least squares method.
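The least-squares fit of Eq. (1) can be written compactly; the sketch below, in Python with NumPy, assumes the nine glint-pupil vectors and their screen targets have already been collected into arrays (names and shapes are illustrative).

```python
import numpy as np

def fit_gaze_mapping(gp_vectors, screen_points):
    """Fit the second-order polynomial of Eq. (1) by least squares.

    gp_vectors:    (9, 2) array of glint-pupil vectors (x, y), one per target.
    screen_points: (9, 2) array of the corresponding screen coordinates.
    Returns (ax, ay): coefficients a_0..a_5 for s_x and a_6..a_11 for s_y.
    """
    x, y = gp_vectors[:, 0], gp_vectors[:, 1]
    # One row [1, x, y, xy, x^2, y^2] of the design matrix per calibration point.
    A = np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])
    ax, *_ = np.linalg.lstsq(A, screen_points[:, 0], rcond=None)
    ay, *_ = np.linalg.lstsq(A, screen_points[:, 1], rcond=None)
    return ax, ay

def map_to_screen(ax, ay, gx, gy):
    """Map one glint-pupil vector (gx, gy) to screen coordinates (s_x, s_y)."""
    basis = np.array([1.0, gx, gy, gx * gy, gx ** 2, gy ** 2])
    return float(basis @ ax), float(basis @ ay)
```

Because both screen coordinates share the same design matrix, the two 9 x 6 over-determined systems are solved independently, exactly as described above.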
3.2 Glint-Pupil Vector

To compute the glint-pupil vector it is necessary to extract the centers of the pupil and the glint. Since the field of view is very narrow, the pupil is the biggest round blob obtained after the labeling algorithm. To estimate the center of the pupil, a window slightly larger than the pupil's bounding box is created, and the gray-scale pixels of the difference image inside this window are summed horizontally and vertically (Radon transform). The x, y center is computed as the center of mass of the horizontal and vertical projections (sums). A search procedure for very bright pixels around the pupil is used to detect the glint and compute its center of mass. Ideally all coordinates are computed with subpixel repeatability. Figure 6 shows the bright, dark, and dark pupil images, the last with two superimposed crosses that mark the computed centers of the pupil and the glint.
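A minimal sketch of this center estimate, in Python with NumPy, assuming the input is a grayscale window cut from the difference image around the pupil blob; the glint search radius, brightness threshold, and the choice of field image to search are illustrative assumptions.

```python
import numpy as np

def center_from_projections(window):
    """Subpixel center from the horizontal and vertical projections of a window.

    window: 2-D grayscale array slightly larger than the pupil's bounding box,
            taken from the difference image. Returns (cx, cy) in window coords.
    """
    w = window.astype(np.float64)
    col_sum = w.sum(axis=0)    # vertical projection, one value per column
    row_sum = w.sum(axis=1)    # horizontal projection, one value per row
    cx = (np.arange(w.shape[1]) * col_sum).sum() / max(col_sum.sum(), 1e-9)
    cy = (np.arange(w.shape[0]) * row_sum).sum() / max(row_sum.sum(), 1e-9)
    return cx, cy

def glint_center(field, pupil_xy, radius=15, bright_thresh=230):
    """Center of mass of very bright pixels near the pupil (the corneal glint)."""
    px, py = int(round(pupil_xy[0])), int(round(pupil_xy[1]))
    x0, y0 = max(px - radius, 0), max(py - radius, 0)
    patch = field[y0:py + radius, x0:px + radius]
    ys, xs = np.nonzero(patch >= bright_thresh)
    if xs.size == 0:
        return None            # no glint found in the search area
    return xs.mean() + x0, ys.mean() + y0
```

The glint-pupil vector fed to Eq. (1) is then simply the pupil center minus the glint center.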

Figure 6: (a) Bright and (b) dark pupil images. (c) Dark pupil image superimposed with two crosses marking the centers of the pupil and the glint (for monitoring and debugging purposes).

3.3 Pan-Tilt Servo Mechanism

To allow some head motion, the pupil must be kept centered in the image. The magnitude of the camera rotation that brings the pupil to the center of the image (assuming the rotation is around its principal point) depends only on the image size and the field of view (FOV) of the camera. If the center of the pupil is at pixel (x, y) (taking pixel (0, 0) to be the center of the image), and given the FOV (fov_x, fov_y) and image size W x H, the pan and tilt angles are given by:

  pan = x fov_x / W        (2)
  tilt = y fov_y / H       (3)

4 Experimental Results

The current prototype was implemented on a dual Pentium II 400 MHz machine running Windows NT4, using a commercial PCI capture card compatible with Video for Windows. The eye tracker runs at frame rate (30 frames per second), processing interlaced frames of resolution bits. We have also achieved this frame rate with an IBM 300 MHz Pentium II Thinkpad 770X running Windows NT4, using a PCI capture card installed in an IBM Dock III.

Eyeglasses and contact lenses do not change the retro-reflectivity of the eye and, unless the glasses or contacts are tinted with an IR-blocking coating, they do not inhibit detection. Figure 7 shows the bright, dark, and difference images for a person with glasses. Observe in Figure 7c that spurious false pupil candidates are generated by the specular reflections from the eyeglasses, and these reflections can also block the dark pupil response under very particular head orientations. In such cases, if the head motion must be restricted, a slight change in the orientation of the glasses is enough to reestablish detection and gaze estimation. Pupil detection using only the dark or only the bright pupil image, as done by most commercial eye trackers for gaze estimation, would produce many more spurious responses, as can be expected from images of the kind shown in Figure 7.

The retro-reflectivity of eyes is uncommon in man-made and natural objects, so pupils are generally the only objects appearing with high contrast between the two pupil images. Pupil detection is greatly facilitated by the enhanced signal-to-noise ratio, and the simple process of thresholding the difference between bright and dark pupil images is generally sufficient, as shown in Figure 2c. Our experience shows that most retro-reflectors we tested, used for example in running shoes, reflect light over a reasonably wide angle, so that they appear bright in both images and do not cause artifacts. Certain lamps, such as table lamps and ceiling lamps, have reflectors that can cause artifacts when pointed toward the camera.

The one-degree accuracy mentioned in Section 3 and the small head motion allowed by the system are comparable with commercial systems. The limitation on head motion is due to the simple motion model adopted, because the calibration changes with different head positions. We are currently working on more complex models to allow free head motion. Other applications of the frame-rate pupil detector, such as real-time face tracking [3] and enhanced human-computer interaction [4, 9], are described in other publications.

5 Conclusion

We have presented a robust frame-rate pupil detector and eye gaze tracker with high potential for use in human-computer interaction, particularly in desktop and kiosk applications. The even and odd fields of a video camera are synchronized with two IR light sources. A pupil is alternately illuminated with an on-axis IR source while even fields are being captured, and with an off-axis IR source for odd fields.
The on-axis illumination generates a bright pupil, while the off-axis illumination keeps the scene at about the same brightness but leaves the pupil dark. Detection follows from thresholding the difference between the even and odd fields. Once the pupil is detected, the corneal glint from the light sources is searched for near the center of the pupil.

Figure 7: (a) Bright and (b) dark pupil images with glasses. (c) Difference image after thresholding. The strong glints on the glasses can be avoided with a slight change in their orientation.

The eye gaze tracker uses the centers of the pupil and the glint to estimate the position on the screen at which the user is fixating his/her gaze, after a brief calibration procedure that determines the mapping from pupil-tracker coordinates to user screen coordinates. The eye gaze tracker has been successfully tested on a very large number of people and has proven to be very robust. Future extensions include generalizing the approach to a 3D model in order to allow large head motion, and enhancements to the pupil detector to increase its accuracy, including changes in the calibration procedure and mapping functions.

References

[1] S. Birchfield. An elliptical head tracker. In Proceedings of the 31st Asilomar Conference on Signals, Systems, and Computers, Pacific Grove, CA, November.

[2] Y. Ebisawa and S. Satoh. Effectiveness of pupil area detection technique using two light sources and image difference method. In A.Y.J. Szeto and R.M. Rangayan, editors, Proceedings of the 15th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA.

[3] C. Morimoto, D. Koons, A. Amir, and M. Flickner. Real-time detection of eyes and faces. In Proceedings of the 1998 Workshop on Perceptual User Interfaces, San Francisco, CA, November.

[4] C.H. Morimoto, D. Koons, A. Amir, M. Flickner, and S. Zhai. Keeping an eye for HCI. In Proceedings of the XII Brazilian Symposium on Computer Graphics and Image Processing (Sibgrapi 99), Campinas, SP, October.

[5] N. Oliver, A. Pentland, and F. Berard. LAFTER: Lips and face real time tracker. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Puerto Rico, PR, June 1997.

[6] A. Tomono, M. Iida, and Y. Kobayashi. A TV camera system which extracts feature points for non-contact eye movement detection. In Proceedings of the SPIE: Optics, Illumination, and Image Sensing for Machine Vision IV, volume 1194, pages 2-12.

[7] J. Yang and A. Waibel. A real-time face tracker. In Proceedings of the Third IEEE Workshop on Applications of Computer Vision, Sarasota, FL.

[8] L. Young and D. Sheena. Methods & designs: Survey of eye movement recording methods. Behavioral Research Methods & Instrumentation, 7(5).

[9] S. Zhai, C.H. Morimoto, and S. Ihde. Manual and gaze input cascaded (MAGIC) pointing. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, Pittsburgh, PA, May.
