Long Range Iris Acquisition System for Stationary and Mobile Subjects

Shreyas Venugopalan 1,2, Unni Prasad 1,2, Khalid Harun 1,2, Kyle Neblett 1,2, Douglas Toomey 3, Joseph Heyman 1,2 and Marios Savvides 1,2

1 Department of Electrical and Computer Engineering (ECE), Carnegie Mellon University, Pittsburgh, USA
2 CyLab Biometrics Center, Carnegie Mellon University, Pittsburgh, PA 15213, USA
3 Mauna Kea Infra-Red, Hilo, HI 96720, USA; toomey@mkir.com

Abstract

Most iris-based biometric systems require considerable cooperation from users so that iris images of acceptable quality may be acquired; features from these images may then be used for recognition. Relatively few works in the literature address less cooperative iris acquisition systems that reduce the constraints placed on users. In this paper, we describe our ongoing work in designing and developing such a system. It is capable of capturing iris images at distances of up to 8 meters with a resolution of 200 pixels across the iris diameter. If the resolution requirement is relaxed to 150 pixels, the same system may be used to capture images from up to 12 meters. We have incorporated velocity estimation and focus tracking modules so that images may also be acquired from subjects on the move. We describe the various components that make up the system, including the lenses used, the imaging sensor, our auto-focus function, and the velocity estimation module. All hardware components are Commercial Off The Shelf (COTS), with little or no modification. We also present preliminary iris acquisition results using our system for both stationary and mobile subjects.

1. Introduction

Iris-pattern-based biometric systems for surveillance and access control have gained considerable popularity in recent years. Several such systems are deployed at high-security checkpoints across the globe, the primary reason being the uniqueness of iridal patterns across people. Over the years, many works in the literature have proposed feature extraction and comparison schemes for this biometric and have reported very high recognition rates (e.g., see [15], [19]). In all iris-based biometric systems, a major concern is the acquisition of well-focused eye images in which the relevant iris features are discernible. The acquisition process often requires significant cooperation from the subject whose eye is being imaged: one has to stand at a pre-defined location, at a pre-defined distance from the camera, and sufficient near-infrared (IR) illumination must be provided for acquisition. The need for such cooperation is due to the limited capture volume of the system, i.e., the volume of space in front of the image acquisition system within which the user's iris can be imaged at acceptable quality. Once the user's iris is within this volume, the user typically remains in that position with limited motion until the system acquires a good-quality image. An example of a widely used commercial iris acquisition device is the LG IrisAccess4000 [8], which uses voice prompts to direct the user into the capture volume. In general, with such systems, the positioning process can seem unintuitive to some users and can result in failures to acquire.

Other systems have been proposed that involve less cooperation from users. A good example is the Iris-On-the-Move (IOM) system developed by Sarnoff Corporation [18], in which iris patterns are captured while users walk through a portal fitted with near-IR illumination panels.
The subject stand-off required by the IOM system is 3 meters. It is a fixed-focus system with a reported depth of field of only 5 cm. Compared to traditional desktop/wall-mounted systems (such as those marketed by Panasonic [10], LG [8] and others), it has the advantage of an increased stand-off distance and a reduced level of cooperation from the subject. However, acquisition fails if the user's iris does not pass through the fixed, small capture volume. The work also proposes a modular approach to increase the height of the capture volume: multiple cameras are stacked one above the other so that the iris can be captured irrespective of the user's height, but the extra hardware and custom lenses increase the cost of the system. Another category of acquisition systems uses pan-tilt-zoom cameras, which alleviate the constraint of a fixed capture volume. Early attempts at using such cameras are reported for the Oki IrisPass-M [9], the Sensar R1 [17], and by Wheeler et al. [22].

These systems are based on the use of multiple cameras: a wide-angle scene camera detects the face/eye in the scene, while a second camera with a high-magnification lens is aimed specifically at resolving the fine patterns in the iris. Depth estimation is performed using a stereo-camera setup; this information helps estimate the position of the user in 3D space and hence the focus and pan/tilt settings. Venugopalan and Savvides [21] use a commercial off-the-shelf (COTS) pan-tilt-zoom camera to track the faces of subjects and to acquire irises when the subject is still. The subjects' irises may be acquired from stand-off distances of up to 1.5 meters, and a single-camera setup acquires both the face and the iris from subjects of different heights. The Retica Eagle Eye system [13] uses a scene camera, a face camera, and two iris cameras, which account for its large form factor. Its capture volume is larger than those of the systems described so far, yielding a 3 x 2 x 3 m capture volume with increased stand-off (5 m on average). AOptix [7] described a system that performs iris recognition at a stand-off of 2 meters; it can enroll as well as verify users whose heights vary between 0.9 and 1.9 meters, using adaptive optics with a multi-stage, real-time closed-loop control system to find the subject within a capture volume of 1 meter depth.

In this paper we propose a novel long-range iris acquisition system designed and developed to acquire pristine-quality iris images from stand-off distances of up to 12 meters, from stationary or mobile subjects. The range of the system is longer than those proposed in the literature so far. In addition, since our system uses Commercial Off The Shelf (COTS) equipment, its cost is much lower than that of similar systems employing several custom-made devices. Another advantage of our system is the co-location of the illumination panel and the imaging sensor, which allows the setup to be deployed quickly to any location without any concern about illumination on the subject's side. The pixel resolution of the acquired iris is consistently greater than 150 pixels across the diameter, and the system acquires both the face and the iris using a single camera. In order to track mobile subjects, we have incorporated an area-scan camera in addition to the high-resolution biometric imager. Its function is to track the subject during his/her motion and keep him/her centered in the frame of the high-resolution imager (which captures both the face and the iris at the required resolution). In this way, the iridal patterns of the person may be compared against an existing database to determine whether he/she poses a possible security threat. For mobile subjects, our current experiments are performed at walking speeds of under 0.6 m/s.

In the following sections we describe the various components that make up our system, present results from our acquisition experiments, and conclude by outlining current work to enhance the performance of the system.

2. Proposed System

In this section we briefly describe the important hardware and software components that make up our system. The hardware modules comprise the high-magnification telephoto lens, the imaging sensor, and the auto-focus module.
The software components consist of focus estimation, subject speed estimation, and the integration of focal length adjustment with image acquisition.

2.1. Telephoto Lens and Imaging Sensor

First, we briefly describe the sensor and the choice of optics for the system. The requirement of our system is the acquisition of both the face and the iris of the subject of interest with sufficient resolution.

2.1.1. Telephoto Lens

The required focal length of the lens is calculated from the magnification needed so that, for a given subject stand-off distance from the lens, we obtain a reasonable resolution across the iris. The typical human iris has a diameter of 12 mm [16]. Assuming we need approximately 200 pixels across the iris for the recognition module, as suggested in [12], the required magnification may be determined as shown below.

Typical diameter of the human iris: h_o = 12 mm
Image size required: h_i = 200 pixels = 200 x (side of a sensor pixel) = 200 x 0.00645 mm ≈ 1.29 mm

The side of a sensor pixel is obtained from the specifications of the camera we use (see Section 2.1.2). Hence the magnification required is at least

    M = h_i / h_o = 1.29 / 12 = 0.1075                                  (1)

Let f be the focal length of the lens and D the distance of the object (here, the subject of interest) from the front of the lens. The required f is then determined as

    f = M D / (1 + M) = (0.1075 x 8000 mm) / 1.1075 ≈ 776 mm            (2)

where we have taken D = 8 m as the subject stand-off distance from the system. In our system we use the Canon 800 mm lens [3], which satisfies this focal length requirement.
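To make eqns (1) and (2) concrete, the short Python sketch below computes the required magnification and focal length for a chosen stand-off distance and pixel requirement. The 6.45 µm pixel pitch is simply the value implied by the paper's own numbers (1.29 mm / 200 pixels), and the function and variable names are ours for illustration; they are not part of the described system.

```python
# Sketch of the lens selection calculation of Section 2.1.1 (eqns (1) and (2)).
# Values follow the paper: 12 mm iris, 200-pixel requirement, ~6.45 um pixel pitch.

IRIS_DIAMETER_MM = 12.0       # typical human iris diameter, h_o
PIXEL_PITCH_MM = 0.00645      # side of one sensor pixel, as implied by 1.29 mm / 200 px

def required_focal_length_mm(standoff_m, pixels_across_iris=200):
    """Focal length needed to image the iris at the given pixel resolution."""
    h_i = pixels_across_iris * PIXEL_PITCH_MM       # required image size on the sensor, mm
    magnification = h_i / IRIS_DIAMETER_MM          # eqn (1)
    distance_mm = standoff_m * 1000.0
    return magnification * distance_mm / (1.0 + magnification)   # eqn (2)

if __name__ == "__main__":
    for d in (4, 6, 8, 12):
        print(f"stand-off {d:2d} m -> f = {required_focal_length_mm(d):7.1f} mm")
    # At 8 m the result is ~776 mm, which motivates the choice of the 800 mm lens.
```

Evaluating the same expression over a range of stand-off distances reproduces the trend shown in Figure 1.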

Figure 1: Variation in the focal length required to obtain 200 pixels across the acquired iris, as a function of subject stand-off distance.

Figure 2: Variation in iris resolution (in pixels) with subject stand-off distance when an 800 mm lens is used for imaging.

Note that for values of D greater than 8 m, a longer focal length would be required, necessitating focal-length extenders such as those in [2]. However, the use of such extenders is known to increase aberrations during image acquisition; hence, for the purposes of this paper, we perform our imaging experiments without them. Figure 1 shows the variation in focal length with subject stand-off distance needed to obtain 200 pixels across the iris, using eqn (2). Figure 2 shows the pixel resolution obtained at various distances using the 800 mm lens.

2.1.2. High Resolution Sensor

For the purposes of this paper, we chose a camera with a very high pixel count, the Canon 5D Mark II [4]. A high pixel count ensures that both the face and the iris may be acquired with the required resolution from a single captured frame. This camera has a full-frame sensor with dimensions 36 mm x 24 mm. As shown in the previous section, we use a focal length of 800 mm for our imaging requirements. Rearranging the terms in eqn (2), for f = 800 mm and a stand-off distance of D = 8 m the magnification is

    M = f / (D - f) = 0.11                                              (3)

From this value, the field of view at 8 m for the given sensor size is 32.4 cm x 21.6 cm. At D = 4 m, however, the field of view is 14.4 cm x 9.6 cm. Because of this decrease in the longer dimension of the field of view, at distances closer than 4 m it becomes difficult to accommodate the entire face within a frame. Our aim in this paper, as mentioned previously, is to acquire enrollment-quality iris images (with a resolution of 200 pixels across the iris) using this device. Due to this constraint, the constraints on the focal length used (Section 2.1.1), and the diminishing field of view at shorter stand-off distances, we have limited our capture range to 4 m - 8 m. If we relax the pixel resolution required across the iris, then images may be captured at distances of 4 m - 12 m using our current setup (see Figure 2). The camera is operated in portrait mode so that the longer dimension of the field of view corresponds to the length of the face.

2.2. Auto-focus Module

In order to ensure good focus across all acquired images, we have designed and developed a library of auto-focus routines for use with the system. The Canon Application Programming Interface (API) [5] does not provide easy control over the auto-focus hardware within the camera body. As a result, we opted to modify an existing RS-232 lens control module manufactured by Birger Engineering [1], which gives us control over the lens independent of the camera body. The auto-focus routine developed for this paper includes a focus measure based on spatial gradients: the focus measure over a frame is the mean of the magnitude of the two-dimensional gradient at every pixel position within the frame. If f_measure denotes the focus measure over a frame I of dimension m x n, then

    f_measure = (1 / mn) * sum_{i=1..m} sum_{j=1..n} |∇I_{i,j}|          (4)

where

    |∇I_{i,j}| = sqrt( (δI_{i,j}/δx)^2 + (δI_{i,j}/δy)^2 )               (5)

and I_{i,j} is the pixel intensity at (i, j).
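The gradient-based focus measure of eqns (4)-(5) is straightforward to implement. The NumPy sketch below is one possible realization; it assumes simple finite differences for the spatial gradients, since the paper does not specify the discretization, and the function name is ours.

```python
import numpy as np

def focus_measure(frame: np.ndarray) -> float:
    """Mean gradient magnitude over an m x n grayscale frame (eqns (4)-(5)).

    Uses finite differences for the spatial derivatives; the paper does not
    state which gradient operator its auto-focus library uses.
    """
    frame = frame.astype(np.float64)
    gy, gx = np.gradient(frame)                 # dI/dy and dI/dx at every pixel
    magnitude = np.sqrt(gx ** 2 + gy ** 2)      # eqn (5)
    return float(magnitude.mean())              # eqn (4): average over the frame

# An autofocus search could step the lens through candidate focus encoder
# positions and keep the position that maximizes focus_measure(frame).
```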

2.3. Speed Estimation for Mobile Subjects

In the case of a mobile subject, we estimate his/her speed as he/she walks towards the system, assuming a constant-velocity model for the purposes of this paper. Upon initialization, the focus position of the system is set to a checkpoint A at a distance D_A from the system. As the subject approaches the system and crosses A, the system starts a timer and moves the focus position to a checkpoint B at a distance D_B (see Figure 3). Once the subject crosses this position as well, the constant-velocity assumption lets us estimate the speed and predict the subject's position at any point in time. We use this information to track focus on the subject over time.

Figure 3: Experimentally determined relationship between the focus encoder steps of the Birger Engineering module we use and the stand-off distance of the subject from the lens.

In order to perform focus tracking effectively, the position of the subject has to be mapped onto a stepper-motor focus encoder value within the Birger mount [1]. Figure 3 shows the experimentally determined graph of distance from the system (in meters) versus encoder steps. From this graph, we observe that for small variations in distance, the relationship between distance and focus encoder position can be assumed to be linear. Since we are interested in fine focus adjustment for motion around position C (see Figure 3), we use a linear mapping between distance and encoder steps in our experiments.

Once the speed has been estimated, we move the focus position to the location C at which an iris image with the desired resolution may be obtained. Given the required resolution in pixels, the image size h_i may be estimated as in eqn (1). Since we use a lens with focal length f = 800 mm, the maximum distance D_C from the system at which an iris of this resolution can be acquired follows from eqns (1) and (2) as

    D_C = f (h_o + h_i) / h_i                                           (6)

Once this is set, we capture a set of images continuously while the subject moves past C. This burst capture mode includes the fine focus tracking procedure mentioned above. Along with the estimated subject speed, this procedure also requires knowledge of the minimum time interval between two consecutive shutter activations of the imaging device. The Canon 5D Mark II shoots at 5 frames per second in burst mode; in other words, the minimum interval between two shutter activations, during which no images can be acquired, is Δt = 0.2 seconds. Hence the subject's distance from the system at each consecutive frame may be predicted as

    D_{i+1} = D_i - ((D_A - D_B) / (t_B - t_A)) Δt,   i = 1, 2, ..., N   (7)

where D_1 = D_C and the fraction is the estimated walking speed. The linear mapping mentioned earlier enables us to convert each predicted distance into an equivalent focus encoder step at each consecutive frame.

In addition to the system components detailed above, another essential component is the face tracker, which keeps the subject within the frame of the Canon 5D Mark II. A wide-angle scene camera tracks the subject during the entire acquisition process, from location A until the multiple high-resolution images are captured. Tracking is achieved by means of a standard Kalman filter implementation, as found in works such as [20]. The face tracker directs the motion of a pan/tilt mechanism integrated with the system.
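The burst-mode focus schedule of eqns (6)-(7) and the locally linear distance-to-encoder mapping can be sketched as below. The function names and, in particular, the encoder calibration constants are illustrative placeholders; the real mapping comes from the experimentally measured curve of Figure 3.

```python
# Sketch of the burst-mode focus schedule of Section 2.3 (eqns (6) and (7)).
# Encoder calibration numbers are illustrative placeholders, not measured values.

F_MM = 800.0                 # lens focal length
H_O_MM = 12.0                # iris diameter
PIXEL_PITCH_MM = 0.00645     # implied sensor pixel pitch
FRAME_INTERVAL_S = 0.2       # Canon 5D Mark II burst mode: 5 frames per second

def capture_distance_mm(pixels_across_iris=200):
    """Maximum stand-off D_C at which the iris meets the resolution target (eqn (6))."""
    h_i = pixels_across_iris * PIXEL_PITCH_MM
    return F_MM * (H_O_MM + h_i) / h_i

def focus_schedule(d_a_mm, d_b_mm, t_a_s, t_b_s, n_frames=10, pixels_across_iris=200):
    """Predicted subject distance for each burst frame (eqn (7)).

    The speed is estimated from the crossing times of checkpoints A and B
    under the constant-velocity assumption; D_1 = D_C.
    """
    speed = (d_a_mm - d_b_mm) / (t_b_s - t_a_s)     # walking speed toward the system
    d = capture_distance_mm(pixels_across_iris)     # D_1 = D_C
    distances = []
    for _ in range(n_frames):
        distances.append(d)
        d -= speed * FRAME_INTERVAL_S               # eqn (7)
    return distances

def encoder_steps(distance_mm, slope=-0.12, offset=2400.0):
    """Locally linear distance-to-encoder mapping around C (placeholder calibration)."""
    return int(round(offset + slope * distance_mm))
```

With ten frames at 0.2 s spacing, the burst spans just under two seconds, consistent with the acquisition time reported in Section 3.2.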
2.4. Infra-Red Illumination Panel

In most irises, much of the incident light in the visible spectrum is absorbed, and some of it is reflected off the cornea. However, most of the infra-red light incident on the iris is reflected back and can be imaged by the sensor. Boyce et al. [14] have shown experimentally the validity of this argument and conclude that imaging the iris at infra-red wavelengths provides most of the discriminating information for an iris recognition system. We use a standard infra-red LED source, such as the one in [6], for our experiments. Four panels of these low-power LEDs are aligned so that the individual beams superimpose over the required acquisition range, providing sufficient illumination to image the necessary discriminating information within the iris texture. The system is also fitted with a filter wheel to switch between the visible and infra-red spectra as required. For the visible-blocking filter, we use an 850 nm bandpass filter since, in our experiments, most of the iris texture is clearly discernible when imaged around this wavelength. Figure 5 shows the system that was built using the components described.

3. Acquisition Results

In this section we present high-resolution face and eye images captured using the system described in the previous section, and we compare the captured eye images to those captured using conventional iris acquisition systems.

Figure 4: As the subject approaches our system and crosses the variable checkpoints A and B, we estimate his/her speed. Once this is done, the focus position is set to a position C at which we are assured of obtaining an iris of the required resolution (eqn (6)). A number of in-focus images are then acquired by changing the focus continuously based on the subject distances estimated using eqn (7).

Figure 5: The system that was designed to capture images of static or mobile subjects from a large stand-off distance.

3.1. Iris Acquisition from Stationary Subjects

Our first set of experiments with this system involved a simple focus estimation stage and the acquisition of eye images while a subject stands at an arbitrary location in front of the system. During this experiment, the subject is tracked and, when he/she is still, the auto-focus module uses the focus measure outlined in Section 2.2 to set the focus position of the lens to an appropriate value. Figure 6 shows an example of a face captured during this experiment, and Figure 7 shows a few eye images cropped out of such faces at different distances from the camera. We compare the acquired images with images acquired using standard iris acquisition systems in Figure 9. Using the proposed system, one can achieve enrollment-quality iris images from a much greater stand-off distance than with conventional systems.

3.2. Iris Acquisition from Mobile Subjects

Here we show images acquired when the focus position of the lens was tracked automatically based on the estimated subject speed (Section 2.3). Figure 8 shows iris images captured from a subject moving at speeds between 0.3 m/s and 0.6 m/s. From the experiments conducted, we decided that a sufficient number of images to acquire as the subject passes through location C is 10. Of these 10 images, an average of 4 is in focus; the remainder have erroneous focus settings due to minor variations in the subject's speed and to motion blur, which is to be expected in a real-world setting. The requirement of 10 images is not a drawback, since the entire acquisition process while the subject passes through C takes less than two seconds.
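Since only about 4 of the 10 burst frames are usable, the in-focus frames have to be identified before any further processing. The paper does not describe how this selection is performed; the sketch below shows one straightforward possibility under that caveat, reusing the gradient focus measure of Section 2.2 to rank the burst and keep the sharpest frames.

```python
# Hypothetical post-processing step: rank a burst of frames by the gradient
# focus measure (eqns (4)-(5)) and keep the sharpest ones. The paper reports
# ~4 of 10 burst frames in focus but does not specify the selection mechanism.
import numpy as np

def select_sharpest(frames, keep=4):
    """Return the `keep` frames with the highest focus measure."""
    def focus_measure(frame):
        gy, gx = np.gradient(frame.astype(np.float64))
        return float(np.sqrt(gx ** 2 + gy ** 2).mean())
    return sorted(frames, key=focus_measure, reverse=True)[:keep]
```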

Our current work involves measures to decrease the exposure time during image capture, so that motion blur may be minimized, and to better tune the focus tracking to the subject's motion. This will help reduce the number of burst images that need to be acquired.

Figure 6: An example of a face captured during our experiments at stand-off distances between 4 and 8 meters. This face was captured from a distance of 7 meters. Once the face is acquired, we detect the eye regions and crop them out. Examples of such cropped eye regions with the iris pattern visible are shown in Figure 7.

Figure 7: (a) shows an iris image captured from a subject standing still at a distance of 6 meters from our system, and (b) shows an image of the same subject at 7 meters.

4. Conclusion

In this paper, we have outlined the design and development of a system to acquire high-resolution face and iris images from mobile (or stationary) subjects using a single imaging sensor. For mobile subjects, we have included an optional sensor fitted with a wide-angle lens to track their movements during the acquisition process. As the subject moves towards the system, a Kalman-filter-based face tracking component moves a pan/tilt mechanism so that the face is always centered in the frame of the high-resolution sensor. During the course of the subject's motion, we estimate his/her speed using a real-time (lower-resolution) feed from the high-resolution sensor. Using this estimate, the focus of the system is finely tuned to the subject's motion, and a set of high-resolution images is captured as he/she passes through a desired location. This desired location (location C in Figure 3) is determined by the pixel resolution required across the acquired iris (see Figure 2). A sample of images acquired from both stationary and mobile subjects can be seen in Figure 7 and Figure 8.

Our current work involves developing a library of iris segmentation and feature extraction functions for use with this system. By imposing a 200-pixel requirement on iris images for enrollment, the subject can be at a maximum stand-off of 8 meters (Figure 2). However, we feel that the pixel requirement can be relaxed during the testing stage, in which lower-resolution iris images may be acquired and compared with the enrollment images.

In that case, theoretically, as can be seen in Figure 2, test images may be acquired at distances even beyond 10 meters. An optimal set of filters will have to be designed to consistently pick out the most discriminating features at both resolutions. We are also in the process of tuning the focus estimation module so that fewer burst images need to be acquired to obtain a clear iris from the subject. Eventually, we hope to use this system in a completely unconstrained environment to enroll and recognize subjects on the move from a large stand-off distance.

Figure 8: (a) and (b) show images from two different capture sessions as the subject walked at speeds between 0.3 and 0.6 m/s. The images were captured at a distance of approximately 6 m.

5. Acknowledgement

We thank the Biometrics Identity Management Agency (BIMA) for supporting this research through ARL grant W911NF.

References

[1] Birger Engineering. Last accessed on 8/11/2011.
[2] Canon 1.4X focal length extender. consumer/eos_slr_camera_systems/lenses/extender_ef_1_4x_ii, last accessed on 8/11/2011.
[3] Canon 800mm f/5.6 USM telephoto lens. products/cameras/ef_lens_lineup/ef_800mm_f_5_6l_is_usm, last accessed on 8/11/2011.
[4] Canon EOS 5D Mark II. usa.canon.com/cusa/consumer/products/cameras/slr_cameras/eos_5d_mark_ii, last accessed on 8/11/2011.
[5] Canon SDK. com/cusa/consumer/standard_display/sdk_homepage, last accessed on 8/11/2011.
[6] Infra-red LED, 850 nm. sparkfun.com/products/9469, last accessed on 8/11/2011.
[7] Insight iris recognition system. Last accessed on 8/11/2011.
[8] LG IrisAccess. ps/products/irisaccess4000.html.
[9] OKI IrisPass. iris/.
[10] Panasonic BM-ET330. bm-et300_demo/index.html, last accessed on 8/11/2011.
[11] PIER-T iris acquisition system. Last accessed on 8/11/2011.
[12] A. I. Iris Image Interchange Format.
[13] F. Bashir, P. Casaverde, D. Usher, and M. Friedman. Eagle-Eyes: A system for iris recognition at a distance. In IEEE Conference on Technologies for Homeland Security, May 2008.
[14] C. Boyce, A. Ross, M. Monaco, L. Hornak, and X. Li. Multispectral iris analysis: A preliminary study. In Computer Vision and Pattern Recognition Workshop (CVPRW'06), page 51, June 2006.
[15] J. Daugman. Probing the uniqueness and randomness of IrisCodes: Results from 200 billion iris pair comparisons. Proceedings of the IEEE, 94(11), Nov. 2006.
[16] J. Forrester, A. Dick, P. McMenamin, and W. Lee. The Eye: Basic Sciences in Practice. W.B. Saunders, London.
[17] G. Guo, M. Jones, and B. P. A system for automatic iris capturing. Mitsubishi Electric Research Laboratory.
[18] J. Matey, O. Naroditsky, K. Hanna, R. Kolczynski, D. LoIacono, S. Mangru, M. Tinker, T. Zappia, and W. Zhao. Iris on the Move: Acquisition of images for iris recognition in less constrained environments. Proceedings of the IEEE, 94(11), Nov. 2006.

Figure 9: Comparison of iris images captured using (a) our system from 9 m, (b) the LG system [8] from 30 cm, (c) the PIER [11] from 15 cm, and (d) the Iris on the Move system by Sarnoff [18] from 3 m.

[19] P. Phillips, K. Bowyer, P. Flynn, X. Liu, and W. Scruggs. The Iris Challenge Evaluation 2005. In 2nd IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS 2008), pages 1-8, Oct. 2008.
[20] P. Turaga, G. Singh, and P. Bora. Face tracking using Kalman filter with dynamic noise statistics. In TENCON IEEE Region 10 Conference, volume A, Nov.
[21] S. Venugopalan and M. Savvides. Unconstrained iris acquisition and recognition using a COTS PTZ camera. EURASIP J. Adv. Signal Process., 2010:38:1-38:20, Feb. 2010.
[22] F. Wheeler, A. Perera, G. Abramovich, B. Yu, and P. Tu. Stand-off iris recognition system. In 2nd IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS 2008), pages 1-7, Oct. 2008.
