Long-Range Night/Day Human Identification using Active-SWIR Imaging

Brian E. Lemoff, Robert B. Martin, Mikhail Sluch, Kristopher M. Kafka, William McCormick and Robert Ice
WVHTC Foundation, 1000 Technology Drive, Suite 1000, Fairmont, WV, USA

ABSTRACT

Positive identification of personnel from a safe distance is a long-standing need for security and defense applications. Advances in computer face recognition have made this a reliable means of identification when facial imagery of sufficient resolution is available to be matched against a database of mug shots. Long-range identification at night requires that the face be actively illuminated; for visible and NIR illumination, however, the intensity required to produce high-resolution long-range imagery typically creates an eye-safety hazard. The much higher eye-safe intensity limit of SWIR illumination makes active-SWIR imaging a promising approach to long-range night-time identification. We will describe an active-SWIR imaging system that is being developed to covertly detect, track, zoom in on, and positively identify a human target, night or day, at hundreds of meters range. The SWIR illuminator pans, tilts, and zooms with the imager to always just fill the imager field of view. The illuminator meets Class 1 eye-safety limits (safe even with magnifying optics) at the intended target, and meets Class 1M eye-safety limits (safe to the naked eye) at point-blank range. Close-up night-time facial imagery will be presented, along with experimental face recognition performance results for matching SWIR imagery collected at distance to a database of visible mug shots.

Keywords: Face Recognition, SWIR, Night Vision, Surveillance, Biometrics, Active Imaging

1. INTRODUCTION

The capability to detect and identify individuals from a great distance, night or day, without their knowledge, could have many applications for defense, law enforcement, and private security. Installation security guards could detect known bad actors as they approach or survey a facility from a distance.
Law enforcement or intelligence agents would be able to covertly monitor unlit locations 24 hours a day from a safe distance, identifying individuals and recording their actions. SWAT teams or commandos could confirm the presence of specific individuals prior to launching a targeted attack or rescue mission. Under daylight or otherwise lit conditions, it is possible today for an operator using high-power optics to manually identify people at a distance if they are familiar to the operator or appear on a short watch list of mug shots; however, automated identification at long range is not yet available. At night, or under otherwise dark conditions, no current technology produces imagery that allows long-range identification, either manual or automated. To address this capability gap, the West Virginia High Technology Consortium Foundation (WVHTCF), under a research contract from the Office of Naval Research (ONR), is developing the Tactical Imager for Night/Day Extended Range Surveillance (TINDERS), an active short-wave infrared (SWIR) imaging system that illuminates targets with an invisible and eye-safe SWIR laser beam. Goals for the project, from bright sunlight to total darkness, include: human detection and tracking at ranges up to 3 km; generating recognizable facial imagery at ranges up to 800 m; and identification through computer face recognition at ranges up to 400 m. When complete, TINDERS is intended to be a portable, easy-to-set-up system that can automatically detect, track, zoom in on, and image a moving person, and identify them through computer face recognition.

1.1 Background

There are a number of excellent long-range imaging technologies commercially available today for human surveillance applications, each with its own strengths and weaknesses; however, none appears to be a suitable solution for compact, long-range, covert, night/day human identification.
Whether the goal is computer face recognition or simply recognition by a human operator, visible-spectrum imagery will always produce the best result if conditions allow for a quality image to be obtained. Unfortunately, under nighttime or otherwise dark conditions, there is insufficient ambient illumination of the target to produce a visible image. A spotlight could be used, but this would not be covert, and the
intensity required to produce a high-quality close-up facial image at long range would be damaging to the eye. Thermal or long-wave infrared (LWIR) imagery is an excellent tool for nighttime detection of personnel; however, it does not produce recognizable facial imagery. In addition, compact thermal imagers are better suited to wide-angle imagery, as narrow-angle thermal imagery (e.g. 2 mm per pixel at 150-m range) requires very large and heavy lenses. Passive SWIR imagery is another excellent technology for day/night wide-area surveillance [1]. Even with no moon and slightly overcast conditions, there is enough ambient night-glow to produce wide-angle imagery such as that shown in Figure 1(a). Unfortunately, nighttime signal levels, even under a full moon, are too low for narrow-angle imagery, such as that needed to recognize a person at 100-m range. Because the amount of light hitting each pixel in the target image is proportional to the total amount of light coming from the target divided by the number of pixels spanned by the target, the signal level in a passive image increases as the square of the field of view. For example, narrowing the field of view of an image by a factor of 10 reduces the signal level by a factor of 100. Active near-infrared (NIR) surveillance systems are available commercially from companies such as Vumii [2]. These systems combine a long-range camera (conventional silicon CCD) with a NIR illuminator (typically around 800-nm wavelength) to produce high-quality, long-range imagery night and day. By illuminating the camera field of view with light that is invisible to the human eye, but close enough to the visible spectrum to produce familiar-looking imagery, high-quality long-range imagery like that shown in Figure 1(b) is possible.
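The field-of-view scaling argument above can be checked with a one-line calculation (a simple illustration, not part of the TINDERS software; the function name is ours):

```python
def signal_reduction_factor(fov_narrowing: float) -> float:
    """Factor by which per-pixel signal in a passive image drops when
    the field of view is narrowed by `fov_narrowing`, holding target
    radiance and collection aperture fixed. Per-pixel signal scales
    as the square of the field of view."""
    return fov_narrowing ** 2

# Narrowing the field of view by 10x reduces per-pixel signal 100x.
print(signal_reduction_factor(10))  # -> 100
```

This is why passive night-glow imagery works for wide-angle surveillance but fails for the narrow fields of view needed to resolve a face at range.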
While imagery such as this should be sufficient for human identification, the illumination power required to produce quality facial images at ranges beyond 100 m creates a possible eye-safety hazard to the target, whose face is being deliberately illuminated, and a severe eye-safety hazard (immediate and permanent damage to the retina) in close proximity to the illuminator. In addition, while virtually invisible to the human eye (at high intensity, 800 nm appears as a dull red glow), the NIR illumination is clearly visible with any night-vision goggle and most silicon-based cameras.

Figure 1. (a) Passive SWIR image at 600-m range at night with no moon and slightly overcast conditions [1]. Ambient night-glow provides sufficient illumination for wide-angle imagery, but narrow-angle imagery is not possible with this technology. (b) Active-NIR facial image at 130-m range at night, using Vumii Discoverii D. With compact optics, useful image signal levels can only be achieved at such long range by creating a severe eye-safety hazard in close proximity to the illuminator. NIR illumination is also easily seen with night vision goggles and most silicon-based cameras.

1.2 Eye-safe active-SWIR illumination

Active-SWIR imagery at wavelengths > 1400 nm, particularly in the wavelength band most commonly used by the telecommunications industry for fiber-optic communication, overcomes the two primary limitations of active-NIR imagery: it is completely invisible to night-vision goggles and humans, and the eye-safe power levels are much higher. Table 1 shows a comparison of the visibility and maximum eye-safe power levels of 4 potential illumination wavelengths. As defined in the ANSI Z136 and IEC laser eye-safety standards [3], Class 1M means that there is no hazard to the naked eye, but there is a potential hazard when magnifying optics (e.g.
binoculars or scope) are used, while Class 1 means that there is no hazard even when magnifying optics are used (up to 7X magnification). For the TINDERS application, the absolute minimum illumination spot diameter intentionally shined on a person's face is 1 meter, so Class 1 safety at that diameter is the goal. To keep the optics compact, the output aperture of the TINDERS
illuminator is limited to 5 inches. Thus, to be safe for inadvertent exposure near the illuminator, we require Class 1M safety at this beam diameter. Notice that the safe power level at >1400 nm is ~65 times higher than at 800 nm.

Table 1. Comparison of potential illumination wavelengths.

Wavelength   Human visibility   NVG visibility   Class 1 (1-m diameter spot)   Class 1M (5-inch diameter beam)
800 nm       Dull red glow      Visible          < W                           < W
980 nm       Invisible          Visible          < W                           < W
1064 nm      Invisible          Visible          < W                           < W
>1400 nm     Invisible          Invisible        < 16.7 W                      < W

Unlike thermal infrared, where facial features are determined by skin temperature and can vary widely depending upon the thermal conditions and metabolic state of the individual, active-SWIR facial imagery is repeatable, showing only features that scatter the incident light. Figure 2 shows the same individual illuminated with visible white light and with an eye-safe SWIR laser. While the skin and hair pigmentation appear quite different in the two images, the geometry of the facial features is the same. Thus, it should be possible to match a SWIR facial image against a database of visible-spectrum facial images using an appropriate computer face recognition algorithm. In addition, once a human operator becomes accustomed to the darker skin and lighter hair appearing in SWIR facial images, manual recognition of individuals based on SWIR facial images should be possible.

Figure 2. (left) Facial image of an individual illuminated with visible-spectrum white light. (right) Facial image of the same individual illuminated with an eye-safe SWIR laser operating in a wavelength band commonly used for long-distance telecommunications. Note that hair appears white and skin appears dark in the SWIR image, but the same features, with the same shapes, are present in both images.

Early in the TINDERS project, visible and SWIR facial imagery similar to that shown in Figure 2 was collected from 56 subjects.
An experiment was performed using a commercial face recognition software package, ABIS System FaceExaminer [4] from Identix (now MorphoTrust USA), in which a single SWIR facial image from each subject was matched against a database containing 1156 visible-spectrum facial images: 1 visible image from each of the 56 subjects and 1100 visible images from the FERET facial database [5]. The commercial software, which had been designed only to match visible images to other visible images, achieved a correct match for 40 out of 56 subjects, a Rank 1 success rate of 71%. This indicates the overall feasibility of using active-SWIR imaging for long-range identification.

2. SYSTEM DESCRIPTION

The TINDERS project began in spring 2009 with the design of a laboratory prototype whose purpose was to prove feasibility of the system concept. Following a successful field demonstration of this proof-of-concept prototype in summer 2010, a second-generation prototype was designed with a physical form factor and software architecture more suitable for eventual field use. This second-generation prototype was first used in a field experiment in fall 2011, and current research efforts continue to evolve this prototype.
2.1 Hardware

A conceptual illustration of the TINDERS hardware is shown in Figure 3. The TINDERS system consists of three physical units: an optical head that sits on a pan-tilt (PT) stage; an electronics box that provides power, light (through an optical fiber), and communications to the optical head; and a computer that runs the user interface, low-level camera control functions, system automation, and face recognition software.

Figure 3. Conceptual illustration of the TINDERS hardware.

The optical head includes both the SWIR illuminator optics and the imager. In the current version of the hardware, the optical head weighs roughly 30 pounds and sits in an environmentally-controlled enclosure atop a commercial pan-tilt stage. The imager and illuminator pan, tilt, and zoom together so that the illuminator beam is always just filling the imager field of view. This serves to maximize the image signal level and avoid wasted light. The illuminator light source, located in the electronics box, delivers a maximum power of 5 W to the optical head through an optical fiber in the umbilical. This light source leverages commercial technology developed for the long-distance telecommunications industry. The zoom optics in the imager are optimized for monochromatic imaging of a narrow field of view, allowing a dramatic reduction in lens complexity and weight relative to traditional zoom optics that must compensate for chromatic aberration and provide distortion-free images over the entire zoom range.

Figure 4. (left) Original TINDERS proof-of-concept prototype demonstrated at a field experiment in the summer of 2010. (right) Current TINDERS prototype as of December 2011. The current prototype has a physical form factor and software architecture that are more appropriate for eventual field use.
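The pan-tilt-zoom coupling described above keeps the illuminator beam just filling the imager field of view. The underlying geometry can be sketched with a short calculation; the 5-W delivered power and the 1-m minimum spot come from the text, while the simple divergence model, function names, and 400-m example range are illustrative assumptions:

```python
import math

def beam_diameter_m(range_m: float, divergence_rad: float) -> float:
    """Full beam diameter at a given range for a beam with the given
    full-angle divergence (simple geometric model, no diffraction)."""
    return 2.0 * range_m * math.tan(divergence_rad / 2.0)

def mean_irradiance_w_m2(power_w: float, diameter_m: float) -> float:
    """Average irradiance over a circular spot of the given diameter."""
    return power_w / (math.pi * (diameter_m / 2.0) ** 2)

# Divergence that just fills a 1-m spot (the minimum face-illumination
# diameter from Section 1.2) at an assumed 400-m range:
div = 2.0 * math.atan(0.5 / 400.0)
spot = beam_diameter_m(400.0, div)
print(round(spot, 3), round(mean_irradiance_w_m2(5.0, spot), 2))  # -> 1.0 6.37
```

Zooming the illuminator (changing `div`) in lock-step with the imager keeps the spot matched to the field of view at any range, so no illumination power is wasted outside the image.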
Figure 5. (left, center) TINDERS prototype at a December 2011 field experiment in which a long (> 50-ft) umbilical was used to connect the optical head to the electronics box, located in a powered trailer some distance away. (right) TINDERS deployed atop a 35-ft mast during a December 2011 demonstration. For this implementation, the electronics box was located at the base of the mast, with the umbilical extending the length of the mast.

Figure 4 shows the original TINDERS proof-of-concept prototype, as demonstrated in a field experiment in summer 2010, and the current TINDERS hardware as of December 2011. In the current hardware configuration, the TINDERS computer is connected to the electronics box through an Ethernet cable; cables as long as 300 ft have been used. This connection could also be made through a switched network; however, performance may suffer if the network bandwidth is too low or latency too high. The umbilical that connects the electronics box to the optical head includes power, data communications, and optical cables. When the electronics box is located near the optical head, as in Figure 4, a short umbilical (~15 ft) is typically used; however, longer umbilicals (> 50 ft) have been used when the TINDERS optical head was deployed atop a mast or on a tripod located in a field far from any power source. Figure 5 shows examples of TINDERS deployed in such situations.

2.2 Software

The TINDERS software functions include low-level hardware control, automation, enterprise messaging, face recognition, and the graphical user interface. Low-level hardware control software moves the lenses in the imager and illuminator to achieve the correct zoom and focus, controls the PT stage, controls the image sensor and receives video, and controls other system components such as the light source, GPS, laser rangefinder, and temperature controllers.
Automation software can currently detect people and faces in the live video, and when completed will automatically track moving targets and automatically queue detected faces for face recognition. Messaging software allows the TINDERS system to interoperate with other systems that may need access to TINDERS status, target position, or target identity, or may need to cue TINDERS to point to a particular location. The GUI allows an operator to view live video while controlling and monitoring all of the TINDERS software functions. The TINDERS face recognition software leverages the commercial ABIS System FaceExaminer software [4] from MorphoTrust USA. As part of the TINDERS research program, researchers at MorphoTrust USA developed a pre-processing filter applied to the SWIR facial images to improve their matching performance against the visible-spectrum images contained in the database. In its current form, the TINDERS software allows the operator to submit video frames to the face recognition software by clicking a button on the GUI; face recognition results are then displayed in the TINDERS GUI. Work is currently underway to automate this process, so that faces detected in the live video will automatically be submitted to the face recognition software. MorphoTrust USA is also continuing to work with WVHTCF on improving the performance of the SWIR-to-visible matching algorithms used in TINDERS.

3. RESULTS

Two datasets of TINDERS facial imagery were collected. The first dataset, collected using the original proof-of-concept prototype, included facial imagery of 56 subjects at distances of 50 m and 106 m, indoors in total darkness. For each subject, frontal still images were collected with both neutral and talking expressions, and images were collected with the head turned left and right by 10º and 20º while talking.
The second dataset, recently collected using the second-generation prototype, included facial video imagery of 104 subjects at distances of 100 m, 200 m, and 350 m, all
collected outdoors under dark nighttime conditions. Video was collected with the subjects stationary and facing the camera as well as with the subjects rotating 360º. Figure 6 shows example imagery from the second dataset for two stationary subjects at distances of 100 m, 200 m, and 350 m. As expected, the resolution and contrast degrade as the distance increases, but sufficient resolution remains at 350 m for possible recognition. The stated goal for the TINDERS project is to achieve computer face recognition at distances as high as 400 m, slightly farther than the longest distance included in this dataset.

Figure 6. Example TINDERS facial imagery under dark nighttime conditions for two subjects at distances of (left) 100 m, (left center) 200 m, and (right center) 350 m, along with (right) visible-spectrum images of the same subjects.

The first dataset was shared with two research groups at West Virginia University (WVU), who were working independently on SWIR-to-visible face recognition algorithms [6,7]. Kalka et al. applied a pre-processing algorithm to the SWIR images before matching them to a visible-spectrum database using FaceIt G8 software from MorphoTrust USA. They achieved a Rank 1 success rate of 90% for the 50-m TINDERS images and 80% for the 106-m TINDERS images [6]. Zuo et al. fused the results of the FaceIt G8 software with a face recognition algorithm developed by their group [7]. With a 0.1% False Acceptance Rate, they achieved a Correct Acceptance Rate of 85% for the 50-m TINDERS images and 74% for the 106-m TINDERS images. The same dataset was also used by researchers at MorphoTrust USA in their development of the face recognition software that is integrated into the TINDERS system. To evaluate their pre-processing filter, they processed 9 SWIR images for each subject at each distance: 3 frontal neutral images, 2 frontal talking images, and 4 images with a 10º pose angle.
Each image was pre-processed and matched against a database containing visible-spectrum images of all 56 subjects. For each subject, the results of the 9 searches were fused by keeping the result with the highest matching score. Figure 7 shows the receiver operating characteristic (ROC) results at 50 m and 106 m with and without the pre-processing algorithm. With a 1% False Acceptance Rate, the pre-processed results achieved a Correct Acceptance Rate of roughly 70% at both 50 m and 106 m. Surprisingly, the images with a 10º pose angle accounted for more than 25% of the highest scores among the successful matches, indicating that the algorithm is fairly robust for pose angles within 10º of frontal.
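The max-score fusion rule used in this evaluation (keep, for each gallery candidate, the best score over the 9 searches) can be sketched as below; the scores and identifiers are invented for illustration:

```python
def fuse_max(search_results):
    """Fuse several search results for one probe subject by keeping,
    per gallery candidate, the highest matching score seen, then
    returning the top-scoring (gallery_id, score) pair.
    Each result is a dict mapping gallery_id -> score."""
    fused = {}
    for result in search_results:
        for gallery_id, score in result.items():
            fused[gallery_id] = max(score, fused.get(gallery_id, float("-inf")))
    return max(fused.items(), key=lambda kv: kv[1])

# Three hypothetical searches (e.g. frontal neutral, talking, 10-degree pose):
searches = [
    {"subject_A": 0.62, "subject_B": 0.55},
    {"subject_A": 0.71, "subject_B": 0.40},
    {"subject_A": 0.58, "subject_B": 0.66},
]
print(fuse_max(searches))  # -> ('subject_A', 0.71)
```

Keeping the maximum rewards the single best-quality probe image per subject, which is consistent with the observation that off-angle images sometimes supplied the winning score.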
Figure 7. Receiver operating characteristic generated by MorphoTrust USA using TINDERS images of 56 test subjects at 50-m and 106-m range in total darkness. A Correct Acceptance Rate of roughly 70% was achieved with a False Acceptance Rate of 1% at both distances.

A proper statistical analysis of face recognition performance has not yet been completed for the second dataset, recently collected with the second-generation TINDERS prototype; however, the face recognition software is certainly performing better than random chance at both 200 m and 350 m range. Figure 8 shows screen shots of successful TINDERS face recognition at 200 m and 350 m, where the correct person is chosen out of a database containing visible-spectrum images of more than 1600 individuals. For these examples, TINDERS was playing back recorded video from the second dataset as if it were live. The operator clicked a button on the TINDERS GUI, which sent 11 video frames to the face recognition software for matching. In the case of the 200-m result, 8 of the 11 frames were detected as good faces and eye positions automatically marked. For the 350-m result, 9 good faces were detected and eyes automatically marked. The detected good facial images were then searched against the visible-spectrum facial database containing over 1600 individuals, and the results fused to produce an aggregate matching score for each of the 1600 candidates. The top 20 candidates were then displayed in rank order along the bottom of the screen. In both examples, the correct person was chosen as the top match. Aside from clicking the button on the TINDERS GUI to initiate the process, the operator did not need to interact with the system to produce the results. The entire process completed in < 20 seconds.

Figure 8. Example screen shots showing TINDERS face recognition from video under dark nighttime conditions at distances of (left) 200 m and (right) 350 m.
In both cases, the correct individual was chosen from a database containing visible-spectrum images of more than 1600 people.
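The Correct Acceptance Rate at a fixed False Acceptance Rate, quoted throughout these results, can be computed from samples of genuine (true-match) and impostor scores. A minimal sketch with synthetic scores (not TINDERS data):

```python
def car_at_far(genuine, impostor, far):
    """Correct Acceptance Rate at the most permissive score threshold
    whose impostor acceptance rate does not exceed `far`."""
    thresholds = sorted(set(genuine + impostor), reverse=True)
    best = 0.0
    for t in thresholds:  # sweep from strictest to loosest threshold
        if sum(s >= t for s in impostor) / len(impostor) <= far:
            best = sum(s >= t for s in genuine) / len(genuine)
        else:
            break
    return best

genuine = [0.9, 0.8, 0.75, 0.6, 0.5]    # synthetic genuine-match scores
impostor = [0.55, 0.4, 0.35, 0.3, 0.2]  # synthetic impostor scores
print(car_at_far(genuine, impostor, far=0.0))  # -> 0.8
```

Sweeping `far` over a range of values and plotting CAR against it traces out an ROC curve like the one in Figure 7.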
4. DISCUSSION

In many implementations of computer face recognition, a single high-resolution visible-spectrum facial image is matched with a very high confidence level against a large database of high-resolution visible-spectrum facial images. In the case of the TINDERS system, where the detected facial images are SWIR rather than visible, and where long-distance resolution is often much lower than optimal, it is unlikely that a single detected SWIR facial image will ever produce a high-confidence match to a large visible database. Nevertheless, TINDERS produces repeatable, recognizable images of people under both daytime and nighttime conditions, at distances well beyond 100 m, that can be matched to a visible-spectrum database using computer face recognition software with statistical performance much better than random chance. Given this, we hope to be able to produce high-confidence matching through the fusion of many video frames, acquired as a single person is tracked over time. Tracking software is being developed that will allow TINDERS to follow a moving person over time. With 30 video frames per second, the strategy is to automatically select the best facial images from the video and continually submit them for face recognition. As more and more SWIR facial images of the same person are collected and compared to the visible database, the scores and/or ranks of the database images can be fused to produce an identification result that continues to increase in confidence level as the process continues. Just as a noisy signal can be clarified through time averaging, a face recognition capability that has low confidence for a single captured image can be made high-confidence by capturing many images of the same person at slightly different times, angles, expressions, etc.
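The frame-fusion strategy sketched in this section, accumulating scores over many video frames so that confidence grows over time, could be implemented as a running mean per gallery candidate. A minimal illustration (class name and scores are our invention; the actual TINDERS fusion may differ):

```python
class RunningFusion:
    """Accumulate per-frame match scores per gallery candidate and
    report the current best identity with its mean score."""

    def __init__(self):
        self.totals = {}
        self.counts = {}

    def add_frame(self, frame_scores):
        """Fold one frame's gallery_id -> score results into the running sums."""
        for gallery_id, score in frame_scores.items():
            self.totals[gallery_id] = self.totals.get(gallery_id, 0.0) + score
            self.counts[gallery_id] = self.counts.get(gallery_id, 0) + 1

    def best(self):
        """Current top candidate as (gallery_id, mean_score)."""
        means = {g: self.totals[g] / self.counts[g] for g in self.totals}
        return max(means.items(), key=lambda kv: kv[1])

fusion = RunningFusion()
for frame in [{"A": 0.6, "B": 0.5}, {"A": 0.7, "B": 0.4}, {"A": 0.5, "B": 0.6}]:
    fusion.add_frame(frame)
print(fusion.best()[0])  # -> A
```

As with time-averaging a noisy signal, the per-frame score noise averages out, so the separation between the correct candidate and the rest of the gallery grows as frames accumulate.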
Automation software to achieve the tracking, real-time face detection, quality analysis, and fusion of the face recognition results is currently being developed. In addition, MorphoTrust USA will use the most recent video dataset of 104 subjects to further improve the performance of the TINDERS face recognition algorithms. In summary, WVHTCF has developed a portable active-SWIR imaging system that is capable of generating recognizable facial imagery at distances of at least 350 meters under conditions ranging from bright sunlight to total darkness. Three independent research groups have used TINDERS imagery collected in total darkness from a distance of 106 m to achieve computer face recognition success rates of at least 70% in matching against a database of visible-spectrum images. The TINDERS system has integrated face recognition software developed by MorphoTrust USA that can be used for live identification of subjects, day or night, with some success achieved at distances up to 350 meters.

5. ACKNOWLEDGMENTS

This research was performed under contract N C-0064 from the Office of Naval Research, with funding from the Deployable Force Protection Science and Technology Program, and with additional support from a subcontract from West Virginia University Research Corporation through Award No. DD-BX-0161 awarded by the National Institute of Justice. The authors would like to acknowledge important technical contributions from Jason Stanley and Andrew Dolby and the MorphoTrust USA team, and the cooperation of the WVU Center for Identification Technology Research in the collection of the most recent facial dataset.

REFERENCES

[1] Acton, D., "Counting Photons: Advances in Passive Short Wave Infrared Imaging," Technology Today, issue 2 (2010).
[2] Vumii Discoverii product brochure. ftp://ftp.vumii.com/documentation/discoverii/vumii_discoverii_brochure.pdf
[3] Laser Safety, Wikipedia page.
[4] ABIS System FaceExaminer web page.
[5] FERET Database web page.
[6] Kalka, N.D., Bourlai, T., Cukic, B., Hornak, L., "Cross-spectral face recognition in heterogeneous environments: A case study on matching visible to short-wave infrared imagery," 2011 International Joint Conference on Biometrics (IJCB), pp. 1-8, Oct. 2011.
[7] Zuo, J., Nicolo, F., Schmid, N.A., Boothapati, S., "Encoding, matching and score normalization for cross spectral face recognition: Matching SWIR versus visible data," 2012 IEEE Fifth International Conference on Biometrics: Theory, Applications and Systems (BTAS), Sept. 2012.
More informationARMY RDT&E BUDGET ITEM JUSTIFICATION (R-2 Exhibit)
COST (In Thousands) FY 2002 FY 2003 FY 2004 FY 2005 FY 2006 FY 2007 FY 2008 FY 2009 Actual Estimate Estimate Estimate Estimate Estimate Estimate Estimate H95 NIGHT VISION & EO TECH 22172 19696 22233 22420
More informationDEFENSE APPLICATIONS IN HYPERSPECTRAL REMOTE SENSING
DEFENSE APPLICATIONS IN HYPERSPECTRAL REMOTE SENSING James M. Bishop School of Ocean and Earth Science and Technology University of Hawai i at Mānoa Honolulu, HI 96822 INTRODUCTION This summer I worked
More informationPractice Problems for Chapter 25-26
Practice Problems for Chapter 25-26 1. What are coherent waves? 2. Describe diffraction grating 3. What are interference fringes? 4. What does monochromatic light mean? 5. What does the Rayleigh Criterion
More informationHow interference filters can outperform colored glass filters in automated vision applications
How interference filters can outperform colored glass filters in automated vision applications High Performance Machine Vision Filters from Chroma It s all about the contrast Vision applications rely on
More informationDifrotec Product & Services. Ultra high accuracy interferometry & custom optical solutions
Difrotec Product & Services Ultra high accuracy interferometry & custom optical solutions Content 1. Overview 2. Interferometer D7 3. Benefits 4. Measurements 5. Specifications 6. Applications 7. Cases
More informationKnow Your Digital Camera
Know Your Digital Camera With Matt Guarnera Sponsored by Topics To Be Covered Understanding the language of cameras. Technical terms used to describe digital camera features will be clarified. Using special
More informationDigital Photographic Imaging Using MOEMS
Digital Photographic Imaging Using MOEMS Vasileios T. Nasis a, R. Andrew Hicks b and Timothy P. Kurzweg a a Department of Electrical and Computer Engineering, Drexel University, Philadelphia, USA b Department
More informationEX85 Megapixel-IP Infrared Imager powered by
Megapixel IP Infrared Imaging (I 3 ) Design Black Diamond Infrared Bit-Reduce Design - IP67 Rated DCRI Performance Parameters Detection Classification Recognition Identification 420ft (128m) 320ft (98m)
More information3x Magnification. Digital Zoom to 6x. CAUTION: Do not point Infrared Emitter directly into eye at close range.
MxGenPRO MANUAL-English.qx_MxGenPRO Manual-English 12/16/14 9:24 AM Page 3 Instruction Manual 3x Magnification. Digital Zoom to 6x. CAUTION: Do not point Infrared Emitter directly into eye at close range.
More informationCamera Requirements For Precision Agriculture
Camera Requirements For Precision Agriculture Radiometric analysis such as NDVI requires careful acquisition and handling of the imagery to provide reliable values. In this guide, we explain how Pix4Dmapper
More informationLow Cost Earth Sensor based on Oxygen Airglow
Assessment Executive Summary Date : 16.06.2008 Page: 1 of 7 Low Cost Earth Sensor based on Oxygen Airglow Executive Summary Prepared by: H. Shea EPFL LMTS herbert.shea@epfl.ch EPFL Lausanne Switzerland
More informationCapturing Realistic HDR Images. Dave Curtin Nassau County Camera Club February 24 th, 2016
Capturing Realistic HDR Images Dave Curtin Nassau County Camera Club February 24 th, 2016 Capturing Realistic HDR Images Topics: What is HDR? In Camera. Post-Processing. Sample Workflow. Q & A. Capturing
More informationHome-made Infrared Goggles & Lighting Filters. James Robb
Home-made Infrared Goggles & Lighting Filters James Robb University Physics II Lab: H1 4/19/10 Trying to build home-made infrared goggles was a fun and interesting project. It involved optics and electricity.
More informationApplications of Optics
Nicholas J. Giordano www.cengage.com/physics/giordano Chapter 26 Applications of Optics Marilyn Akins, PhD Broome Community College Applications of Optics Many devices are based on the principles of optics
More informationNFRAD: Near-Infrared Face Recognition at a Distance
NFRAD: Near-Infrared Face Recognition at a Distance Hyunju Maeng a, Hyun-Cheol Choi a, Unsang Park b, Seong-Whan Lee a and Anil K. Jain a,b a Dept. of Brain and Cognitive Eng. Korea Univ., Seoul, Korea
More informationCompact Dual Field-of-View Telescope for Small Satellite Payloads. Jim Peterson Trent Newswander
Compact Dual Field-of-View Telescope for Small Satellite Payloads Jim Peterson Trent Newswander Introduction & Overview Small satellite payloads with multiple FOVs commonly sought Wide FOV to search or
More informationWhite Paper on SWIR Camera Test The New Swux Unit Austin Richards, FLIR Chris Durell, Joe Jablonski, Labsphere Martin Hübner, Hensoldt.
White Paper on Introduction SWIR imaging technology based on InGaAs sensor products has been a staple of scientific sensing for decades. Large earth observing satellites have used InGaAs imaging sensors
More informationSpecial Projects Office. Mr. Lee R. Moyer Special Projects Office. DARPATech September 2000
Mr. Lee R. Moyer DARPATech 2000 6-8 September 2000 1 CC&D Tactics Pose A Challenge to U.S. Targeting Systems The Challenge: Camouflage, Concealment and Deception techniques include: Masking: Foliage cover,
More informationPolaris Sensor Technologies, Inc. SMALLEST THERMAL POLARIMETER
Polaris Sensor Technologies, Inc. SMALLEST THERMAL POLARIMETER Pyxis LWIR 640 Industry s smallest polarization enhanced thermal imager Up to 400% greater detail and contrast than standard thermal Real-time
More informationRicoh's Machine Vision: A Window on the Future
White Paper Ricoh's Machine Vision: A Window on the Future As the range of machine vision applications continues to expand, Ricoh is providing new value propositions that integrate the optics, electronic
More informationBasics of Photographing Star Trails
Basics of Photographing Star Trails By Rick Graves November 15, 2016 1 What are Star Trails? Night sky images with foreground elements that show the passage of time and the motion of the stars 2 Which
More informationMulti aperture coherent imaging IMAGE testbed
Multi aperture coherent imaging IMAGE testbed Nick Miller, Joe Haus, Paul McManamon, and Dave Shemano University of Dayton LOCI Dayton OH 16 th CLRC Long Beach 20 June 2011 Aperture synthesis (part 1 of
More informationFeature Extraction Techniques for Dorsal Hand Vein Pattern
Feature Extraction Techniques for Dorsal Hand Vein Pattern Pooja Ramsoful, Maleika Heenaye-Mamode Khan Department of Computer Science and Engineering University of Mauritius Mauritius pooja.ramsoful@umail.uom.ac.mu,
More informationMODULAR ADAPTIVE OPTICS TESTBED FOR THE NPOI
MODULAR ADAPTIVE OPTICS TESTBED FOR THE NPOI Jonathan R. Andrews, Ty Martinez, Christopher C. Wilcox, Sergio R. Restaino Naval Research Laboratory, Remote Sensing Division, Code 7216, 4555 Overlook Ave
More information6.869 Advances in Computer Vision Spring 2010, A. Torralba
6.869 Advances in Computer Vision Spring 2010, A. Torralba Due date: Wednesday, Feb 17, 2010 Problem set 1 You need to submit a report with brief descriptions of what you did. The most important part is
More informationEnhancing thermal video using a public database of images
Enhancing thermal video using a public database of images H. Qadir, S. P. Kozaitis, E. A. Ali Department of Electrical and Computer Engineering Florida Institute of Technology 150 W. University Blvd. Melbourne,
More informationWITec Alpha 300R Quick Operation Summary October 2018
WITec Alpha 300R Quick Operation Summary October 2018 This document is frequently updated if you feel information should be added, please indicate that to the facility manager (currently Philip Carubia,
More informationSupplementary Materials
Supplementary Materials In the supplementary materials of this paper we discuss some practical consideration for alignment of optical components to help unexperienced users to achieve a high performance
More informationWhy Optimizing VLT* May Not Be Optimal. *VLT = Visible Light Transmission
Why Optimizing VLT* May Not Be Optimal *VLT = Visible Light Transmission What is VLT (Visible Light Transmission)? Undoubtedly, VLT and fit are the two most compelling features in the use or aversion to
More informationIntroduction to the operating principles of the HyperFine spectrometer
Introduction to the operating principles of the HyperFine spectrometer LightMachinery Inc., 80 Colonnade Road North, Ottawa ON Canada A spectrometer is an optical instrument designed to split light into
More informationSource: (January 4, 2010)
Source: http://www.slrgear.com/reviews/showproduct.php/product/101/cat/12 (January 4, 2010) Name Nikon 105mm ƒ/2d AF DC Nikkor Image Circle 35mm Type Telephoto Prime Defocus Control Focal Length 105mm
More informationImaging Photometer and Colorimeter
W E B R I N G Q U A L I T Y T O L I G H T. /XPL&DP Imaging Photometer and Colorimeter Two models available (photometer and colorimetry camera) 1280 x 1000 pixels resolution Measuring range 0.02 to 200,000
More informationBy Pierre Olivier, Vice President, Engineering and Manufacturing, LeddarTech Inc.
Leddar optical time-of-flight sensing technology, originally discovered by the National Optics Institute (INO) in Quebec City and developed and commercialized by LeddarTech, is a unique LiDAR technology
More informationRochester Institute of Technology. Wildfire Airborne Sensor Program (WASP) Project Overview
Rochester Institute of Technology Wildfire Airborne Sensor Program (WASP) Project Overview Introduction The following slides describe a program underway at RIT The sensor system described herein is being
More informationFire Fighter Location Tracking & Status Monitoring Performance Requirements
Fire Fighter Location Tracking & Status Monitoring Performance Requirements John A. Orr and David Cyganski orr@wpi.edu, cyganski@wpi.edu Electrical and Computer Engineering Department Worcester Polytechnic
More informationExam 4. Name: Class: Date: Multiple Choice Identify the choice that best completes the statement or answers the question.
Name: Class: Date: Exam 4 Multiple Choice Identify the choice that best completes the statement or answers the question. 1. Mirages are a result of which physical phenomena a. interference c. reflection
More informationAutofocus Problems The Camera Lens
NEWHorenstein.04.Lens.32-55 3/11/05 11:53 AM Page 36 36 4 The Camera Lens Autofocus Problems Autofocus can be a powerful aid when it works, but frustrating when it doesn t. And there are some situations
More informationTESTING VISUAL TELESCOPIC DEVICES
TESTING VISUAL TELESCOPIC DEVICES About Wells Research Joined TRIOPTICS mid 2012. Currently 8 employees Product line compliments TRIOPTICS, with little overlap Entry level products, generally less expensive
More informationChapter 6 Face Recognition at a Distance: System Issues
Chapter 6 Face Recognition at a Distance: System Issues Meng Ao, Dong Yi, Zhen Lei, and Stan Z. Li Abstract Face recognition at a distance (FRAD) is one of the most challenging forms of face recognition
More informationPolarization Gratings for Non-mechanical Beam Steering Applications
Polarization Gratings for Non-mechanical Beam Steering Applications Boulder Nonlinear Systems, Inc. 450 Courtney Way Lafayette, CO 80026 USA 303-604-0077 sales@bnonlinear.com www.bnonlinear.com Polarization
More informationImproving the Collection Efficiency of Raman Scattering
PERFORMANCE Unparalleled signal-to-noise ratio with diffraction-limited spectral and imaging resolution Deep-cooled CCD with excelon sensor technology Aberration-free optical design for uniform high resolution
More informationRENISHAW INVIA RAMAN SPECTROMETER
STANDARD OPERATING PROCEDURE: RENISHAW INVIA RAMAN SPECTROMETER Purpose of this Instrument: The Renishaw invia Raman Spectrometer is an instrument used to analyze the Raman scattered light from samples
More informationSoftware Development Kit to Verify Quality Iris Images
Software Development Kit to Verify Quality Iris Images Isaac Mateos, Gualberto Aguilar, Gina Gallegos Sección de Estudios de Posgrado e Investigación Culhuacan, Instituto Politécnico Nacional, México D.F.,
More informationLlIGHT REVIEW PART 2 DOWNLOAD, PRINT and submit for 100 points
WRITE ON SCANTRON WITH NUMBER 2 PENCIL DO NOT WRITE ON THIS TEST LlIGHT REVIEW PART 2 DOWNLOAD, PRINT and submit for 100 points Multiple Choice Identify the choice that best completes the statement or
More informationINNOVATIVE SPECTRAL IMAGING
INNOVATIVE SPECTRAL IMAGING food inspection precision agriculture remote sensing defense & reconnaissance advanced machine vision product overview INNOVATIVE SPECTRAL IMAGING Innovative diffractive optics
More informationGlobal and Local Quality Measures for NIR Iris Video
Global and Local Quality Measures for NIR Iris Video Jinyu Zuo and Natalia A. Schmid Lane Department of Computer Science and Electrical Engineering West Virginia University, Morgantown, WV 26506 jzuo@mix.wvu.edu
More informationA piece of white paper can be 1,000,000,000 times brighter in outdoor sunlight than in a moonless night.
Light intensities range across 9 orders of magnitude. A piece of white paper can be 1,000,000,000 times brighter in outdoor sunlight than in a moonless night. But in a given lighting condition, light ranges
More informationGlossary of Terms (Basic Photography)
Glossary of Terms (Basic ) Ambient Light The available light completely surrounding a subject. Light already existing in an indoor or outdoor setting that is not caused by any illumination supplied by
More informationExam 3--PHYS 102--S10
ame: Exam 3--PHYS 02--S0 Multiple Choice Identify the choice that best completes the statement or answers the question.. At an intersection of hospital hallways, a convex mirror is mounted high on a wall
More informationULS24 Frequently Asked Questions
List of Questions 1 1. What type of lens and filters are recommended for ULS24, where can we source these components?... 3 2. Are filters needed for fluorescence and chemiluminescence imaging, what types
More informationFeasibility and Design for the Simplex Electronic Telescope. Brian Dodson
Feasibility and Design for the Simplex Electronic Telescope Brian Dodson Charge: A feasibility check and design hints are wanted for the proposed Simplex Electronic Telescope (SET). The telescope is based
More information771 Series LASER SPECTRUM ANALYZER. The Power of Precision in Spectral Analysis. It's Our Business to be Exact! bristol-inst.com
771 Series LASER SPECTRUM ANALYZER The Power of Precision in Spectral Analysis It's Our Business to be Exact! bristol-inst.com The 771 Series Laser Spectrum Analyzer combines proven Michelson interferometer
More informationROAD TO THE BEST ALPR IMAGES
ROAD TO THE BEST ALPR IMAGES INTRODUCTION Since automatic license plate recognition (ALPR) or automatic number plate recognition (ANPR) relies on optical character recognition (OCR) of images, it makes
More informationPolaris Sensor Technologies, Inc. Visible - Limited Detection Thermal - No Detection Polarization - Robust Detection etherm - Ultimate Detection
Polaris Sensor Technologies, Inc. DETECTION OF OIL AND DIESEL ON WATER Visible - Limited Detection - No Detection - Robust Detection etherm - Ultimate Detection Pyxis Features: Day or night real-time sensing
More informationChapter 36. Image Formation
Chapter 36 Image Formation Image of Formation Images can result when light rays encounter flat or curved surfaces between two media. Images can be formed either by reflection or refraction due to these
More informationVixar High Power Array Technology
Vixar High Power Array Technology I. Introduction VCSELs arrays emitting power ranging from 50mW to 10W have emerged as an important technology for applications within the consumer, industrial, automotive
More informationOne Week to Better Photography
One Week to Better Photography Glossary Adobe Bridge Useful application packaged with Adobe Photoshop that previews, organizes and renames digital image files and creates digital contact sheets Adobe Photoshop
More informationGlassSpection User Guide
i GlassSpection User Guide GlassSpection User Guide v1.1a January2011 ii Support: Support for GlassSpection is available from Pyramid Imaging. Send any questions or test images you want us to evaluate
More informationOUTDOOR PORTRAITURE WORKSHOP
OUTDOOR PORTRAITURE WORKSHOP SECOND EDITION Copyright Bryan A. Thompson, 2012 bryan@rollaphoto.com Goals The goals of this workshop are to present various techniques for creating portraits in an outdoor
More informationEight Tips for Optimal Machine Vision Lighting
Eight Tips for Optimal Machine Vision Lighting Tips for Choosing the Right Lighting for Machine Vision Applications Eight Tips for Optimal Lighting This white paper provides tips for choosing the optimal
More informationShort Wave Infrared (SWIR) Imaging In Machine Vision
Short Wave Infrared (SWIR) Imaging In Machine Vision Princeton Infrared Technologies, Inc. Martin H. Ettenberg, Ph. D. President martin.ettenberg@princetonirtech.com Ph: +01 609 917 3380 Booth Hall 1 J12
More informationWIRELESS LINKS AT THE SPEED OF LIGHT
FREE SPACE OPTICS (FSO) WIRELESS LINKS AT THE SPEED OF LIGHT WISAM ABDURAHIMAN INTRODUCTION 2 In telecommunications, Free Space Optics (FSO) is an optical communication technology that uses light propagating
More informationVisible-light and Infrared Face Recognition
Visible-light and Infrared Face Recognition Xin Chen Patrick J. Flynn Kevin W. Bowyer Department of Computer Science and Engineering University of Notre Dame Notre Dame, IN 46556 {xchen2, flynn, kwb}@nd.edu
More informationBMC s heritage deformable mirror technology that uses hysteresis free electrostatic
Optical Modulator Technical Whitepaper MEMS Optical Modulator Technology Overview The BMC MEMS Optical Modulator, shown in Figure 1, was designed for use in free space optical communication systems. The
More informationEXPRIMENT 3 COUPLING FIBERS TO SEMICONDUCTOR SOURCES
EXPRIMENT 3 COUPLING FIBERS TO SEMICONDUCTOR SOURCES OBJECTIVES In this lab, firstly you will learn to couple semiconductor sources, i.e., lightemitting diodes (LED's), to optical fibers. The coupling
More informationPICO MASTER 200. UV direct laser writer for maskless lithography
PICO MASTER 200 UV direct laser writer for maskless lithography 4PICO B.V. Jan Tinbergenstraat 4b 5491 DC Sint-Oedenrode The Netherlands Tel: +31 413 490708 WWW.4PICO.NL 1. Introduction The PicoMaster
More informationPhysics 1230: Light and Color. Guest Lecture, Jack again. Lecture 23: More about cameras
Physics 1230: Light and Color Chuck Rogers, Charles.Rogers@colorado.edu Ryan Henley, Valyria McFarland, Peter Siegfried physicscourses.colorado.edu/phys1230 Guest Lecture, Jack again Lecture 23: More about
More informationChapter 36. Image Formation
Chapter 36 Image Formation Notation for Mirrors and Lenses The object distance is the distance from the object to the mirror or lens Denoted by p The image distance is the distance from the image to the
More informationFusion of Heterogeneous Multisensor Data
Fusion of Heterogeneous Multisensor Data Karsten Schulz, Antje Thiele, Ulrich Thoennessen and Erich Cadario Research Institute for Optronics and Pattern Recognition Gutleuthausstrasse 1 D 76275 Ettlingen
More information1. Any wide view of a physical space. a. Panorama c. Landscape e. Panning b. Grayscale d. Aperture
Match the words below with the correct definition. 1. Any wide view of a physical space. a. Panorama c. Landscape e. Panning b. Grayscale d. Aperture 2. Light sensitivity of your camera s sensor. a. Flash
More informationEE119 Introduction to Optical Engineering Fall 2009 Final Exam. Name:
EE119 Introduction to Optical Engineering Fall 2009 Final Exam Name: SID: CLOSED BOOK. THREE 8 1/2 X 11 SHEETS OF NOTES, AND SCIENTIFIC POCKET CALCULATOR PERMITTED. TIME ALLOTTED: 180 MINUTES Fundamental
More informationGeneral Imaging System
General Imaging System Lecture Slides ME 4060 Machine Vision and Vision-based Control Chapter 5 Image Sensing and Acquisition By Dr. Debao Zhou 1 2 Light, Color, and Electromagnetic Spectrum Penetrate
More informationLabVIEW based Intelligent Frontal & Non- Frontal Face Recognition System
LabVIEW based Intelligent Frontal & Non- Frontal Face Recognition System Muralindran Mariappan, Manimehala Nadarajan, and Karthigayan Muthukaruppan Abstract Face identification and tracking has taken a
More information