Automated Inspection With Machine Vision


1 Automated Inspection With Machine Vision Part 2 Stanley N. Hack, D.Sc., PE ConsulTech Engineering, PLLC November 9, 2015

2 PRESENTATION GOALS Understanding Machine Vision and Its Uses Understanding Machine Vision Components Appreciating the Interacting Complexities of Machine Vision Components and Processing Disclaimer: Most of ConsulTech Engineering's clients regard their processes, which include machine vision applications, as highly proprietary. Many of the applications presented have been devised for this presentation. 2

3 PRESENTATION SCOPE Part I Review: CNY Engineering Expo 2014 Machine Vision Definition Machine Vision Uses Machine Vision Components Illumination Part II: CNY Engineering Expo 2015 Machine Vision Components Cameras and Sensors Optics Part III: CNY Engineering Expo 2016 Machine Vision Processors Machine Vision Software Machine Vision Systems Final Exam 3

4 MACHINE VISION DEFINITION Machine Vision is defined as the technology and methods used to automatically inspect materials, components, and manufactured systems using image-based sensors and systems. Key Words Automatic Inspection Materials, Components, Manufactured Systems Image-Based 4

5 MACHINE VISION DEFINITION Machine Vision has the following attributes: The input is an image, and the output is a set of data, such as feature existence, object type, object or feature location, and measurements. The system analyzes inanimate objects. The system has a priori knowledge of the imaged object(s), including object shape, size, position, and attributes. The system performs its processing repeatedly and often rapidly. The imaging environment, including illumination, geometry, and motion, is controlled by the Machine Vision system. 5

6 MACHINE VISION ENVIRONMENT Machine Vision is unique among all of the computer imaging modalities (computer vision, image processing, medical imaging, remote sensing) in that many aspects of the imaging environment can often be controlled: Illumination Sensor positioning Sensor size and resolution Image magnification and orientation Optical filtering Reflections A large component of Machine Vision system design is the optimization of the captured images to make the software's job feasible, easier, and/or faster. 6

7 USES OF MACHINE VISION Quality Control Process Control Robotic Guidance Images courtesy of Microscan Systems, Inc., NERLITE Lighting Solutions, Renton, WA NDA Expiration 01 December 2007 Image courtesy of Matrox Electronic Systems Ltd., Dorval, Quebec, Canada Image courtesy of Matrox Electronic Systems Ltd., Dorval, Quebec, Canada 7

8 USES OF MACHINE VISION Manufacturing Quality Control Dimension verification Parts placement accuracy Debris detection * Label placement Label printing Coatings integrity Circuit continuity Color verification Cracks, dents, and other defects * Defect Inspection * Little or no a priori knowledge of the imaged object(s). Images courtesy of Matrox Electronic Systems Ltd., Dorval, Quebec, Canada Placement Inspection 8

9 USES OF MACHINE VISION Manufacturing Process Control Temperature control Cutting / grinding adjustments Parts sorting Flow and speed control Timing control Robot control Baking Temperature Control Grinding Marks and Defects Robot Welding 9

10 MACHINE VISION COMPONENTS Work station image courtesy of Comark LLC, Milford, MA 10

11 Machine Vision Components The Three Most Important Components Illumination Illumination Illumination Detailed in Part 1 The Other Most Important Components Cameras and Sensors Optics Synchronization (Motion Control) Processing Image courtesy of Operations Technology, Inc., Blairstown, NJ A large component of Machine Vision system design is the optimization of the captured images to make the software's job feasible, easier, and/or faster. 11

12 Illumination Review Automated Inspection with Machine Vision Part 1, summarized in two theorems: Theorem 1 If adding shadows to a captured image helps the processing, configure the illumination to add the appropriate shadowing. Theorem 2 If shadows in a captured image hurt the processing, configure the illumination to remove shadowing. Figures courtesy of Illumination Technologies, Elbridge, NY 12

13 Illumination Review Imaging Geometry On-axis or coaxial Dark field Partial bright field or directional Back lighting Diffuse, dome, or cloudy day Structured Figures from - A Practical Guide to Machine Vision Lighting, Daryl Martin, Advanced Illumination, Inc.,

14 Illumination Review Ultraviolet Spectrum ~ 100 nm to 400 nm Visible Spectrum ~ 390 nm to 700 nm Infrared Spectrum ~ 750 nm to 100 μm 14

15 Cameras and Sensors 15

16 Sensors CCD (Charge-Coupled Device) o Bell Labs 1970 o Photoactive region (capacitor array) Each capacitor accumulates an electric charge proportional to the light intensity at that location. o Transmission region (shift register) A control circuit causes each capacitor to transfer its contents to its neighbor (operating as a shift register). The last capacitor in the array dumps its charge into a charge amplifier, which converts the charge into a voltage that is read out. 16
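The bucket-brigade readout described on this slide can be sketched in a few lines of Python. This is only a toy illustration with made-up gain and charge-to-voltage factors, not any vendor's readout code.

```python
# Toy model of CCD shift-register readout (illustration only).
# Each "well" accumulates charge proportional to incident light; the
# shift register then clocks each charge toward a single output
# amplifier that converts it to a voltage.

def expose(light_intensities, gain_e_per_unit=100):
    """Accumulate charge (electrons) proportional to light at each pixel."""
    return [gain_e_per_unit * i for i in light_intensities]

def read_out(wells, volts_per_electron=5e-6):
    """Shift charges out one at a time and convert each to a voltage."""
    voltages = []
    wells = list(wells)
    while wells:
        charge = wells.pop()            # last well dumps into the amplifier
        voltages.append(charge * volts_per_electron)
        # remaining charges implicitly shift one position toward the output
    return list(reversed(voltages))     # restore original pixel order

if __name__ == "__main__":
    scene = [0.1, 0.5, 0.9, 0.3]        # relative light at four pixels
    print(read_out(expose(scene)))
```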

17 Sensors CMOS (Complementary Metal Oxide Semiconductor) o First suggested in o Commercialized in the late 1980s to early 1990s o Machine Vision-quality sensors now available from Sony, CMOSIS, and ON Semiconductor o Active Pixel Sensor (APS) each pixel includes its own amplifier o Until recently, lower image quality than CCD o Requires less power than CCD o Lower cost than CCD o Less blooming than CCD o Potential rolling shutter effect o Lower quantum efficiency than CCD o Traditionally used for less demanding applications such as cell phones and photography cameras (measurement precision not required) 17

18 Sensors Bolometer o Measures the power of incident electromagnetic radiation via the heating of a material with a temperature-dependent electrical resistance o Predominantly used in the infrared spectrum Long-Wave Infrared (LWIR) o 8 to 14 µm o Thermal imaging Mid-Wave Infrared (MWIR) o 3 to 5 μm o Long distance tracking through the atmosphere Short-Wave Infrared (SWIR) o 0.9 to 1.7 μm o Low light level imaging (night vision, fog, ...) Long Distance Tracking (MWIR) Low Light Level Imaging (SWIR) LWIR Bolometer Core Courtesy of Xenics, NV, Leuven, Belgium 18

19 Camera Specifications Sensor o Sensor Size o Shutter Control o Sensor Resolution (Pixel Count) o Pixel Size o Frame Rate o Quantum Efficiency o Signal-to-Noise Ratio Trigger Capability o Synchronous o Asynchronous o Exposure Control Camera Type o Area Scan o Line Scan o Color / Black & White Interface Area Scan and Line Scan Courtesy of Edmund Optics Asynchronous Exposure Control 19

20 Sensors Shutter Control Global Shutter o All pixels are exposed simultaneously o Film cameras use a mechanical shutter o Available with CCD o Newly available with CMOS Rolling Shutter o Sequential exposure start and stop for each row of pixels o Can create motion artifacts o Artifact of legacy CMOS sensors Global Shutter (Courtesy Basler AG) Flash Motion Artifact Due to Rolling Shutter (courtesy Wikipedia) Rolling Shutter (Courtesy Basler AG) 20
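A small sketch of why a rolling shutter skews moving objects: each row is exposed and read slightly later than the one above it, so a vertical edge moving horizontally lands in a different column on every row. The sensor size, row read time, and edge speed below are arbitrary assumptions for illustration.

```python
# Sketch: rolling-shutter skew on a vertical edge moving to the right.
# Assumptions: 8-row sensor, one row read every 1 ms, edge moves 2 px/ms.

ROWS, COLS = 8, 16
ROW_READ_TIME_MS = 1.0
EDGE_SPEED_PX_PER_MS = 2.0
EDGE_START_COL = 2

def capture(rolling=True):
    frame = []
    for r in range(ROWS):
        t = r * ROW_READ_TIME_MS if rolling else 0.0   # global shutter: all rows at t = 0
        edge_col = int(EDGE_START_COL + EDGE_SPEED_PX_PER_MS * t)
        frame.append("".join("#" if c >= edge_col else "." for c in range(COLS)))
    return frame

print("Global shutter      | Rolling shutter")
for g, r in zip(capture(rolling=False), capture(rolling=True)):
    print(f"{g}    | {r}")
```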

21 Camera Specifications Interface o Camera Link Specialized Interface MB/s Up to 300 m Cable o USB3 Vision USB 3.0 Up to 350 MB/s Up to 100 m Cable o GigE Vision Gigabit Ethernet Up to 100 MB/s Up to 100 m Cable o CoaXPress Coax Cable Up to 6.25 Gb/s Up to 100 m Cable o Analog (Legacy) RS-170 (B&W) NTSC, CCIR, SECAM (Color) o Others FireWire or IEEE-1394 USB

22 Camera Specifications B&W o Gray Levels (bits) Color o 3 CCD o Bayer Filter o Color Levels (bits) Frame Rate o Total frame rate o Windowed frame rate Lens Mount o C-Mount (16 mm) o F-Mount (Nikon 35 mm) o K-Mount (Pentax 35 mm) o Large Format - M37 × 0.75 Bayer Filter 22

23 Camera Specifications CMOS Sensors [Figure: Quantum Efficiency (QE) plot for Sony CMOS sensors across the UV, Blue, Green, Red, and IR bands] Courtesy of Point Grey Research, Inc., Richmond, Canada 23

24 Camera Specifications Quantum Efficiency (QE) of Infrared Detection Sensor Materials Courtesy The Optical Society of America, Washington, DC 24

25 Camera Specifications [Table: Video frame sizes, listing diagonal (mm), width (mm), height (mm), and aspect ratio for common video sensor formats from 4/3" down to 1/4", plus High Definition (HD) 16:9; shaded areas indicate standard broadcast aspect ratios] [Table: Film frame sizes, listing diagonal, width, height, and aspect ratio for 35 mm (slide), 35 mm (movie), 16 mm, 8 mm, Super 8 mm, and IMAX] Cinema aspect ratios: 1.85:1 - normal widescreen and 2.39:1 - anamorphic widescreen (rectangular pixels). Television aspect ratios: 4:3 - standard definition and 16:9 - high definition. 25
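The columns of these tables are related by simple geometry: the diagonal follows from the width and height, and the aspect ratio is their quotient. A minimal sketch, using the commonly quoted 8.8 mm × 6.6 mm dimensions of a nominal 2/3" sensor as an assumed example:

```python
# How the table columns relate: diagonal and aspect ratio follow directly
# from sensor width and height. Example dimensions are the commonly quoted
# values for a 2/3" format (assumption for illustration).
import math

def sensor_geometry(width_mm, height_mm):
    diagonal = math.hypot(width_mm, height_mm)   # sqrt(w^2 + h^2)
    aspect = width_mm / height_mm
    return diagonal, aspect

d, a = sensor_geometry(8.8, 6.6)                 # nominal 2/3" format
print(f"diagonal = {d:.1f} mm, aspect ratio = {a:.2f} : 1")   # ~11.0 mm, 1.33:1 (4:3)
```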

26 Camera Specifications Relationship between bit depth and signal-to-noise ratio Bit-depth o Defines the number of discrete values of gray or of a color o Defines the contrast resolving power o Examples: Bit depth = 8 bits 256 shades of gray or color (RGB) Bit depth = 12 bits 4096 shades of gray or color (RGB) Signal-to-Noise Ratio (SNR) o Primarily due to electronic noise o Can include quantum noise in photon-limited systems Relationship o The number of gray levels (pixel bit depth resolution) must be greater than the SNR expressed as a ratio o Example 1: SNR = 40 dB = 1 : 10,000 10,000 > 4,096 > 256 (even 12-bit cannot resolve the full range) o Example 2: SNR = 30 dB = 1 : 1,000 256 < 1,000 < 4,096 (12-bit suffices; 8-bit does not) 26
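A short helper makes the bit-depth rule concrete, following the slide's convention that the SNR in dB corresponds to a contrast ratio of 10^(dB/10), so that 40 dB is 1:10,000 (an assumption carried over from the examples above):

```python
# Minimal check of the bit-depth vs. SNR rule, using the slide's convention
# that SNR[dB] = 10*log10(ratio) (so 40 dB -> 1:10,000).
import math

def min_bit_depth(snr_db):
    ratio = 10 ** (snr_db / 10)              # distinguishable signal levels
    return ratio, math.ceil(math.log2(ratio))

for snr in (30, 40):
    ratio, bits = min_bit_depth(snr)
    print(f"SNR {snr} dB -> 1:{ratio:,.0f} -> needs >= {bits}-bit pixels "
          f"(8-bit gives 256 levels, 12-bit gives 4096)")
```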

27 Smart Cameras Internal processing capability o CPU o FPGA Graphical development software Uses o Bar code reading o Optical character recognition o Simple detection and measurement applications Cognex EasyBuilder and InSight Camera Dalsa Sherlock and BOA Camera National Instruments LabView Matrox Design Assistant and IRIS Camera 27

28 Other Sensors Vacuum Tube - Vidicon, Plumbicon, Orthicon, Saticon, Newvicon "Orthicon" by Tecchese - Own work, licensed under CC BY-SA 3.0 via Commons Solid State CID (Charge Injection Device) Every pixel in a CID array can be individually addressed via electrical indexing of row and column electrodes Each pixel is read non-destructively Asynchronous trigger and clear Long integration times possible 28

29 Other Sensors X-Ray Ultrasound Electrical Impedance Seismic (sound waves) LIDAR (Light RADAR) o Time-of-Flight o Interferometry Courtesy of Google, Inc., Mountain View, CA 29

30 Optics Snell's Law (aka Snell-Descartes Law, Law of Refraction) Defines the relationship between the angles of incidence and refraction, when referring to light or other waves passing through a boundary (interface) between two different isotropic media (uniform in all orientations), such as water, glass, or air. $\frac{\sin\theta_1}{\sin\theta_2} = \frac{v_1}{v_2} = \frac{\lambda_1}{\lambda_2} = \frac{n_2}{n_1}$ where: $\theta$ = angle measured from the normal of the boundary, $v$ = velocity of light in the respective medium, $\lambda$ = wavelength of light in the respective medium, $n$ = refractive index of the respective medium 30
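A worked example of Snell's law, solving for the refraction angle of light entering glass from air; the index n2 = 1.517 is a typical crown-glass value assumed purely for illustration:

```python
# Worked example of Snell's law: n1*sin(theta1) = n2*sin(theta2).
import math

def refraction_angle_deg(theta1_deg, n1=1.000, n2=1.517):   # air -> crown-glass-like medium
    s = n1 * math.sin(math.radians(theta1_deg)) / n2
    if abs(s) > 1.0:
        raise ValueError("total internal reflection: no refracted ray")
    return math.degrees(math.asin(s))

print(f"{refraction_angle_deg(30.0):.2f} deg")   # ~19.2 degrees inside the glass
```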

31 Lens Specifications Lens Type o Finite / Finite Conjugates o Infinite / Infinite Conjugates o Infinite / Finite Conjugates Size (Diameter) Field-of-View (FOV) Depth-of-Field (DOF) Working Distance (WD) Magnification Numerical Aperture (NA) or f/stop Chromatic Optical Aberrations Resolution - Modulation Transfer Function (MTF) Contrast Courtesy of Edmund Optics, Barrington, NJ 31

32 Finite / Finite Conjugate Lens Type Single Element $\frac{1}{O} + \frac{1}{I} = \frac{1}{F}$ and $M = \frac{I}{O} = \frac{H_I}{H_O}$ where: F = effective focal length of lens system O = object-to-lens distance I = image-to-lens distance (image on sensor) M = magnification H_I = half height of image H_O = half height of object 32

33 Finite / Finite Conjugate Lens Type Two Elements $\frac{1}{F} = \frac{1}{F_O} + \frac{1}{F_I} - \frac{d}{F_O F_I}$ and $M = \frac{I}{O} = \frac{H_I}{H_O}$ where: F = effective focal length of lens system F_I = focal length of lens closest to image F_O = focal length of lens closest to object O = object-to-lens distance I = image-to-lens distance (image on sensor) d = distance between elements M = magnification H_I = half height of image H_O = half height of object 33
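A quick check of the two-element combination formula above, with arbitrary example focal lengths and spacing (not taken from the presentation):

```python
# Effective focal length of two thin lenses separated by distance d:
#   F = F_O * F_I / (F_O + F_I - d)

def effective_focal_length(f_obj_mm, f_img_mm, d_mm):
    return (f_obj_mm * f_img_mm) / (f_obj_mm + f_img_mm - d_mm)

print(effective_focal_length(50.0, 100.0, 20.0))   # ~38.5 mm combined focal length
```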

34 Finite / Finite Conjugate Lens Type Single Element Revisited $\frac{1}{WD} + \frac{1}{BF} = \frac{1}{F}$ and $M = \frac{BF}{WD} = \frac{H_S}{H_O}$ where: F = effective Focal Length of lens system WD = Working Distance (WD) or object-to-lens distance BF = Back Focus or image-to-lens distance M = Magnification H_S = half of Sensor Height or half height of image H_O = half of Field-of-View or half height of object 34
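These relations drive first-order lens selection in a Machine Vision system: the required magnification follows from sensor size and field-of-view, and the focal length then follows from the working distance via the thin-lens equation (F = M·WD / (M + 1)). A minimal sketch with made-up requirements:

```python
# First-order lens selection using the relations above:
#   M = sensor height / field-of-view,   F = M * WD / (M + 1),   BF = M * WD
# Numbers below are invented requirements for illustration.

def required_focal_length(sensor_h_mm, fov_h_mm, working_distance_mm):
    m = sensor_h_mm / fov_h_mm                   # magnification (image / object)
    f = m * working_distance_mm / (m + 1.0)      # effective focal length
    back_focus = m * working_distance_mm         # image-to-lens distance
    return m, f, back_focus

m, f, bf = required_focal_length(sensor_h_mm=4.8, fov_h_mm=100.0, working_distance_mm=300.0)
print(f"M = {m:.3f}, focal length ~ {f:.1f} mm, back focus ~ {bf:.1f} mm")
```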

35 Chromatic Optical Aberrations Transverse chromatic aberration (TCA) occurs when white light is used, and the red, yellow, and blue wavelengths focus at separate points in a vertical plane. Longitudinal chromatic aberration (LCA) occurs when different wavelengths focus at different points along the horizontal optical axis, since the refractive index of a glass is wavelength dependent. Mitigated using lens coatings. where: C = red light (656.3 nm) d = yellow light (587.6 nm) F = blue light (486.1 nm) 35

36 Depth-of-Field Depth-of-Field is the increment surrounding the working distance in which the object is in focus. Dependent upon the lens aperture (f/stop) Dependent upon the sensor pixel size, which defines the Circle of Confusion 36

37 Depth-of-Field With increased aperture and a constant Circle of Confusion, the DOF decreases 37

38 Depth-of-Field Numerical Aperture $NA = n \sin\theta$ Image-Space Numerical Aperture (lens is focused to infinity) $NA_i = n \sin\theta \approx \frac{D}{2F}$ f-number $f_N = \frac{F}{D} \approx \frac{1}{2\,NA_i}$ where: D = lens aperture (entrance pupil) diameter n = index of refraction (1.0 in air) θ = half-angle of the maximum cone of light that can enter or exit the lens NA = numerical aperture NA_i = image-space numerical aperture (lens focused to infinity) F = lens focal length f_N = f-number of aperture 38
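The aperture relations above, together with one common first-order depth-of-field approximation (DOF ≈ 2·c·f_N·(M+1)/M², with the circle of confusion c taken as one pixel), can be combined into a small calculator. The DOF formula and all numbers are illustrative assumptions, not values from the slides or from any lens maker's data:

```python
# Sketch relating aperture diameter, f-number, image-space NA, and a common
# first-order depth-of-field approximation: DOF ~ 2*c*fN*(M+1)/M^2.
# All values are illustrative assumptions.

def aperture_metrics(focal_length_mm, aperture_diam_mm):
    f_number = focal_length_mm / aperture_diam_mm
    na_image = 1.0 / (2.0 * f_number)        # image-space NA, lens focused at infinity
    return f_number, na_image

def depth_of_field_mm(f_number, magnification, pixel_size_mm=0.0055):
    c = pixel_size_mm                         # circle of confusion ~ one pixel
    return 2.0 * c * f_number * (magnification + 1.0) / magnification**2

fN, na = aperture_metrics(focal_length_mm=25.0, aperture_diam_mm=6.25)   # f/4 example
print(f"f-number = f/{fN:.1f}, image-space NA ~ {na:.3f}")
print(f"DOF ~ {depth_of_field_mm(fN, magnification=0.1):.1f} mm at M = 0.1")
```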

39 Infinite / Infinite Conjugate Lens Type Magnification is not dependent upon back focus or working distance $M = \frac{F_I}{F_O} = \frac{H_I}{H_O}$ where: M = magnification F_O = focal length of object-side lens F_I = focal length of image-side lens H_O = half height of object H_I = half height of image d = distance between lens elements D = lens aperture diameter 39

40 Infinite / Finite Conjugate Lens Type $M = \frac{H_I}{H_O}$, determined by the lens focal length and back focus where: M = magnification H_O = half height of object H_I = half height of image F = lens focal length I = lens-to-image distance (back focus) θ = image-side angle, dependent upon focal length D = lens aperture diameter Magnification is not dependent upon working distance 40

41 Telecentric Lens Single-Sided Telecentric Lens o Infinite/Finite conjugate lens o Magnification dependent on focal length o Magnification dependent on back-focus Double-Sided Telecentric Lens o Infinite/Infinite conjugate lens o Magnification dependent on focal length only Uses o Camera lens o Back-light lens Courtesy of Edmund Optics, Barrington, NJ 41

42 Telecentric Lens Experiment Configuration Conventional Lens Image Telecentric Lens Image Conventional Lens / Diffuse Backlight Telecentric Lens / Telecentric Backlight Courtesy of Edmund Optics, Barrington, NJ 42

43 Resolving Power Resolution Number of line-pairs per unit distance that are resolved by an imaging system. Contrast Relative gray-levels of black and white objects produced by an imaging system. Modulation Transfer Function (MTF) Plot of perceived contrast between black and white objects versus line-pairs per unit distance. Limiting Resolution Function of field-of-view, magnification, and sensor size (pixel count). Line-Pair Images Resolution Depiction Courtesy of Edmund Optics, Barrington, NJ 43
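The contrast plotted by the MTF is the modulation of the imaged line-pair pattern, conventionally defined as (I_max − I_min)/(I_max + I_min). A two-line helper shows the definition; the gray levels are made up for illustration:

```python
# Contrast (modulation) of an imaged line-pair pattern, the quantity the MTF
# plots against spatial frequency: contrast = (I_max - I_min) / (I_max + I_min).

def contrast(i_max, i_min):
    return (i_max - i_min) / (i_max + i_min)

print(f"{contrast(255, 0):.0%}")     # perfect black/white reproduction -> 100%
print(f"{contrast(153, 102):.0%}")   # washed-out reproduction -> ~20%
```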

44 Resolving Power [Figure: a black-and-white line-pair object reproduced through an imaging lens at 100% contrast and at 20% contrast] Courtesy of Edmund Optics, Barrington, NJ 44

45 Modulation Transfer Function (MTF) Courtesy of Edmund Optics, Barrington, NJ 45

46 Distortion and Spherical Aberration Distortion $\text{Distortion}\,(\%) = \frac{AD - PD}{PD} \times 100\%$ where: AD = actual distance of imaged points from center of field PD = predicted distance of imaged points from center of field with no distortion Black circles: actual locations of imaged points with distortion present Red circles: predicted locations of imaged points without distortion Spherical Aberration Light incident near the edges of the lens comes to focus too early, producing a focus range rather than a single focal plane. 46
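The distortion definition above as a tiny helper, with illustrative actual/predicted distances (not taken from any test target):

```python
# Distortion of an imaged grid point, per the definition above:
#   distortion [%] = 100 * (AD - PD) / PD

def distortion_percent(actual_dist, predicted_dist):
    return 100.0 * (actual_dist - predicted_dist) / predicted_dist

print(f"{distortion_percent(9.7, 10.0):+.1f}%")   # -3.0% (barrel: points pulled inward)
print(f"{distortion_percent(10.4, 10.0):+.1f}%")  # +4.0% (pincushion: pushed outward)
```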

47 Test Targets IEEE Resolution Chart Depth-of-Field Test Target EIA Gray Level Chart USAF 1951 Resolution Chart 47

48 Test Targets Projected Distortion Test Pattern Fixed Frequency Distortion Chart TV Test Pattern TV Color Bar Pattern 48

49 Infrared Optics (Lens) Materials [Figure: % transmittance versus wavelength (μm) for infrared lens materials across the SWIR (0.9 to 1.7 µm), MWIR (3 to 5 µm), and LWIR (8 to 14 µm) bands] From R.E. Fisher, et al., Optical System Design, 2nd Ed. Courtesy McGraw Hill, New York, NY 49

50 PART III: CNY Engineering Expo 2016 Machine Vision Processors Machine Vision Software Machine Vision Systems 50

51 SUMMARY Machine Vision is defined as the technology and methods used to automatically inspect materials, components, and manufactured systems using image-based sensors and systems. Machine Vision is used for Quality Control, Process Control, and Robotic Guidance. Machine Vision Systems include the following components: Illuminators Cameras and Sensors Optics (Lenses) Synchronization (Motion Control) Processors Machine Vision Software is able to extract, highlight, and manipulate information from a captured image, but it cannot add information. In other words, if the software can't see it, it can't process it. Machine Vision Systems Engineering is required for success. 51

52 QUESTIONS?? 52

53 REVIEW QUESTIONS and ANSWERS 1. What is Machine Vision? Machine Vision is the technology and methods used to automatically inspect materials, components, and manufactured systems using image-based sensors and systems. 2. What are the uses of Machine Vision? Machine Vision is used for Quality Control, Process Control, and Robotic Guidance. 3. What are the components that compose a Machine Vision System? A Machine Vision System includes illuminators, sensors (cameras), optics (lenses), synchronizers (motion controllers and/or encoders), and processors. 4. What types of sensors are used in Machine Vision cameras? CCD, CMOS, and bolometers. 5. What are some of the parameters used to specify lenses? Size, focal length, resolving power (MTF and contrast). 6. How excited are you to attend Part III of this series? EXTREMELY! 7. How many PDHs did you earn by attending this session? 53

54 Thank You For Attending Automated Inspection With Machine Vision Part 2 Stanley N. Hack, D.Sc., PE November 9, 2015
