Using Optics to Optimize Your Machine Vision Application

Introduction

The lens is responsible for creating sufficient image quality to enable the vision system to extract the desired information about the object from the image. The lens is critical to machine vision performance because information that is not captured by the lens cannot be re-created in software. In a typical application, the lens is required to locate features within the field of view (FOV), ensure those features are in focus, maximize contrast and avoid perspective distortion. Image quality that is adequate for one application may be insufficient for another. This white paper explains the fundamentals of using optics to optimize a machine vision application.

Basics of machine vision optics

The object area imaged by the lens is called the field of view (FOV). The FOV should cover all features that are to be inspected, with tolerance for alignment errors, and the features within the FOV must be large enough to be measured. In alignment and gauging applications, the lens is also responsible for presenting the image in a fixed geometry that is calibrated to the object's position in space. The working distance (WD) is the distance from the front of the lens to the object being imaged. The depth of field (DOF) is the maximum object depth that can be maintained entirely in focus; it also determines how much variation in the working distance can be tolerated while still achieving acceptable focus. The sensor size is the size of the camera sensor's active area, typically specified in the horizontal dimension. The primary magnification is the ratio between the sensor size and the field of view. With primary magnification held constant, reducing the sensor size reduces the field of view and increasing the sensor size increases the field of view.

Figure: Field of view, showing the working distance, sensor size, field of view and depth of field.
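The relationship between sensor size, field of view and primary magnification can be sketched in a few lines. The sensor and FOV dimensions below are illustrative values, not taken from this paper:

```python
def primary_magnification(sensor_size_mm, fov_mm):
    """Primary magnification (PMAG) = sensor size / field of view."""
    return sensor_size_mm / fov_mm

def fov_from_sensor(sensor_size_mm, pmag):
    """Field of view covered on the object for a given sensor and PMAG."""
    return sensor_size_mm / pmag

# Hypothetical example: a 6.4 mm (horizontal) sensor imaging a 100 mm FOV
pmag = primary_magnification(6.4, 100.0)
print(round(pmag, 3))                       # magnification of 0.064x
# With PMAG held constant, a smaller 4.8 mm sensor sees a smaller FOV
print(round(fov_from_sensor(4.8, pmag), 1))
```

This makes the trade-off in the text concrete: shrinking the sensor at fixed magnification shrinks the area of the object that can be inspected.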
If the sensor is large enough, it can exceed the size of the image circle created by the lens, producing dark corners in the image, an effect known as vignetting.

Resolution

Resolution is a measure of the vision system's ability to reproduce object detail. Figure (a) shows two small objects separated by a finite distance. As they are imaged through the lens onto the sensor, they are so close together that they fall on adjacent pixels. If we were to zoom in, we would see a single object two pixels in size, because the sensor cannot resolve the distance between the objects. In Figure (b), on the other hand, the separation between the objects has been increased to the point that there is a pixel of separation between them in the image. This pattern, a pixel on, a pixel off, and a pixel on, is referred to as a line pair and is used to define the pixel-limited resolution of the system.

Figure: Resolution. (a) Two objects imaged onto adjacent pixels: not resolved. (b) Objects separated by one pixel, forming a line pair: resolved.
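Because a line pair occupies two pixels, the pixel-limited resolution follows directly from the pixel count, pixel size and field of view. A minimal sketch, using illustrative sensor values that are not from this paper:

```python
def object_space_resolution_mm(fov_mm, h_pixels):
    """Smallest object feature the sensor can resolve (pixel-limited):
    a line pair needs two pixels, so the feature spans two pixels'
    worth of the field of view."""
    return 2.0 * fov_mm / h_pixels

def sensor_nyquist_lp_per_mm(pixel_size_um):
    """Sensor-side limiting spatial frequency in line pairs per mm:
    one line pair per two pixels."""
    pixel_size_mm = pixel_size_um / 1000.0
    return 1.0 / (2.0 * pixel_size_mm)

# Hypothetical example: 100 mm FOV on a 2448-pixel-wide sensor
# with 3.45 um pixels
print(round(object_space_resolution_mm(100.0, 2448), 4))  # mm per feature
print(round(sensor_nyquist_lp_per_mm(3.45), 1))           # LP/mm at the sensor
```

Note this is only the sensor's limit; as the following sections show, the lens must also deliver contrast at that spatial frequency for the detail to be usable.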

Figure shows a spark plug imaged on two sensors with different resolutions; each cell in the grid overlaid on the image represents one pixel. The resolution of the image on the left, taken with a low-resolution sensor, is not sufficient to distinguish characteristics such as spacing, scratches or bends in the features of interest. The image on the right, taken with a higher-resolution sensor, makes it possible to discern details in the features of interest. In this case, simply swapping sensors provides a considerable improvement in resolution. But as we move to more powerful sensors, we need to ensure that the optics are able to reproduce the details we need to image.

Figure: Field of view and resolution example.

Test targets can be used to determine the limiting resolution of a system and how well the sensor and optics complement each other. The USAF 1951 target shown in Figure has both horizontal and vertical lines, so it can be used to test both horizontal and vertical resolution. The lines are arranged so that their spatial frequency increases moving in a spiral towards the center of the target. Measurements are made in the frequency domain, where spatial frequency is usually measured in line pairs per millimeter (LP/mm).

Contrast

Contrast is the separation in intensity between blacks and whites in an image. The greater the difference between a black and a white line, the better the contrast. Figure shows two different images of a UPS label taken with the same high-resolution sensor at the same position and focal length but with different lenses. The lens used to take the image on the right provides higher contrast because it is a better match for the high-resolution sensor.

Figure: The importance of contrast.

Color filtering can be used to increase contrast. Figure shows a machine vision application designed to distinguish between red and green gel capsules.
The image on the left, taken with no filter, shows only a subtle difference in contrast between the different color capsules. A sensor could distinguish between the capsules in this image; however, variations in lighting or in the ambient environment could generate false positives or false negatives. Adding either a red or a green filter increases the contrast to the point that the vision solution becomes much more robust.

Figure: Color filtering, showing the sampling area with no filter, a red filter and a green filter.

Figure: USAF 1951 target.
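The contrast gain from filtering can be quantified with the Michelson contrast formula, C = (Imax - Imin) / (Imax + Imin). A minimal sketch using illustrative 8-bit gray levels, not values measured from the figures:

```python
def michelson_contrast(i_max, i_min):
    """Contrast between the brightest and darkest regions, from 0 to 1."""
    return (i_max - i_min) / (i_max + i_min)

# Hypothetical gray levels for red vs. green capsules on a mono sensor
no_filter = michelson_contrast(140, 110)   # capsules render similarly
red_filter = michelson_contrast(200, 40)   # red passes, green is blocked
print(round(no_filter, 2), round(red_filter, 2))
```

Even with made-up numbers the point stands: the filter widens the intensity separation between the two capsule colors, so the decision threshold has far more margin against lighting variation.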

Diffraction

In the real world, diffraction, sometimes called lens blur, reduces the contrast at high spatial frequencies, setting a lower limit on image spot size. The differences between ideal and real lens behavior are called aberrations. Figure 7 shows how these effects degrade the quality of the image. The object at the top of Figure 7 has a relatively low spatial frequency, while the object at the bottom has a higher spatial frequency. After passing through the lens, the upper image retains 90% contrast, while the bottom image retains far less because of its higher spatial frequency. Lens designers choose the geometry of the lens to keep aberrations within acceptable limits; however, it is impossible to design a lens that works perfectly under all possible conditions. Lenses are generally designed to operate under a specific set of conditions, such as field of view, wavelength range, etc.

Now let's look at lens performance across an entire field of view. The three images enclosed in different colors in Figure 8 are close-up views of the boxes shown in the same colors on the larger image. The chart at the bottom of Figure 9 shows the modulation transfer function (MTF) of the lens at each position in the field of view. MTF is a measurement of the ability of an optical system to reproduce various levels of detail from the object to the image, as shown by the degree of contrast in the image; it expresses the contrast preserved as a function of spatial frequency in units of LP/mm. The lens in Figure 9 has its highest average contrast in the center section, with lower contrast in the bottom middle and lower still in the corner. The image demonstrates the importance of checking the MTF of a lens over the entire area that will be used in the application.

Figure 7: Relationship between spatial frequency and contrast.
Figure 8: Lens performance differs over its field of view.
Figure 9: Modulation transfer function of a lens at three different locations in Figure 8.
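One common way to estimate an MTF curve, consistent with the definition above, is to measure the contrast of a bar pattern at several spatial frequencies and normalize by the low-frequency contrast. A sketch with hypothetical measurements (the frequencies and gray levels are illustrative assumptions, not data from the figures):

```python
def contrast(i_max, i_min):
    """Michelson contrast of a bar pattern."""
    return (i_max - i_min) / (i_max + i_min)

def mtf_curve(measurements):
    """Normalize measured contrast at each spatial frequency (LP/mm)
    by the contrast at the lowest frequency, expressing MTF as the
    fraction of low-frequency contrast that the lens preserves."""
    freqs = sorted(measurements)
    base = measurements[freqs[0]]
    return {f: measurements[f] / base for f in freqs}

# Hypothetical (bright, dark) gray levels measured at 10, 40 and 80 LP/mm
raw = {10: contrast(230, 25), 40: contrast(190, 60), 80: contrast(150, 100)}
for f, m in mtf_curve(raw).items():
    print(f, round(m, 2))
```

Repeating this measurement at the center, edge and corner of the field reproduces the kind of per-position comparison shown in Figure 9.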

Figure 10 shows the same test applied to a different lens with the same focal length and same field of view, using the same image sensor. In this case the contrast is reduced at all three positions: the center, the bottom middle and the corner. The third lens, shown in Figure 11, is different in that its performance is good in the center of the image, drops off at the corner position, and drops even further in the bottom middle. It is important to note that all three of these lenses have the same FOV, depth of field, resolution and primary magnification. The differences in their performance show how the lens can have a dramatic impact on the ability of the sensor to discern the details that are important in the application.

Figure 10: Modulation transfer function of a second lens at the three locations shown in Figure 8.
Figure 11: Modulation transfer function of a third lens at the three locations shown in Figure 8.

Depth of field

Depth of field is the difference between the closest and furthest working distances at which an object may be viewed before an unacceptable blur is observed. The F-stop number (F/#), also called the aperture or iris setting of the lens, helps to determine the depth of field. The F/# is the focal length of the lens divided by the diameter of the aperture, and is specified for most lenses with the focus at infinity. As the F/# is increased, the lens collects less light, and the absolute resolution limit of the lens is reduced as the aperture is made smaller. Making the aperture smaller (increasing the F/#) increases the depth of field, as shown in Figure 12, which compares a low F/# (large aperture) with a high F/# (small aperture). The purple lines show the depth of field and the red lines indicate the maximum blur allowable to obtain the desired resolution; increasing the allowable blur also increases the depth of field. The best focused position within the depth of field is indicated by the green line, which lies close to the end of the depth of field nearest the lens.

Figure 12: Effect of F/# (iris or aperture setting) on depth of field.

Figure 13 shows a depth of field target with a set of lines on a sloping base. A ruler on the target makes it simple to determine how far above and below the best focus the lens is able to resolve the image.

Figure 13: Target used to measure depth of field.
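The F/# definition and its light-gathering consequence can be sketched directly; the lens dimensions below are illustrative assumptions:

```python
def f_number(focal_length_mm, aperture_diameter_mm):
    """F/# = focal length / aperture diameter."""
    return focal_length_mm / aperture_diameter_mm

def relative_light(f_num):
    """Light gathered scales as 1 / (F/#)^2: each full stop
    (multiplying F/# by sqrt(2)) halves the light on the sensor."""
    return 1.0 / f_num ** 2

# Hypothetical 50 mm lens with a 12.5 mm aperture
print(f_number(50.0, 12.5))                        # 4.0, i.e. F/4
# Stopping down from F/4 to F/8 quarters the collected light
print(relative_light(4.0) / relative_light(8.0))   # 4.0
```

This is the trade-off described above: stopping down buys depth of field at the cost of light (demanding stronger illumination or longer exposure) and, eventually, of diffraction-limited resolution.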

Figure shows the performance of a short fixed focal length lens used in machine vision applications. With the aperture completely open, looking far up the target at the area defined by the red box, which lies beyond the depth of field range, we see a considerable amount of blur. With the aperture half open, the resolution at this depth of field position increases: the lines are crisper and clearer, and the numbers are now legible. But when we continue to close the iris to the point where very little light is coming in, the overall resolution is reduced and both the numbers and the lines become less clear.

The next figures show another lens, with a different focal length, that is designed specifically for machine vision applications. With the iris completely open, the lines are gray rather than black and white and the numbers, while somewhat legible, are highly blurred. With the iris halfway closed, the lines come into sharper focus and the numbers are crisper. With the iris mostly closed, the resolution improves even more in the area of interest and the image is sharp throughout the range of working distances shown.

Figure also shows this same lens, this time looking at the best focus position. With the iris completely open we see the image and numbers clearly. With the iris half open the image has become blurred, and the resolution degrades even more with the iris mostly closed.

Figure: Effect of changing the iris setting on the Double Gauss lens.
Figure: Effect of changing the iris setting on the fixed focal length lens.

Distortion

Figure 18 shows an example of distortion, an optical error (aberration) that results in a difference in magnification at different points within the image. The black dots show the positions of points on the object as seen through the lens, while the red dots show the actual positions of those points. Distortion can sometimes be corrected by the vision system, which calculates where each pixel is supposed to be and moves it to the correct position. It is quantified as:

% Distortion = ((AD - PD) / PD) x 100

where AD is the actual distance of an image point from the center of the field and PD is the predicted distance that the real-world point would be from the center of the field if distortion were not present.

Figure 18: Distortion.

As shown in Figure 19, perspective distortion is caused by the fact that the further an object is from the camera, the smaller it appears through a lens. It is particularly important in gauging and other high-precision applications. Perspective distortion can be minimized by keeping the camera perpendicular to the field of view.

Perspective distortion can also be minimized optically with a telecentric lens, as shown in Figure 20. The object consists of four pins mounted perpendicular to a base. The image captured by the conventional lens suffers from perspective distortion. The telecentric lens, on the other hand, maintains constant magnification over the depth of field, so it reduces or eliminates perspective distortion.

Figure 20: Telecentric vs. conventional lens.

Figure 21 shows perspective distortion in a real-world scenario. The object shown in the top center appears through two different lenses in the lower left and lower right images. Using a conventional fixed focal length lens produces the image on the lower left: the two parts appear to be different heights on the monitor even though they are exactly the same height in real life.
This is the same way our eyes see the objects, although our brain automatically corrects for perspective distortion so that we perceive the objects as being of equal height. In the image on the lower right, the telecentric lens has corrected for perspective distortion and the objects can be measured accurately.

Figure 19: Perspective distortion.
Figure 21: Maintaining consistent feature size.
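The distortion formula above can be applied point by point across the field; a minimal sketch, with illustrative radial positions that are not taken from the paper:

```python
def percent_distortion(actual_mm, predicted_mm):
    """% distortion = (AD - PD) / PD * 100, where AD is the measured
    distance of an image point from the field center and PD is the
    distance predicted for a distortion-free lens."""
    return (actual_mm - predicted_mm) / predicted_mm * 100.0

# Hypothetical radial positions in mm from the image center:
# negative result = barrel distortion (points pulled inward),
# positive result = pincushion distortion (points pushed outward)
print(round(percent_distortion(9.8, 10.0), 2))
print(round(percent_distortion(10.3, 10.0), 2))
```

In practice a vision system evaluates this over a grid-of-dots calibration target and warps each pixel back to its predicted position, which is the software correction the text describes.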

Conclusion

Optics are critical to the overall success of a machine vision application. The examples shown here demonstrate the importance of considering the overall system, including the optics, lighting and vision system, rather than simply picking out components. When you discuss an application with suppliers, be sure to explain the goals of the inspection completely instead of just asking for specific components, so that the supplier can contribute to the success of the application. Finally, expect a lot from your optical and vision system suppliers, and find trusted partners who are committed to the success of your application and willing to put in the effort needed to make it happen.

www.cognex.com

Copyright Cognex Corporation. All information in this document is subject to change without notice. All Rights Reserved. Cognex is a registered trademark of Cognex Corporation. All other trademarks are the property of their respective owners. Printed in the USA. Lit. No. VNEG-007