
Similar documents
ME 6406 MACHINE VISION. Georgia Institute of Technology

e2v Launches New Onyx 1.3M for Premium Performance in Low Light Conditions

FSI Machine Vision Training Programs

Image Acquisition. Jos J.M. Groote Schaarsberg Center for Image Processing

Optical basics for machine vision systems. Lars Fermum Chief instructor STEMMER IMAGING GmbH

APPLICATIONS FOR TELECENTRIC LIGHTING

Applying Automated Optical Inspection Ben Dawson, DALSA Coreco Inc., ipd Group (987)

Vision Lighting Seminar

ULS24 Frequently Asked Questions

The Importance of Wavelengths on Optical Designs

Sensors and Sensing Cameras and Camera Calibration

HR2000+ Spectrometer. User-Configured for Flexibility. now with. Spectrometers

Vixar High Power Array Technology

Cameras CS / ECE 181B

Advanced Camera and Image Sensor Technology. Steve Kinney Imaging Professional Camera Link Chairman

COLOUR INSPECTION, INFRARED AND UV

Digital Photographic Imaging Using MOEMS

ROBOT VISION. Dr.M.Madhavi, MED, MVSREC

Applied Machine Vision

Image sensor combining the best of different worlds

Coating Thickness Measurement System

Exercise questions for Machine vision

TL2 Technology Developer User Guide

Applications for cameras with CMOS-, CCD- and InGaAssensors. Jürgen Bretschneider AVT, 2014

Laser Telemetric System (Metrology)

Make Machine Vision Lighting Work for You

Image acquisition. In both cases, the digital sensing element is one of the following: Line array Area array. Single sensor

Spark Spectral Sensor Offers Advantages

High-speed Micro-crack Detection of Solar Wafers with Variable Thickness

The future of the broadloom inspection

Acquisition Basics. How can we measure material properties? Goal of this Section. Special Purpose Tools. General Purpose Tools

Detectors for microscopy - CCDs, APDs and PMTs. Antonia Göhler. Nov 2014

In the name of God, the most merciful Electromagnetic Radiation Measurement

Image Formation and Capture

Parallel Mode Confocal System for Wafer Bump Inspection

General Imaging System

Opto Engineering S.r.l.

Image Formation and Capture. Acknowledgment: some figures by B. Curless, E. Hecht, W.J. Smith, B.K.P. Horn, and A. Theuwissen

Instructions for the Experiment

Digital Cameras The Imaging Capture Path

Optimizing throughput with Machine Vision Lighting. Whitepaper

flexible lighting technology

Laser Scanning for Surface Analysis of Transparent Samples - An Experimental Feasibility Study

Introduction to Remote Sensing. Electromagnetic Energy. Data From Wave Phenomena. Electromagnetic Radiation (EMR) Electromagnetic Energy

Reflectors vs. Refractors

Spatially Resolved Backscatter Ceilometer

The Medipix3 Prototype, a Pixel Readout Chip Working in Single Photon Counting Mode with Improved Spectrometric Performance

WHITE PAPER. Sensor Comparison: Are All IMXs Equal? Contents. 1. The sensors in the Pregius series

Light. Path of Light. Looking at things. Depth and Distance. Getting light to imager. CS559 Lecture 2 Lights, Cameras, Eyes

PICO MASTER 200. UV direct laser writer for maskless lithography

instruments Solar Physics course lecture 3 May 4, 2010 Frans Snik BBL 415 (710)

The FTNIR Myths... Misinformation or Truth

Company synopsis. MSU series

Supplementary Materials

Machine Vision Basics

Cvision 2. António J. R. Neves João Paulo Silva Cunha. Bernardo Cunha. IEETA / Universidade de Aveiro

Measuring intensity in watts rather than lumens

THE OFFICINE GALILEO DIGITAL SUN SENSOR

Chapters 1-3. Chapter 1: Introduction and applications of photogrammetry Chapter 2: Electro-magnetic radiation. Chapter 3: Basic optics

SPECTRAL SCANNER. Recycling

Introduction to Computer Vision

Using Stock Optics. ECE 5616 Curtis

remote sensing? What are the remote sensing principles behind these Definition

Digital Imaging Rochester Institute of Technology

Laser Beam Analysis Using Image Processing

Applications of Optics

CRISATEL High Resolution Multispectral System

CHARGE-COUPLED DEVICE (CCD)

Eight Tips for Optimal Machine Vision Lighting

A Digital Camera and Real-time Image correction for use in Edge Location.

Image Formation: Camera Model

Beam Profiling. Introduction. What is Beam Profiling? by Michael Scaggs. Haas Laser Technologies, Inc.

DIGITAL IMAGING. Handbook of. Wiley VOL 1: IMAGE CAPTURE AND STORAGE. Editor-in- Chief

Development of a new multi-wavelength confocal surface profilometer for in-situ automatic optical inspection (AOI)

Pulsed Laser Power Measurement Systems

How does prism technology help to achieve superior color image quality?

Hyperspectral Imager for Coastal Ocean (HICO)

Gerhard K. Ackermann and Jurgen Eichler. Holography. A Practical Approach BICENTENNIAL. WILEY-VCH Verlag GmbH & Co. KGaA

Chapters 1-3. Chapter 1: Introduction and applications of photogrammetry Chapter 2: Electro-magnetic radiation. Chapter 3: Basic optics

Making Industries Smarter

Rotation/ scale invariant hybrid digital/optical correlator system for automatic target recognition

TechNote. T001 // Precise non-contact displacement sensors. Introduction

CCD-array with RTSC. Laserdiode. Multi-lens optics. Filter

771 Series LASER SPECTRUM ANALYZER. The Power of Precision in Spectral Analysis. It's Our Business to be Exact! bristol-inst.com

Beamscope-P8 Wavelength Range. Resolution ¼ - 45 ¼ - 45

Fundamentals of CMOS Image Sensors

BMC s heritage deformable mirror technology that uses hysteresis free electrostatic

Spectral signatures of surface materials in pig buildings

Evaluation of laser-based active thermography for the inspection of optoelectronic devices

CMOS Star Tracker: Camera Calibration Procedures

Optical design of a high resolution vision lens

CS 376b Computer Vision

The Xiris Glossary of Machine Vision Terminology

Section 2 ADVANCED TECHNOLOGY DEVELOPMENTS

ECEN 4606, UNDERGRADUATE OPTICS LAB

Optical Sensor Systems from Carl Zeiss CORONA PLUS. Tuned by Carl Zeiss. The next generation in the compact class

Spectral Analysis of the LUND/DMI Earthshine Telescope and Filters

Beam Shaping and Simultaneous Exposure by Diffractive Optical Element in Laser Plastic Welding

IMAGE SENSOR SOLUTIONS. KAC-96-1/5" Lens Kit. KODAK KAC-96-1/5" Lens Kit. for use with the KODAK CMOS Image Sensors. November 2004 Revision 2

Visual perception basics. Image aquisition system. IE PŁ P. Strumiłło

Transcription:

CURRENT & FUTURE TRENDS IN MACHINE VISION

Research Scientist Mats Carlin
Optical Measurement Systems and Data Analysis
SINTEF Electronics & Cybernetics, Blindern, Oslo, NORWAY
E-mail: Mats.Carlin@ecy.sintef.no
http://www.sintef.no/ecy/7210/

Introduction

Our definition of a machine vision system is a system for measurement, inspection or surveillance based on connecting an electronic camera to a computer. To be able to build successful machine vision systems, one must control the following technologies and parts of a machine vision system:

Lighting
Optics
Camera sensor
Electronics
Image processing
System integration

The purpose of this paper is to provide an overview of current trends within each of these fields and their impact on machine vision applications.

Fig.1: Machine vision systems. (Photo: Jan D. Martens)

Lighting

It is a main issue in machine vision to have full control of the lighting to achieve the proper image quality. The lighting should be designed to enhance the measurement of the wanted physical or geometrical properties. There are a number of important design factors for lighting:

Intensity
Spatial distribution
Spectral distribution
Temporal variation
Temperature sensitivity
Shielding against unwanted light

Without the proper images, we may spend awful amounts of time and money to obtain reliable measurements. The emergence of specific equipment for even illumination is the major trend in lighting. Fiber pads provide even back-light illumination, half domes provide even diffuse front-light illumination, ring lights, pits and fiber probes provide even side-light illumination, and beam-shaped lasers provide even pattern illumination. The light intensity can often be controlled directly from the computer over an RS-232 connection, and long-term temporal variation can be adjusted. The impact of this equipment is that prototyping is performed much faster, without rigorous lab testing. Standard off-the-shelf equipment is used to solve the most common machine vision tasks.

Fig.: Laser plane projection onto a steel bolt
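The computer-controlled intensity adjustment mentioned above usually amounts to sending short ASCII commands to the lighting controller over the serial line. A minimal sketch is given below; the command syntax (`CH1:INT=75`) is invented for illustration, so the real protocol must be taken from the controller's manual.

```python
# Sketch of computer-controlled lighting over RS-232, as described above.
# The ASCII command syntax used here (e.g. b"CH1:INT=75\r") is hypothetical;
# consult your lighting controller's manual for the actual protocol.

def intensity_command(channel: int, percent: float) -> bytes:
    """Format a set-intensity command, clamping to the 0-100% range."""
    percent = max(0.0, min(100.0, percent))
    return f"CH{channel}:INT={percent:.0f}\r".encode("ascii")

# With real hardware the command would be written to a serial port, e.g.
# using pyserial: serial.Serial("/dev/ttyS0", 9600).write(intensity_command(1, 75))
```

Clamping in software is a cheap guard against requesting intensities the controller cannot deliver.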

Optics

The optics is crucial for many machine vision systems. The optics is designed to collect and focus the incoming light on the sensor. Important effects of the optics are:

Geometric aberrations
Colour aberrations
Collimation
Optical transfer function (spatial resolution)
Projections
Special effects (filters, gratings, mirrors, beam-splitters, micro lenses etc.)

To obtain high-precision measurements, some of the optical effects must be corrected either by calibration or by expensive optics. It is a trend to use diffractive optic elements for a range of light-shaping tasks, such as laser beam forming, diffusers, large-scale telecentric lenses and tailored spectrometric measurements. The diffractive optic elements can be produced in plastics using much of the same technology as in Compact Disc (CD) production. Small-scale telecentric lenses are becoming state-of-the-art for most measurement applications with a field-of-view up to 30-50 mm. A telecentric lens collects only light rays within a small angle to the optical axis of the lens system and provides larger depth-of-field than ordinary lenses.

Camera sensors

The semiconductor camera sensors are based on arrays or matrices of light-sensitive elements called pixels. Silicon is light sensitive in the visible (VIS) to near-infrared (NIR) part of the electromagnetic spectrum (300-1000 nm). Other semiconductors are sensitive in other parts of the spectrum: ultraviolet (UV), mid-infrared (MIR) and far-infrared (FIR). Using special layers called scintillators, the semiconductors can even be made sensitive to X-ray radiation. Since applications in the visible part of the spectrum proliferate, silicon sensors are the most common ones. Charge-Coupled Devices (CCD) are most common today, while Charge Injection Devices (CID) and Metal-Oxide Semiconductor (MOS) sensors are used for special purposes.
The CCDs allow efficient transfer of the electronic charges from the sensor elements to the read-out electronics by a principle called "bucket brigade", where the charges are shifted from sensor element to sensor element on the chip itself. CCDs are today produced on special semiconductor process lines. The current trend is towards CMOS sensors, which can be produced by the same production process as ordinary microchips, allowing cheap sensors with the possibility of integrating processing power directly on the sensor chip. CMOS sensors allow direct access to selected pixels, a principle called active pixel access. The market for camera sensors is already divided into several segments; the machine vision cameras are better suited than standard surveillance and analog TV-quality cameras, but are more expensive. We believe that the price difference will diminish in the future, since the new progressive-scan digital video broadcasting standards are based on much of the same camera technology.

Fig.: Inspection of air brake fittings at Raufoss AS using a telecentric lens

In the future we will also see special-purpose CMOS sensors with special types of image processing performed on the chip itself. We will also see integrated sensors with several different measurement principles operating concurrently.
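The practical benefit of active pixel access is that only a region of interest needs to be digitised and transferred, instead of the whole frame. A minimal sketch, simulating the sensor as a NumPy array:

```python
import numpy as np

# Sketch of "active pixel access": unlike a CCD's bucket-brigade read-out,
# a CMOS sensor can address individual pixels, so only a region of interest
# (ROI) needs to be read out. The sensor is simulated here as a NumPy array.

def read_roi(sensor: np.ndarray, row: int, col: int, h: int, w: int) -> np.ndarray:
    """Read only the h x w window starting at (row, col), not the full frame."""
    return sensor[row:row + h, col:col + w].copy()

full_frame = np.arange(480 * 640).reshape(480, 640)   # simulated 480x640 sensor
roi = read_roi(full_frame, 100, 200, 32, 32)          # 32x32 window
# Reading 32*32 = 1024 pixels instead of 480*640 = 307200 cuts read-out
# time and bandwidth accordingly.
```

For tracking or gauging tasks where the object position is roughly known, this kind of windowed read-out is what makes very high frame rates possible.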

Important camera sensor characteristics are:

Pixel ratio and area
Pixel sensitivity, gain and saturation
Fill factor (percentage of light-sensitive area)
Pixel-to-pixel variation
Dark current (background electronic noise)
Smear and blooming
Electronic shuttering (controlling exposure)
Sensor alignment with the optical axis
Progressive-scan digital output

Some of these objectives are not possible to combine. As an example, 100% fill-factor sensors do not allow electronic shuttering, but require mechanical shuttering or strobe (pulsed) lighting, due to the architecture of the sensor itself.

Electronics

After exposure, each pixel in the sensor holds an electronic charge corresponding to the total intensity of the incoming light during the exposure. This electronic charge must be read out from the sensor, amplified and digitised, converting the analog electronic charges to digital signals that can be stored and processed on a digital computer. The trend is to put more and more of the electronics into the camera. CMOS sensors allow integration of the camera-specific electronics directly on the chip. Several machine vision cameras offer digital output and even frame buffers, which allow storage of several to a few hundred images before transfer to the computer. We believe that digital cameras will soon make the frame-grabber obsolete; each PC will soon have a plug-and-play digital video connection. The next giant step is to move general-purpose processors into the camera, making them into real "smart cameras". Several producers offer such solutions today based on special-purpose processors, but we believe the trend will be towards general-purpose processors. In the future the machine vision camera will contain a self-sustained PC, allowing transparent application development and system integration.

Fig.: Parquet floor board inspection by smart camera

The electronics introduce many new effects that we must be aware of and control:
Dynamic range of the digitisation
Gamma-factor (non-linear corrective gain)
Digitisation noise
Synchronisation of read-out and exposure
Jitter (line-to-line synchronisation)
Transmission noise
Automatic gain control
Automatic white balance
Automatic colour correction

To date, everything that is automatic is avoided in most successful machine vision applications, since the processing gets more complicated when using, for example, automatic gain. Fixed thresholds are only fixed for a specific gain.

Fig.: Days of the past? A frame grabber for machine vision with special-purpose processors

Originally the pixels have a linear light response function, but the electronics may distort the signal from the sensor. These distortions should be carefully avoided in high-precision measurement systems. Many machine vision cameras are specially designed for this task and avoid the greatest pitfalls.
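The warning that fixed thresholds are only fixed for a specific gain can be illustrated numerically. The scene values and gain settings below are invented for the sketch:

```python
import numpy as np

# Demonstration of why automatic gain control breaks fixed thresholds.
# The scene intensities and the gain values are made up for illustration.

scene = np.array([40.0, 60.0, 120.0, 200.0])   # true scene intensities
threshold = 100.0                               # threshold tuned at gain = 1.0

for gain in (1.0, 0.6):                         # AGC silently changes 1.0 -> 0.6
    pixels = np.clip(gain * scene, 0, 255)      # digitised pixel values
    bright = pixels > threshold
    print(gain, bright)
# At gain 1.0 two pixels exceed the threshold; at gain 0.6 only one does,
# so the identical fixed threshold now segments the same scene differently.
```

This is exactly why successful systems disable the automatic circuits and fix the gain, or re-derive the threshold from the image itself.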

Image processing

The images from a machine vision measurement system must be processed to extract the specific measurement information. The main task of the image-processing module is often to transform a digital image into a set of invariant measurements. It is of utmost importance to keep the image processing as simple as possible to make it work in real applications. The concept of what can be done in real time is expanding rapidly as seemingly ever-increasing amounts of computer power become available. There is a trend from simple grey-scale measurements, thresholding and edge detection towards utilising high-level shape, colour, texture and spatial information in machine vision systems. We are able to perform tasks that were unimaginable a few years ago. This leads to larger research and development projects, since more valuable tasks can be solved by machine vision systems.

Fig.: Height plot of the letter R on a cellular phone display window from I-Plast

Prototyping will be done in high-level languages with mathematical capabilities. Because of the boost in computer power, less time will be spent on optimising software code for speed, and more time will be spent on user interface and ease of use. The main limitation for many problems is no longer computer power, but our knowledge and understanding of methodology, mathematics, physics, statistics and perception.

One possible step forward in image processing will be to leave the sampled digital image domain and reconstruct the original continuous intensity distribution, to obtain better shape, colour and texture information about the images, avoiding many of the effects of quantisation and sampling noise. The curves with zero second directional derivative of the intensity distribution are, for example, the correct physical locations of the edges in an image, if we assume symmetric smearing in the optics and image formation process. These curves can be reconstructed with much higher precision from a surface representation using geometrical operations than from a pixel representation using thresholding techniques.

Theoretical foundations for shape, colour and texture are currently developing, but there are many remaining problems to be solved. The development of a consistent shape theory will require knowledge of geometry, physics and perception; colour will require knowledge of spectrometry and perception; while texture will require knowledge of the interaction between light and matter, physics and perception.

Object recognition is an important factor in many machine vision systems. The current trend is towards flexible templates, discarding fixed templates. We believe the largest challenge in object recognition is to make the systems automatically or semi-automatically configurable, by allowing the systems to learn the template shape and the allowed deviation of the template from real samples, or by specifying a template for measurements manually in a user-friendly graphical user interface. We believe there will be a trend towards modelling the physics of image formation in future machine vision systems. We will also see a thrust towards understanding human perception more thoroughly.

Advantages of machine vision:

100% inspection and control
Objective measurements
Non-contact measurements
High accuracy
High capacity
High flexibility, reprogramming is possible
Traceability
Scalability
System duplication is straightforward
Mass production is relatively cheap
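The edge-location idea above, finding where the second derivative of the intensity profile crosses zero, can be sketched in one dimension. The smoothed step profile below is synthetic; with linear interpolation between samples the zero crossing lands at a sub-pixel position, which simple thresholding cannot provide:

```python
import numpy as np

# Sub-pixel edge location at the zero crossing of the second derivative,
# as discussed above. The smoothed step-edge profile below is synthetic.

profile = np.array([10.0, 10.0, 12.0, 30.0, 70.0, 88.0, 90.0, 90.0])
second = np.diff(profile, 2)          # discrete second derivative at x = 1..6

# Find the sign change of the second derivative and interpolate linearly
# between the two samples to estimate the zero-crossing position.
i = np.where(np.diff(np.sign(second)) != 0)[0][0]
x0 = (i + 1) + second[i] / (second[i] - second[i + 1])
print(x0)   # -> 3.5, the sub-pixel edge position on this symmetric ramp
```

A whole-pixel threshold would place this edge at pixel 3 or 4 depending on the threshold value; the zero crossing pins it between them, which is why surface-based geometric operations outperform pixel thresholding for precision gauging.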

System integration

Most machine vision systems for measurement, inspection and surveillance are an integrated part of a larger system. The machine vision system must be able to communicate in real time with the other parts of the system to report results, initiate actions like generating alarms, sorting and rejection of the measured objects, and build reliable measurement models. In addition, the equipment must meet certain environmental standards to endure varying mechanical stress, temperature, vibrations, electromagnetic noise and air quality (dust, dirt).

Many new, small technology-driven companies will emerge based on image processing solving particular tasks. These companies will have to market their equipment or software on the global market, or to a strong home market, to survive. We have pointed out the trends towards standard illumination equipment, advanced optical modules, digital cameras and general-purpose processors. For many tasks the hardware will be directly off the shelf, allowing faster and cheaper system integration. A few professional system integrators will probably dominate the Norwegian market because of their ability to solve simple machine vision problems relatively cheaply using standardised equipment and solutions. Special integrated machine vision equipment complying with industry standards already exists for simple machine vision tasks, including low-resolution gauging, state checking, counting and sorting of mechanical parts with a simple geometric design passing by on the process line.

Summary

We have presented some current and future trends in machine vision, both in specific equipment and in machine vision image processing. We have tried to shed light on the impact of these trends on machine vision applications, research and development. The main trends are towards a segmented market, with a relatively high-volume, low-price segment solving simple machine vision tasks.

Fig.: Trilobite scanned with laser plane triangulation

Research, development and consulting must move towards more difficult, challenging, specific and more valuable problems to solve.