Exercise questions for Machine vision


This is a collection of exercise questions. The questions are exam-like, which means that similar questions may appear at the written exam. I have divided the questions and related them to the topics covered in Lectures 1 to 7.

Lecture 1:

1. Consider a machine vision system for Optical Character Recognition (OCR). List and briefly explain the fundamental steps of image processing for such a system.

2. Briefly motivate why a machine vision system should be used for optical quality inspection of items produced in a factory. Also give the pros and cons.

Lecture 2 and 3:

3. Compare the following three architectures for CCD sensors: full-frame CCD, frame-transfer CCD, and interline-transfer CCD. Draw simple architectural layouts for each sensor, and explain the differences, advantages, and drawbacks of the architectures.

4. How is the Signal-to-Noise Ratio (SNR) related to the architectural design of an area scan image sensor? What can be done to improve the SNR for a given architecture?

5. Explain the two most common modes for readout of area scan sensors: progressive and interlaced scan. Explain the advantages and disadvantages of the two modes.

6. What is a telecentric lens, what is its most important feature, and for what purposes can it be used? Draw a sketch of the basic light rays for a telecentric lens and explain its function in comparison with standard Gaussian optics.

7. Explain briefly the following characteristics of illumination:
- Diffuse light
- Directed light
- Telecentric light
- Front light
- Back light
- Bright field
- Dark field

[Pictures A, B, and C]

8. Match pictures A to C above with the following choices: F = 2, F = 16, and telecentric lens. Each picture matches only one choice. Motivate briefly why you think a picture matches your choice.

9. First, a list of properties related to the illumination of a machine vision system:
- Diffuse light
- Directed light
- Telecentric light
- Front light
- Back light
- Bright field
- Dark field
The following four pictures show different types of illumination systems with cameras. Assign the right properties to each of the illumination systems. There can be more than one relevant property for an illumination system.

[Pictures A, B, C, and D]

10. A Gaussian lens is used to project an image of a car at a far distance. It must be possible to resolve the car's two headlights, with a spacing of 1.5 meters, at a distance of 600 meters from the camera. What will be the absolute maximum pixel size of an area scan sensor if the focal length of the lens is 8 mm?

    1/s' - 1/s = 1/f'
    β = h'/h = s'/s

11. Explain briefly the effects of reducing the aperture of a lens system in front of a pixel area sensor, in terms of depth of field, signal-to-noise ratio, and image resolution.
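As a sketch of how the thin-lens relations above can be applied to question 10 (the variable names are my own, and this is one line of reasoning, not an official solution):

```python
# Worked numbers for question 10 (a sketch, not an official solution).
# Thin lens: 1/s' - 1/s = 1/f'. For an object at s = 600 m >> f', the image
# forms at s' ~= f', so the magnification is |beta| = s'/s ~= f'/s.
f_prime = 8e-3   # focal length [m]
s = 600.0        # object distance [m]
h = 1.5          # headlight spacing in the object plane [m]

beta = f_prime / s        # magnification for a distant object
h_image = beta * h        # headlight spacing on the sensor [m]
max_pixel_um = h_image * 1e6

# The two headlight images land max_pixel_um apart on the sensor, so the
# pixel pitch may be at most that spacing (a stricter Nyquist argument
# would halve it).
print(f"maximum pixel size: {max_pixel_um:.1f} um")
```

With these numbers the image-side spacing comes out to 20 µm, which bounds the pixel pitch from above.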

12. a) Five different sources of noise that can appear in images made with, for example, CMOS or CCD sensors:
- Shot noise (in amplifiers)
- Sensitivity variations between pixels
- Photon noise
- Dark signal variations
- Thermal noise
Assign the above five sources of noise to the two following classes of noise. Each class will have more than one noise source, and the two classes together will have a total of five noise sources.
Classes of noise:
- Temporal noise
- Spatial noise
b) Suggest a method for suppressing the temporal noise.
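One classic way to suppress temporal noise, relevant to question 12 b), is frame averaging of a static scene: temporal noise is uncorrelated between frames, so averaging N frames reduces its standard deviation by a factor of √N, while spatial (fixed-pattern) noise is untouched. A minimal simulation (the scene, noise level, and frame count are made-up illustration values):

```python
import numpy as np

# Temporal-noise suppression by frame averaging (question 12 b), sketched.
# Averaging N frames of a static scene reduces the temporal noise standard
# deviation by sqrt(N); fixed-pattern (spatial) noise is NOT reduced.
rng = np.random.default_rng(0)
clean = np.full((64, 64), 100.0)                          # static scene
N = 16
frames = clean + rng.normal(0.0, 4.0, size=(N, 64, 64))   # temporal noise, sigma = 4

single_sigma = frames[0].std()          # one frame: sigma ~ 4
averaged_sigma = frames.mean(axis=0).std()  # 16-frame average: sigma ~ 4/sqrt(16) = 1

print(single_sigma, averaged_sigma)
```

The averaged frame's residual noise is about a quarter of the single-frame noise here, matching the √N prediction for N = 16.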

Lecture 4:

13. Figure 1 depicts a diagram of the amplitude characteristic of a 2D linear filter. What kind of filter is this: high pass, low pass, or band pass? What do you expect the visual effect on an image to be if this filter were applied to it?

Figure 1. Amplitude characteristic for a 2D filter.

14. The three pictures above show the amplitude transfer function of a 2D Butterworth filter. The amplitude characteristic is illustrated as a mesh plot, as an intensity image, and as a radial plot for different orders n of the filter.
a) What class of filter is this: low pass, high pass, band pass, or band stop?
b) Which one of the pictures labeled A to E is filtered using the smallest value of r, as defined in the amplitude characteristics above?

[Original image, and filtered versions A, B, C, D, E]

15. Explain briefly what kind of image processing operations are necessary for high-quality downscaling of an image.

16. Figure 2 shows a picture of the silhouette of a screw taken with back lighting. The silhouette is extracted at subpixel precision by image processing. Suggest a method for how this image processing can be done.

Figure 2. Screw thread.

17. Assume a gray-level image f(r,c) and its smoothed counterpart g(r,c). The region of interest is R. Then the dynamic thresholding of brighter objects on a dark background can be defined as

    S = {(r,c) ∈ R : f(r,c) - g(r,c) ≥ g_diff}

where g_diff is a fixed constant. Pictures A and B both have bright spots on a darker background. Compared with simple global thresholding, which one of the pictures A or B will require the use of dynamic thresholding in order to successfully segment the bright spots from the background? Motivate your answer briefly. If you answer with a long and imprecise story, your credits will be reduced.

[Pic. A, Pic. B]
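The dynamic thresholding rule in question 17 can be sketched in a few lines of numpy. The smoothing method is not specified in the question, so a simple box filter stands in for g(r,c) here, and the toy image, window size, and g_diff value are my own choices:

```python
import numpy as np

# Dynamic thresholding sketch: S = {(r,c) in R : f(r,c) - g(r,c) >= g_diff},
# with g a smoothed copy of f. A box filter stands in for the (unspecified)
# smoothing; all names and values are illustration choices.
def box_smooth(f, k=5):
    """Mean filter with a k x k window (edge handling via edge padding)."""
    pad = k // 2
    fp = np.pad(f.astype(float), pad, mode="edge")
    out = np.zeros(f.shape, dtype=float)
    for dr in range(k):
        for dc in range(k):
            out += fp[dr:dr + f.shape[0], dc:dc + f.shape[1]]
    return out / (k * k)

def dynamic_threshold(f, g_diff, k=5):
    g = box_smooth(f, k)
    return (f - g) >= g_diff   # boolean mask of segmented bright pixels

# Toy example: a bright spot on a sloping (inhomogeneous) background is
# found by the local test even though no global threshold would work.
r, c = np.meshgrid(np.arange(32), np.arange(32), indexing="ij")
f = 2.0 * c                      # background brightness ramps left to right
f[15:18, 15:18] += 50.0          # small bright spot
mask = dynamic_threshold(f, g_diff=20.0)
```

Because the smoothed image g tracks the background ramp, f − g is near zero everywhere except at the spot, which is exactly why dynamic thresholding copes with uneven illumination where a global threshold fails.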

18. Image (2) was processed by histogram equalisation to create image (1).
a) Which one of the histograms A and B corresponds to image (1), and which to image (2)? Explain and motivate.
b) How is the gray-level transformation function computed for histogram equalisation?

[Image (1), Image (2); Histogram A and Histogram B, plotted over gray levels 0 to 250]
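For question 18 b): the gray-level transformation used by histogram equalisation is the scaled cumulative distribution function of the input histogram, T(g) = (L−1)·CDF(g). A minimal sketch (the test image is a made-up low-contrast example):

```python
import numpy as np

# Histogram equalisation sketch: the transformation is the scaled CDF of
# the input histogram, applied as a lookup table.
def equalize(image, levels=256):
    hist = np.bincount(image.ravel(), minlength=levels)
    cdf = np.cumsum(hist) / image.size
    lut = np.round((levels - 1) * cdf).astype(np.uint8)  # T(g) = (L-1)*CDF(g)
    return lut[image]

# Toy low-contrast image: gray values squeezed into [100, 120).
rng = np.random.default_rng(1)
img = rng.integers(100, 120, size=(32, 32)).astype(np.uint8)
eq = equalize(img)
```

After equalisation the gray values spread over nearly the full range, which is the flattened-histogram effect the question asks about.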

Lecture 5:

19. The following equation defines a morphological operation:

    OP = {x : (B)_x ⊆ A}

a) What is the name of this operation?
b) Which one of the pictures below is the correct graphical illustration of the effect of that operation on a binary image A with a structuring element B?

[Illustration A, Illustration B, Illustration C, Illustration D]

20. Two image points (x1,y1) and (x2,y2), lying on a single line, are shown. The corresponding lines in a parameter space are also shown. This transformation can be utilized in the Hough transform to find lines in an image. Explain briefly how this detection of lines works in the Hough transform and how the parameters of that line can be measured.
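The Hough-transform idea in question 20 can be made concrete with a small accumulator-array sketch, here using the common (ρ, θ) parameterisation ρ = x·cos θ + y·sin θ (the resolution and the test line are my own choices):

```python
import numpy as np

# Minimal Hough transform sketch: every point votes for all (rho, theta)
# pairs of lines it could lie on; collinear points produce sinusoids that
# intersect in one accumulator cell, and that peak gives the line parameters.
def hough_peak(points, rho_max=64, n_theta=180):
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * rho_max, n_theta), dtype=int)
    for x, y in points:
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + rho_max, np.arange(n_theta)] += 1   # one vote per theta
    r_idx, t_idx = np.unravel_index(acc.argmax(), acc.shape)
    return r_idx - rho_max, thetas[t_idx]

# Points on the horizontal line y = 10, i.e. theta = 90 degrees, rho = 10.
pts = [(x, 10) for x in range(0, 40, 2)]
rho, theta = hough_peak(pts)
```

The accumulator peak recovers ρ = 10 and θ = π/2, i.e. the parameters of the line through the voting points.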

21. Figure 3 depicts an image object A and a structuring element B used for the morphological operation dilation. Dilation is defined as

    A ⊕ B = {x : (B̂)_x ∩ A ≠ ∅}

where M̂ is the reflection of a region M and M_x is the translation of region M by a vector x. Draw a clear picture and show what A ⊕ B will look like.

Figure 3. Image object A and the structuring element B used for morphological operations.

22. Explain how template matching works, and suggest a method for coping with the increasing execution time of template matching as the resolution of the image is increased.

23. The following drawing shows a square-shaped region A of pixels belonging to one single image component in a binary image. Region B is a circular structuring element with diameter 2 and with its origin at the center, indicated with a dark spot.

[Region A: a square with side 3; region B: a circle with diameter 2]

A morphological operation OP is defined as

    OP = {x : (B)_x ⊆ A}

a) What is the name of this operation as known in all reference literature?
b) Make a simple sketch with the right proportions showing the visual effect on region A after applying the morphological operation OP with the structuring element B. Also indicate in your drawing what the size of the processed region will be. I want just the sketch as an answer, nothing else.

24. The binary picture to the left shows vertical and horizontal lines with a width of 5 pixels. The distances between lines are at least 30 pixels. Consider the lines as belonging to region A. The drawing to the right shows a structuring element B of size 10 pixels by 1 pixel.

A morphological operation OP1 is defined as

    OP1 = {x : (B)_x ⊆ A}

Another morphological operation OP2 is defined as

    OP2 = {x : (B̂)_x ∩ A ≠ ∅}

a) What are the names of operations OP1 and OP2?
b) First apply OP1 to region A, then apply OP2 such that C = OP2(OP1(A)). Make a drawing and show what region C will look like.

25. An Edge Histogram Descriptor (EHD) is computed on the two pictures shown below.

[Picture A, Picture B]

Estimate and illustrate the EHD for the two pictures A and B. Explain what the diagrams show and why they look the way they do.
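The set definitions used in questions 19 to 24 can be written out directly in numpy. The following sketch puts the element origin at its top-left corner for simplicity; since the erode-then-dilate composition (opening) is independent of the choice of origin, this does not change the result. The miniature 40×40 image and line layout are my own stand-ins for the picture in question 24:

```python
import numpy as np

# Binary erosion {x : (B)_x subset of A} and dilation {x : (B^)_x meets A},
# sketched directly (origin of B at its top-left corner for simplicity).
def erode(a, b):
    """out[r, c] = 1 iff b translated to (r, c) fits entirely inside a."""
    H, W = a.shape
    h, w = b.shape
    out = np.zeros_like(a)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            if np.all(a[r:r + h, c:c + w][b == 1] == 1):
                out[r, c] = 1
    return out

def dilate(a, b):
    """Union of copies of b stamped at every set pixel of a (clipped at borders)."""
    out = np.zeros_like(a)
    for r, c in zip(*np.nonzero(a)):
        patch = out[r:r + b.shape[0], c:c + b.shape[1]]
        patch |= b[:patch.shape[0], :patch.shape[1]]
    return out

# Question 24 in miniature: 5-pixel-wide lines, 10 x 1 horizontal element.
img = np.zeros((40, 40), dtype=np.uint8)
img[:, 10:15] = 1            # vertical line (5 pixels wide)
img[20:25, :] = 1            # horizontal line (5 pixels wide)
B = np.ones((1, 10), dtype=np.uint8)

opened = dilate(erode(img, B), B)   # C = OP2(OP1(A)): the opening of A by B
```

The vertical line disappears (no 10-pixel horizontal translate of B fits inside its 5-pixel width), while the horizontal line survives intact, which is the qualitative effect question 24 b) asks you to draw.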

Lecture 6:

26. Explain briefly how a minimum distance classifier works. What kind of a priori statistics are computed for the training sets?

27. Assume that you are going to apply a segmentation algorithm in a machine vision system built to inspect the colours, sizes, and shapes of cookies on a conveyor belt before packaging. Damaged cookies, burnt cookies, cookies with non-circular shapes, or cookies out of the size range must be rejected by the system. The intensity of the illumination for this machine vision system is inhomogeneously distributed (not of constant intensity) over the observation area. What kind of segmentation algorithm would you select for this task? Discuss and motivate your selection of algorithm, and also discuss properties of other system components that might be important to consider for this selection.

Lecture 7:

28. Draw a picture and explain how a sheet-of-light laser can be used together with an area scan sensor for the acquisition of a 3D surface, based on triangulation techniques. Just explain the measurement principle and how it works.

29. Figure 4 depicts a schematic setup for stereo imaging based on two image sensors and an object W at distance Z given by

    Z = λb / (x2 - x1)

The object W is projected onto image sensors 1 and 2 at positions (x1,y1) and (x2,y2) respectively. Explain what kind of image processing is necessary in order to measure the distance Z from the two sensors to the object W. Relate your explanation to the given expression for Z.

Figure 4. Stereo imaging.

30. The position of a laser line projected onto an image detector versus the height of the object is shown in Figure 5. One curve represents measured values used for calibration, and a second curve shows a computed transfer function. These curves come from a setup for laser scanning used to capture a 3D surface. Figure 5 shows an almost perfectly linear relation between pixels and height. From measurements it was shown that the standard deviation of the computed position of the laser line was 0.2 pixels.
a) Explain briefly what property of the captured images limits the precision of the laser line position to 0.2 pixels.
b) What is the precision of the height measurement that this scanner can achieve?

Figure 5. Height [mm] versus position on the image detector [pixels]: comparison of computed and measured levels (measured slope = 0.014406, computed slope = 0.014209, calibration reference level = 3.3139 mm, deviation in slopes = 0.0089986).
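The expression Z = λb/(x2 − x1) in question 29 needs the disparity x2 − x1, and the image processing that produces it is a correspondence search between the two sensor images. A minimal one-dimensional block-matching sketch, where the signal, the 10 µm pixel pitch, and the λ and b values are all made-up illustration numbers:

```python
import numpy as np

# Correspondence search sketch for stereo depth: match a patch from sensor 1
# in sensor 2 (sum of squared differences), read off the disparity, then
# apply Z = lambda * b / (x2 - x1). All numbers are illustration values.
def find_disparity(row1, row2, x1, half=3):
    """Locate the patch around x1 of row1 inside row2; return x2 - x1."""
    patch = row1[x1 - half:x1 + half + 1]
    best, best_ssd = None, np.inf
    for x2 in range(half, len(row2) - half):
        cand = row2[x2 - half:x2 + half + 1]
        ssd = np.sum((patch - cand) ** 2)
        if ssd < best_ssd:
            best, best_ssd = x2, ssd
    return best - x1

rng = np.random.default_rng(2)
row1 = rng.normal(size=200)           # one scan line from sensor 1
true_d = 12
row2 = np.roll(row1, true_d)          # sensor 2 sees the scene shifted
d = find_disparity(row1, row2, x1=60)

lam, b = 8e-3, 0.12                   # focal length [m], baseline [m] (made up)
Z = lam * b / (d * 10e-6)             # disparity converted via 10 um pixel pitch
```

The smaller the disparity, the larger Z, which is the inverse relation the given expression encodes; the hard part in practice is exactly this matching step.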

31. The intensity profile of an imaged laser line is shown in Figure 6. When the Center Of Gravity (COG) is computed to find the position of the laser line in one of the spatial dimensions, a threshold can be used.
a) Explain and motivate why this threshold is used for a laser scanner.

Figure 6. Gray level versus pixels for an imaged laser line, with the threshold level indicated.

32. A laser scanner uses a step size of 0.5 mm. What is the highest frequency along the scanning dimension that can be resolved?

33. A laser scanner uses a telecentric lens with an optical magnification of 0.25. The pixel size of the image detector is 10 µm. What is the highest frequency along the laser line that can be resolved?
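To make the COG computation in question 31 and the sampling limits in questions 32 and 33 concrete, here is a small sketch. The synthetic profile, the threshold value, and the threshold-subtracting COG variant are my own choices, and the Nyquist readings of questions 32 and 33 are one common line of reasoning, not official solutions:

```python
import numpy as np

# COG of a thresholded laser-line profile: the threshold suppresses
# background pixels whose noise would otherwise bias the subpixel estimate.
def cog_position(profile, threshold):
    idx = np.arange(len(profile))
    p = np.where(profile >= threshold, profile - threshold, 0.0)
    return np.sum(idx * p) / np.sum(p)

# Synthetic profile: a Gaussian peak centred at 101.5 on a flat background.
x = np.arange(256, dtype=float)
profile = 10.0 + 200.0 * np.exp(-0.5 * ((x - 101.5) / 2.0) ** 2)
pos = cog_position(profile, threshold=50.0)   # subpixel position ~ 101.5

# Question 32: sampling step 0.5 mm -> Nyquist limit 1 cycle per mm.
f_scan = 1.0 / (2 * 0.5)
# Question 33: a 10 um pixel at magnification 0.25 spans 10/0.25 = 40 um
# in the object plane -> Nyquist limit 12.5 cycles per mm.
f_line = 1.0 / (2 * 0.040)
```

The COG lands on the true subpixel centre of the peak, and the two Nyquist computations show how step size and object-plane pixel footprint each cap the resolvable frequency at 1/(2·sample spacing).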