Chapter 2 Fourier Integral Representation of an Optical Image

This chapter describes optical transfer functions. The concepts of linearity and shift invariance were introduced in Chapter 1. This chapter continues that discussion by applying those concepts to optical imaging components and systems. Images are two dimensional and are accurately described by two-dimensional Fourier integrals. Common practice, however, is to analyze or measure horizontal and vertical frequency response and then use the results to characterize imager performance. This chapter describes the errors that result from assuming that an imager is characterized by its horizontal and vertical frequency response. Throughout this tutorial, the mathematics is at the introductory calculus level. However, the descriptive arguments require some familiarity with the concepts of Fourier analysis and complex functions.

2.1 Linear shift-invariant optical systems

In Fig. 2.1, a simple optical system images a clock onto a screen. For simplicity, unity magnification is assumed. If each point in the scene is blurred by the same amount, then the system is shift invariant. If the image intensity profile equals the sum of the individual blurs from each point in the scene, then the system is linear. The optical blur is called the point spread function (PSF); it is illustrated in the lower left corner of the image. Each point source in the scene becomes a PSF in the image. The PSF is also called the impulse response of the system.

Figure 2.1 Clock being imaged by a lens onto a screen; a point source in the scene (upper right) becomes a point-spread-function blur in the image (lower left).

Each point in the scene is blurred by the optics and projected onto the screen. This process is repeated for each of the infinite number of points in the scene. The image is the sum of all of the individual PSFs. Two considerations are important here. First, the process of the lens imaging the scene is linear, and therefore superposition holds: the image is accurately represented by the sum of the PSFs resulting from the lens imaging each individual scene point. Second, it is assumed that the shape of the optical blur (that is, the shape of the PSF) does not depend on position within the field of view.

In most optical systems, the PSF is not constant over the entire field of view. Typically, optical aberrations vary with field angle; the optical blur is generally smaller at the center of an image than it is at the edge. However, the image plane can generally be subdivided into regions within which the optical blur is approximately constant. A region of the image with approximately constant blur is sometimes called an isoplanatic patch. Optical systems are linear and shift invariant over isoplanatic regions of the field of view.

The image within an isoplanatic patch can be represented as a convolution of the PSF over the scene. If h(x, y) represents the spatial shape (the intensity distribution) of the PSF, then h(x − x′, y − y′) represents a PSF at location (x′, y′) in the image plane. The units of x and y are milliradians (mrad). Let s_cn(x′, y′) describe the brightness of the scene and i_mg(x, y) describe the brightness of the image. Then

    i_mg(x, y) = ∫∫ h(x − x′, y − y′) s_cn(x′, y′) dx′ dy′.    (2.1)

Each point in the scene radiates independently and produces a PSF in the image plane with the corresponding intensity and position. The image is a linear superposition of the resulting PSFs. Mathematically, that result is obtained by convolving the optical PSF over the scene intensity distribution to produce the image. Since a convolution in space corresponds to a multiplication in frequency, the optical system acts as a spatial filter:

    I_mg(ξ, η) = H(ξ, η) S_cn(ξ, η),    (2.2)

where I_mg(ξ, η) is the Fourier transform of the image, S_cn(ξ, η) is the Fourier transform of the scene, and H(ξ, η) is the optical transfer function (OTF). ξ and η are spatial frequencies in the x and y directions, respectively; their units are cycles per milliradian (mrad⁻¹).

The OTF is the Fourier transform of the PSF h(x, y). However, in order to keep the image intensity proportional to the scene intensity, the OTF of the optics is normalized by the total area under the PSF blur spot:

    H(ξ, η) = [∫∫ h(x, y) e^(−jξx) e^(−jηy) dx dy] / [∫∫ h(x, y) dx dy].    (2.3)

The MTF of the optics is the magnitude |H(ξ, η)| of the function H(ξ, η).

Note that the relationship in Eq. (2.2) applies between the scene and the image plane of a well-corrected optical system. The optical system is considered well corrected because the PSF (the optical blur) is reasonably constant over the image plane.

Optical systems often have multiple image planes. The first image becomes the scene that is imaged by the second set of optical elements. For example, the image in Fig. 2.1 might be re-imaged by another lens, as shown in Fig. 2.2. In this case, each point in the original image is blurred by the PSF of the next set of optics. If the OTF of the second lens is H_2(ξ, η), then

    I_mg(ξ, η) = H_2(ξ, η) H(ξ, η) S_cn(ξ, η).    (2.4)

The total system MTF is the product of the individual MTFs. One caution is necessary here: diffraction is caused by the limiting aperture in the imager, and in a system with multiple image planes the diffraction MTF is applied only once.

Figure 2.2 The picture is further blurred by imaging with a second lens. The OTF from the scene to the display is the product of the individual lens OTFs.

The transfer function between scene and display is the product of the optics MTF, detector MTF, display MTF, and the MTFs of other factors that blur the image. Any blurring of the image can be treated as an MTF as long as the blur is constant over the entire image. For example, the active area of a detector acts as an optical PSF: the larger the active detector area, the more blurred the image. Light falling anywhere on the detector area is summed together. The detector area convolves with the scene to blur the image in the same way that the optical PSF blurs the image. The MTF of the detector is the Fourier transform of the detector photosensitive area, and the display MTF is the Fourier transform of a display pixel-intensity pattern. In the absence of sampling artifacts, the Fourier transform of the displayed image is the Fourier transform of the scene multiplied by the product of the optics, detector, display, and other component MTFs.

2.2 Equivalence of spatial and frequency domain filters

Equations (2.1) and (2.2) describe the filtering process in the space domain and the frequency domain, respectively. In space, the output of an LSI system is the input convolved with the system impulse response (in this case, the optical PSF). Consider the example shown in Fig. 2.3: the system is a simple lens imaging the transparency of a four-bar target. Given that the lens blur is the same across the field of view, the system is LSI. The output image is the transparency intensity convolved with the lens PSF.
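The OTF normalization and the MTF cascade described above can be sketched numerically. The following Python fragment is a minimal illustration, not part of the original text: the Gaussian blur spot, the grid size, and the blur width are all hypothetical choices standing in for a real optical PSF.

```python
import numpy as np

# Hypothetical Gaussian blur spot standing in for an optical PSF
# (illustration only; any nonnegative blur spot would do).
n = 64
x = np.arange(n) - n // 2
X, Y = np.meshgrid(x, x)
psf = np.exp(-(X**2 + Y**2) / (2.0 * 3.0**2))

# Eq. (2.3): the OTF is the Fourier transform of the PSF, normalized
# by the integrated PSF (the area under the blur spot), so H(0, 0) = 1.
otf = np.fft.fft2(psf) / psf.sum()

# The MTF is the magnitude of the OTF; the normalization keeps it <= 1.
mtf = np.abs(otf)

# Eq. (2.4): with a second imaging stage, component OTFs multiply.
otf2 = np.fft.fft2(psf) / psf.sum()  # second (identical) lens, for illustration
system_mtf = np.abs(otf2 * otf)
```

At zero spatial frequency `mtf[0, 0]` is 1 by construction, and the cascaded `system_mtf` is nowhere larger than either component MTF, reflecting the statement that each stage can only blur the image further.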

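As a concrete check on the equivalence of the two filtering routes, the sketch below (illustrative Python; the random scene and the small uniform blur spot are hypothetical) computes the image once by direct spatial convolution, in the spirit of Eq. (2.1), and once by multiplying transforms and inverting, as in Eq. (2.2). On a periodic sampling grid the two results agree to machine precision.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = rng.random((32, 32))     # hypothetical scene intensity
psf = np.zeros((32, 32))
psf[:3, :3] = 1.0 / 9.0          # small uniform blur spot, unit volume

# Space domain, Eq. (2.1): superpose shifted copies of the scene
# weighted by the PSF (a circular convolution on the periodic grid).
image_space = np.zeros_like(scene)
for dx in range(3):
    for dy in range(3):
        image_space += psf[dx, dy] * np.roll(scene, (dx, dy), axis=(0, 1))

# Frequency domain, Eq. (2.2): multiply the transforms, then invert.
image_freq = np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(psf)).real

# The two routes give the same image.
print(np.allclose(image_space, image_freq))  # True
```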
Figure 2.3 Spatial filtering in an optical system.

Figure 2.4 illustrates frequency domain filtering. The two-dimensional Fourier transform of the scene intensity is taken. The input spectrum clearly shows the fundamental harmonic of the four-bar target in the horizontal direction. The higher-order harmonics are difficult to see because they have less amplitude than the fundamental. The Fourier transform of the image is obtained by multiplying the Fourier transform of the scene by the Fourier transform of the PSF. The output image is found by taking the inverse transform of the product. The resulting image is identical to that given by convolution with the PSF in the space domain.

In Fig. 2.4, the direct-current component of the input, transfer, and output frequency spectra has been removed so that the higher-frequency components are visible. Otherwise, all that would be seen is a bright point in the middle of the picture.

Figure 2.4 Frequency domain filtering in an optical system: the object is Fourier transformed, the input frequencies are multiplied by the lens MTF, and the inverse transform of the output frequencies gives the image.

LSI imaging system analysis can be performed using two methods: spatial domain analysis and frequency domain analysis. The results given by the two analyses are identical, but frequency domain analysis has an advantage. Equations (2.1) and (2.3) both involve double integrals; however, an imager has many component MTFs. Using Eq. (2.1) involves calculating double integrals for line-of-sight blur, diffraction blur, optical aberration blur, detector blur, digital filtering blur, display blur, and eyeball blur. Using Eq. (2.3) involves double integrals only to find the Fourier transform of the scene and a second double integral to find the spatial image; the intervening calculations involve multiplying the various component MTFs. Fourier domain analysis is used because it provides accurate results with reduced computation.

2.3 Reducing LSI Imager Analysis to One Dimension

It is common to analyze imagers separately in the horizontal and vertical directions. The two-dimensional imager MTF is assumed to be the product of horizontal and vertical MTFs. This assumption reduces two-dimensional Fourier integrals to two one-dimensional Fourier integrals; the one-dimensional treatment therefore saves computation. The separability assumption is almost never satisfied, however, even in the simplest cases, and assuming separability virtually always leads to some error in the result. Nonetheless, the majority of scientists and engineers use the product of horizontal and vertical frequency response as the imager MTF. This section discusses some of the errors that result from this common simplification.

Separability in Cartesian coordinates requires that a function of (x, y) can be expressed as the product of a function of x times a function of y:

    f(x, y) = f_x(x) f_y(y).    (2.5)

If Eq. (2.5) is true, then the Fourier transform is also separable, and Eq. (2.6) holds:

    F(ξ, η) = F_x(ξ) F_y(η).    (2.6)
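The separability conditions in Eqs. (2.5) and (2.6) are easy to demonstrate numerically. The Python sketch below is an illustration using two hypothetical blur spots, not examples from the text: a Gaussian spot, which factors into x and y profiles, and a circular pillbox spot, which does not. Sampled on a grid, a separable function is a rank-1 matrix, which gives a concrete test.

```python
import numpy as np

x = np.linspace(-3.0, 3.0, 65)
X, Y = np.meshgrid(x, x)

# A Gaussian blur spot is separable, Eq. (2.5): f(x, y) = fx(x) * fy(y).
fx = np.exp(-x**2)
gauss = np.outer(fx, fx)              # equals exp(-(X**2 + Y**2))

# Eq. (2.6): the 2-D transform of a separable function is the
# product of the 1-D transforms.
F2 = np.fft.fft2(gauss)
F_sep = np.outer(np.fft.fft(fx), np.fft.fft(fx))

# A circular pillbox blur is not separable: no product of 1-D
# profiles reproduces it, and its sampled matrix is not rank 1.
pillbox = (X**2 + Y**2 <= 4.0).astype(float)
print(np.linalg.matrix_rank(gauss))      # 1
print(np.linalg.matrix_rank(pillbox) > 1)  # True
```

The rank test states the practical criterion: only when the sampled PSF is (numerically) rank 1 does the common horizontal-times-vertical MTF model hold exactly.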