# Determining MTF with a Slant Edge Target


Douglas A. Kerr

Issue 2, October 13, 2010

## ABSTRACT AND INTRODUCTION

The modulation transfer function (MTF) of a photographic lens tells us how effectively the lens transfers a luminance variation in the scene (by which detail is conveyed) onto the focal plane, and in particular how that varies with spatial frequency (which we can think of as the fineness of the detail). This function indicates, objectively, the resolving potential of the lens.

We often read of the MTF being determined using a slant edge target test. In this article we review the concept of the MTF and the principles of this testing technique.

## THE MODULATION TRANSFER FUNCTION

We will examine the concept of the modulation transfer function by looking in sequence at the three words that make up its description.

### Modulation

Modulation in this case refers to the variation in the luminance of a scene from point to point, and the corresponding variation in the illuminance from point to point in the image deposited by the lens on the focal plane (on the film or digital sensor). Detail is conveyed by such variation; if there is no variation in luminance, the scene is a uniform gray and hardly worth photographing.¹

Modulation can be quantified in terms of modulation depth, a way of expressing the ratio between the maximum and minimum luminance (or illuminance) across a certain small part of the scene (or image).

### Transfer

For our purposes here, the job of the lens is to transfer the luminance variation of the scene into an illuminance variation on the image. It does this incompletely, for various reasons. We can quantify the degree to which it accomplishes this job in terms of the modulation transfer ratio. This is the ratio of (a) the modulation depth of the

¹ For the sake of simplicity, we will assume only monochrome scenes and gray-scale photography, so that luminance/illuminance is the only property of interest.

Copyright 2010 Douglas A. Kerr. May be reproduced and/or distributed, but only intact, including this notice. Brief excerpts may be reproduced with credit.

illuminance deposited on the image to (b) the modulation depth of the luminance of the scene (usually within a small region).

The modulation transfer ratio is quite parallel to the gain of an amplifier stage in an electronic system.

### Function

In mathematics, when the value of one variable quantity depends on the value(s) of one or more other variable quantities, in some specific way, the first variable is said to be a function of the other variable(s). Under this concept, we may say that x is a function of y and z. That means that the value of variable x (called the dependent variable) is determined by the values of variables y and z (called the independent variables).

A specific name identifying a function can, in common practice, mean three distinct things:

- The variable x itself (after all, we said "x is a function...").
- The value of x for a certain set of values of the independent variables (the value of the function for that situation).
- The overall relationship by which x depends on y and z (the function proper).

This diverse use of the function name can be confusing if we have not been forewarned about it.

### Graphic representation of a function

If a variable is a function of one other variable (x is a function of y), we can show the relationship graphically in the familiar way: a plot of x against y.

If one variable is a function of two other variables (x is a function of y and z), we cannot show the relationship graphically in the familiar way. Often what we will do then is to take one of the independent variables and (arbitrarily) consider it to be a parameter (it is still an independent variable; we just handle it a little differently). Suppose we decide to treat z as the parameter. We adopt some specific value of z and, holding that constant, plot the variation of x with y (labeling the curve with the value of z). Then we take another specific value of z and, holding that constant, again plot the variation of x with y (labeling that curve with the new value of z).

The result is what we often describe as a family of curves, one curve for each of our chosen values of the parameter, z.

But we can equally legitimately decide to treat y as a parameter. Then we choose a certain value of y and, holding that constant, plot the variation of x with z, and so forth. Which of those we do will depend on the context in which we wish to visualize the variation of x.

### The modulation transfer function

For a given lens with a given aperture (and focal length setting, if relevant), the modulation transfer ratio varies with several factors, most prominently:

- The spatial frequency² of the modulation (which we can think of as the fineness of the detail the modulation conveys). Typically the modulation transfer ratio decreases as the spatial frequency increases.
- The location in the image of the area of interest (notably its distance from the optical axis of the lens). Typically the modulation transfer ratio decreases as we move away from the optical axis.

Thus, the modulation transfer ratio is a function of spatial frequency and distance off axis. This function is called the modulation transfer function (MTF) of the lens. And of course, as we discussed earlier, the term MTF is also applied to the modulation transfer ratio (which we then never hear of under its own name), or to its value in a particular situation.

### Two presentations

As we mentioned above, when a variable is a function of two other variables, there are two ways to present the relationship graphically, choosing either of the two independent variables to play the role of a parameter. For scientific or optical engineering work with the MTF, we normally select distance off the axis as the parameter, and plot the modulation transfer ratio against spatial frequency (preferably in cycles/mm). But,

² Spatial frequency has dimensions of cycles per unit distance. In scientific work, the unit is typically cycles per millimeter. Often this unit is spoken of in optical work as "line pairs per mm," but sometimes as "lines per mm," a source of considerable confusion. There are historical justifications for both these conflicting practices; these are beyond the scope of this article.

given the dual use of the term MTF, we are almost forced to say "we select distance off the axis as the parameter, and plot MTF against spatial frequency." In other words, this form of the MTF is a plot of "MTF" against spatial frequency.

However, when MTF data is presented by lens manufacturers, they customarily select spatial frequency as the parameter, and plot the MTF (meaning the modulation transfer ratio) against distance off axis. Usually, there are only two curves, for a low and a not-so-low spatial frequency.³

## DETERMINING THE MODULATION TRANSFER RATIO

### The classical concept

The classical concept of determining the MTF of a lens involves presenting it with patterns having repetitive variations in luminance (of a known modulation depth) at different spatial frequencies. Then the pattern deposited on the focal plane is examined (perhaps with a special instrument, or perhaps by capturing it with precisely calibrated film), and the modulation depth is noted for each test pattern. We make this determination both at the center of the image and at locations at successively greater distances from the axis.

The two modulation depths, for each combination of spatial frequency and distance off axis, are compared to get the modulation transfer ratio. This is then plotted against the appropriate non-parameter independent variable for the desired form of presentation.

Although in the MTF curves presented by lens manufacturers often only two spatial frequencies are treated, for scientific work it is important that we have the MTF at numerous spatial frequencies. Doing so requires test exposures made with numerous test targets, each having patterns of lines at various spacings.

### A more modern method

The availability of computers to easily perform sophisticated manipulation of data, and the fact that a digital camera inherently has an instrument for measuring illuminance at the focal plane (its sensor), have led to the adoption of a quite different technique for determining the MTF of a lens: the slant edge target technique. This technique is the actual subject of this article.

³ Actually, there are often eight curves, accommodating two values of each of the parameters aperture and modulation axis.
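The bookkeeping of the classical method can be sketched in a few lines of code. This is our own illustration, not the article's procedure, and the readings are hypothetical; it uses the common definition of modulation depth as (max − min)/(max + min):

```python
def modulation_depth(lo, hi):
    """Modulation depth of a variation between levels lo and hi: (hi - lo)/(hi + lo)."""
    return (hi - lo) / (hi + lo)

def transfer_ratio(scene_lo, scene_hi, image_lo, image_hi):
    """Modulation transfer ratio: modulation depth of the image over that of the scene."""
    return modulation_depth(image_lo, image_hi) / modulation_depth(scene_lo, scene_hi)

# Hypothetical readings: the scene pattern swings between 10 and 90 units
# (depth 0.8) at every frequency; the measured image swing shrinks as the
# spatial frequency (cycles/mm) rises.
readings = {10: (14, 86), 30: (25, 75), 50: (35, 65)}
mtf_points = {f: transfer_ratio(10, 90, lo, hi) for f, (lo, hi) in readings.items()}
```

Repeating this for each distance off axis yields the full two-variable function discussed above.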

## THE SLANT EDGE TARGET TECHNIQUE CONCEPT

### An analog in electrical engineering

The underlying concept of the technique can perhaps be most clearly seen by considering an electrical engineering example. The MTF (in the sense of a plot of modulation transfer ratio against spatial frequency) is quite parallel to the matter of the frequency response of an electronic amplifier, where we plot the gain of the amplifier (the ratio of the output voltage to the input voltage) as a function of frequency (in this case temporal frequency, in hertz).

Not surprisingly, the classical technique for determining the frequency response (we can call it the "gain function") involves presenting the amplifier with signals of known voltage at different frequencies and, in each case, measuring the output voltage. The plot of the gain (ratio of output voltage to input voltage) against frequency is the voltage gain function.

But there is a way to determine this with a "one shot" test (and the term is very apt). We submit to the amplifier what is called an impulse, a single pulse which (ideally) has zero duration (zero width) but still contains energy. When we do this, a certain waveform comes out of the amplifier. It is called the impulse response of the amplifier. If we capture that (just one test is needed), we can from it determine the entire voltage gain function (gain as a function of frequency).

How can this be? Well, the impulse contains energy at all frequencies (in theory, up to infinity), with a uniform distribution. If we take the Fourier transform⁴ of the output waveform, the result is a description of the frequency content of that waveform. And, given that the input signal contains all frequencies, uniformly, that description will be the voltage gain function (or voltage "frequency response").

Well, clever as this sounds on paper, there are some practical problems with actually doing it. One is that if our impulse is truly to have zero duration (zero width in time) but nevertheless contain some energy (and of course, if it didn't, there would be no output from it), it must (theoretically) have infinite amplitude (voltage). Let's be thankful we can't actually do this; if we could, our amplifier would blow up during the test.

⁴ A mathematical process that takes a description of a waveform and from it develops a description of its frequency content.
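The one-shot principle itself is easy to verify numerically. The sketch below (our own toy example, not from the article) uses a discrete 3-tap averaging filter in place of the amplifier: the Fourier transform of its impulse response matches the gain measured the classical way, one tone at a time.

```python
import cmath

# Toy "amplifier": a 3-tap moving-average filter, whose impulse response
# (its output for a unit impulse) is simply its tap weights.
h = [1 / 3, 1 / 3, 1 / 3]
N = 64  # DFT length / tone period

def dft_mag(x, k, n=N):
    """Magnitude of the k-th DFT bin of sequence x (treated as zero-padded to n)."""
    return abs(sum(x[i] * cmath.exp(-2j * cmath.pi * k * i / n) for i in range(len(x))))

def tone_gain(k, n=N):
    """Gain at bin k measured tone by tone: feed in a unit complex tone,
    take the steady-state output amplitude (one convolution sample, past start-up)."""
    tone = [cmath.exp(2j * cmath.pi * k * i / n) for i in range(n)]
    y = sum(h[j] * tone[10 - j] for j in range(len(h)))
    return abs(y)

# The one-shot route: one transform of the impulse response gives every gain at once.
one_shot = [dft_mag(h, k) for k in range(N // 2)]
measured = [tone_gain(k) for k in range(N // 2)]
```

The two lists agree to rounding error, which is exactly the claim made in the text: capturing the impulse response once is equivalent to sweeping every frequency.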

And making a pulse have zero width isn't possible either. So we resort to a variation of the theme. Here, instead of using an impulse as our input, we use a step function. This is a waveform that, for example, starts out at +1.0 volt and then, at a certain point in time, instantaneously changes to −1.0 volt. Again this is not possible to actually achieve, but it is a lot easier to approximate than an impulse with zero time width and infinite voltage.

After applying this (just once) to our amplifier, and capturing the output waveform, we differentiate that waveform (the derivative of a step is an impulse, so the derivative of the step response is the impulse response) and then take the Fourier transform of the result. The result, as before, will be the frequency response (voltage gain function) of the amplifier.

### Now, back to optics

If we present a zero-width bright line to a lens, it is the optical equivalent of the impulse in the electrical situation. Unless the lens has infinite resolution, the image of that line on the focal plane will be a pattern having non-zero width, across which the illuminance varies in some way. This is called the line spread function (LSF) of the lens. If we take its Fourier transform, the magnitude of the result is the modulation transfer ratio as a function of spatial frequency: the modulation transfer function (MTF).

But of course, just as for the electrical impulse, this zero-width line is impractical to make, and for it to have enough photometric energy that we can see the pattern of illuminance on the focal plane, it would have to have essentially infinite luminance. So we follow the same ploy used in the electrical situation. We use a test scene that is black up to a straight-line boundary and white beyond it: the optical equivalent of the electrical step function.

For any real lens, the image of that test scene will not have a zero-width boundary between dark and light regions, but rather a boundary of some finite width, across which the illuminance varies in some way. The plot of illuminance across that boundary is called the edge spread function (ESF) of the lens. If we measure this illuminance pattern, differentiate it to recover the line spread function, and take the Fourier transform of that, we get the modulation transfer ratio as a function of spatial frequency: the modulation transfer function (MTF).

Wow! Is this neat or what!
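The whole optical chain just described can be rehearsed on synthetic data. In the sketch below (our own illustration; the Gaussian blur, its width, and the sample spacing are assumptions, not the article's numbers) we build the ESF of a Gaussian-blurred edge, differentiate it to get the LSF, and Fourier-transform that. For a Gaussian blur the MTF is known in closed form, so the numerical result can be checked against it:

```python
import math, cmath

sigma = 2.0   # blur width of the hypothetical lens, in arbitrary units
N = 256
dx = 0.25     # sample spacing along the ESF axis
xs = [(i - N // 2) * dx for i in range(N)]

# Edge spread function of a Gaussian-blurred step: 0 on the dark side, 1 on the light side.
esf = [0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0)))) for x in xs]

# Line spread function = derivative of the ESF (central differences).
lsf = [(esf[i + 1] - esf[i - 1]) / (2 * dx) for i in range(1, N - 1)]

def mtf(freq):
    """|Fourier transform| of the LSF at spatial frequency freq, normalized to 1 at f = 0."""
    ft = sum(v * cmath.exp(-2j * cmath.pi * freq * x) for v, x in zip(lsf, xs[1:-1])) * dx
    f0 = sum(lsf) * dx
    return abs(ft) / abs(f0)

# Closed-form MTF of a Gaussian blur: exp(-2 * pi^2 * sigma^2 * f^2).
analytic = lambda f: math.exp(-2 * (math.pi * sigma * f) ** 2)
```

The numerically derived curve tracks the closed-form one closely, which is the "neat" result the text describes: one edge image yields the whole MTF.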

## THE REALITIES

### The need

In order to do this, for MTFs of the kind we fortunately encounter with modern lenses, we have to be able to measure the illuminance pattern (the edge spread function) with very high resolution. Of course, a practical advantage of this technique is that we can use the camera sensor itself to measure the illuminance pattern. But the resolution of the sensor array, used in the ordinary way, is not sufficient to discern the illuminance pattern precisely enough. We see this illustrated in Figure 1.

Figure 1 (panels a through d). Resolving the edge spread function

In panel a, we see a hypothetical edge spread function (as would be observed downstream from the lens under test). The gray grid is at the pixel pitch of the camera sensor array, in order to give an idea of the scale.

In panel b, we see what would happen if the edge image were located in a certain way on the pixel grid. (We consider only pixels along a line perpendicular to the boundary.) The plot line across the band for each pixel shows the pixel output (only a single value for any pixel, of course). Note that the overall sensor output for this row of pixels seems to be a perfect step function (in electrical terms).

In panel c, we see a slightly different location of the image. Now we see a different pixel output, still certainly not a precise representation of the illuminance pattern itself. In panel d, we see yet another possibility, again not even close to a precise representation of the illuminance pattern.

So regardless of which one of these happens (and this is essentially beyond our control), the illuminance pattern suggested by the sensor output is useless for precise analysis. We must somehow "fake" enhanced resolution of the sensor.

### The slant edge target

Enter now the title character of this drama. As before, we present the lens with a target with a black portion and a white portion, with a sharp boundary between them. But we intentionally orient it so that the boundary departs from the pixel axis of the sensor array by a small angle.

Now a fascinating drama can play out; we can follow it in Figure 2. We see the image of the target laid out on the sensor pixel detector grid. (The black portion is shown in gray to allow the entire grid to be seen.) Each square represents the domain of one pixel detector. But we will assume that each detector actually responds only to the illuminance at the center of its domain (where we will show a dot if we are interested in the output of that detector).

The variation in illuminance (the edge spread function) happens along the ESF axis direction, and of course it happens identically all across the edge. That is, the illuminance will be constant along any line parallel to the boundary (at a certain distance from the edge); the variation in illuminance will be the same along any line parallel to the ESF axis (which is drawn in an arbitrary location).
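The point of Figure 1 can be reproduced numerically: sampling one and the same edge spread function at the pixel pitch gives different, equally coarse sample sets depending on where the edge happens to fall on the grid. This sketch uses an illustrative Gaussian-blurred edge of our own choosing (all numbers hypothetical):

```python
import math

def esf(x, sigma=0.4):
    """Gaussian-blurred edge; the blur is narrow relative to the pixel pitch of 1.0."""
    return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0))))

pitch = 1.0

def sample_row(offset, n=8):
    """Outputs of one row of pixels whose centers sit at offset + k * pitch."""
    return [esf(offset + (k - n // 2) * pitch) for k in range(n)]

row_a = sample_row(0.0)  # edge centered exactly on a pixel center
row_b = sample_row(0.5)  # edge centered midway between two pixel centers

# The two rows disagree substantially even though they sampled the identical
# pattern, and each catches only a sample or two inside the transition region.
```

This is exactly the trouble the text describes: which of these sample sets we get is beyond our control, and none of them resolves the transition.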

Figure 2 (panels a and b; each shows the target image, the pixel grid, the pixels considered, and the ESF axis). Operation of the slant edge target

We first consider the response of the line of pixel detectors (hereafter, just "pixels") highlighted in panel a. These pixels pick up the illuminance of the edge spread pattern at various distances from the boundary, and those distances are evenly spaced. That illuminance is the same all along the associated dotted line, drawn parallel to the boundary. Thus a measurement taken at any point along such a line represents the

illuminance every place along it (including where the line crosses our arbitrarily-drawn ESF axis, along which we are interested in the variation of illuminance).

The reason we have concentrated on only one row of pixels in this panel is not because they have any special role, but merely because if we started by considering all the pixels, the drawing would have been so busy that it might have been hard to grasp the principle from it. But now that we know what we are looking for, in panel b we consider the response of all the pixels over a larger region.

Recall that the output of any pixel represents the illuminance any place along a line parallel to the boundary. Thus we have again drawn the lines parallel to the boundary through each pixel point. The illuminance is the same along any of these lines. We've not drawn them dotted, as that would be just too busy for this already-too-busy drawing. But we have drawn slightly bolder the ones shown in panel a.

We see now that the suite of output data from all these pixels has told us the illuminance along each of many lines parallel to the boundary, and very closely (and evenly) spaced. These values are in fact the illuminance at points with that particular spacing along our arbitrarily-drawn ESF axis. Accordingly, this suite of data gives us a high-resolution description of the variation of illuminance along the ESF axis; that is, a high-resolution description of the ESF itself, which we require to make a precise determination of the MTF.

The spacing of the samples of the ESF is in fact the pixel pitch multiplied by the sine of the angle of rotation of the target. In our illustration,⁵ this is a little less than one-fifth the pixel pitch. Thus, our clever approach gives us an effective resolution of about five times that which could be given by the sensor array in normal use.
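The oversampling arithmetic is compact enough to check directly. Assuming (consistently with the stated one-fifth spacing and footnote 5) that the illustration's edge has a slope whose tangent is 1/5, a short sketch:

```python
import math

def esf_sample_spacing(pitch, slope):
    """Spacing of ESF samples for an edge whose tilt angle has the given tangent:
    pixel pitch multiplied by the sine of the tilt angle."""
    angle = math.atan(slope)
    return pitch * math.sin(angle)

pitch = 1.0  # one pixel

spacing_5 = esf_sample_spacing(pitch, 1 / 5)    # slope assumed for the illustration
spacing_10 = esf_sample_spacing(pitch, 1 / 10)  # the 1:10 target of footnote 5
oversampling_5 = pitch / spacing_5              # effective resolution gain, roughly 5x
```

Because sin θ is slightly smaller than tan θ, the spacing comes out "a little less than one-fifth the pixel pitch," just as the text says, and the effective oversampling factor is slightly more than five.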
Because the pixel detectors do not actually pick up the illuminance at a point (as suggested by our example), but rather respond to an average of some sort over a region approaching the domain of the pixel, certain special steps have to be taken in the evaluation of the edge

⁵ This is a greater angle than that usually used for such tests, adopted here for clarity of the illustration. One widely used test target uses an angle of about 5.7°, specifically a slope (tangent of the angle) of 1:10. The tidy repetitive pattern of sample distances we see in the example requires an angle whose tangent is a ratio of integers, preferably 1/n.

spread function from the set of collected pixel detector values. This is a well-known matter in digital signal processing.

Note that the axis along which the edge spread function is considered (by definition, perpendicular to the edge) is not either axis of the pixel array. This is not really of any consequence to us; the edge spread function exists in two-dimensional space regardless of the orientation of the target.⁶

### Target orientation

Any given scheme for determining the MTF with the slant edge technique will have an intended rotation of the target edge. However, we cannot always ensure that this angle is exactly achieved. MTF analysis software for use with the slant edge target technique therefore typically contains provisions for first deducing the exact rotation of the target edge from the data (you can visualize from Figure 2 how this generally could work) and then using the result in the actual analysis.

## SUMMARY

The slant edge target approach allows a convenient one-shot determination of the MTF (in the sense of the modulation transfer ratio as a function of spatial frequency) by exploiting two clever ploys:

- The use of the Fourier transform to get the MTF from the edge spread function.
- The use of the slanted target to get an effective resolution of the sensor array much greater than would be dictated by its pixel pitch, so that the edge spread function can be adequately measured by the sensor array itself.

⁶ Actually, when we get into one of the esoteric subtleties of the MTF (the matter of "axis of modulation"), the direction of the ESF axis is of concern. We can deal with that by thoughtful choice of the points in the image (at different distances from the center) at which we run tests.
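Putting the pieces together, the essential steps of the analysis can be rehearsed end to end on synthetic data. The sketch below is purely illustrative (the slope, the Gaussian blur, and the image size are our own assumptions, and real software must also handle noise and the pixel-aperture averaging noted above): it renders a tilted blurred edge, deduces the tilt from the pixel data, as the "Target orientation" section describes, and then projects every pixel center onto the ESF axis to obtain a finely sampled ESF.

```python
import math

SLOPE = 1 / 5   # true tangent of the edge tilt (assumed for this demo)
SIGMA = 0.6     # edge blur, in pixels (assumed)
W = H = 40      # image size in pixels

def ideal_esf(d):
    """Gaussian-blurred edge profile as a function of signed distance d from the edge."""
    return 0.5 * (1.0 + math.erf(d / (SIGMA * math.sqrt(2.0))))

def edge_distance(x, y):
    """Signed perpendicular distance from pixel center (x, y) to the edge x = SLOPE*y + W/2."""
    return (x - SLOPE * y - W / 2) / math.sqrt(1 + SLOPE ** 2)

image = [[ideal_esf(edge_distance(x, y)) for x in range(W)] for y in range(H)]

# Step 1: deduce the edge orientation from the data. For each row, locate the
# 0.5 crossing by linear interpolation, then fit x = a*y + b by least squares.
def crossing(row):
    for x in range(W - 1):
        if row[x] <= 0.5 < row[x + 1]:
            return x + (0.5 - row[x]) / (row[x + 1] - row[x])

ys = list(range(H))
xs = [crossing(image[y]) for y in ys]
my = sum(ys) / H
mx = sum(xs) / H
a = sum((y - my) * (x - mx) for y, x in zip(ys, xs)) / sum((y - my) ** 2 for y in ys)
b = mx - a * my

# Step 2: project every pixel center onto the ESF axis using the fitted edge.
def fitted_distance(x, y):
    return (x - a * y - b) / math.sqrt(1 + a ** 2)

samples = sorted((fitted_distance(x, y), image[y][x]) for y in ys for x in range(W))
# `samples` is now a finely spaced edge spread function: many distinct
# distances per pixel pitch, ready for the derivative-and-Fourier-transform step.
```

The fitted slope `a` comes out very close to the true tilt, and the projected samples land many to a pixel pitch inside the transition region, which is exactly the effective-resolution gain the article describes.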


### FULL RESOLUTION 2K DIGITAL PROJECTION - by EDCF CEO Dave Monk

FULL RESOLUTION 2K DIGITAL PROJECTION - by EDCF CEO Dave Monk 1.0 Introduction This paper is intended to familiarise the reader with the issues associated with the projection of images from D Cinema equipment

### PROCEEDINGS OF SPIE. Measurement of the modulation transfer function (MTF) of a camera lens

PROCEEDINGS OF SPIE SPIEDigitalLibrary.org/conference-proceedings-of-spie Measurement of the modulation transfer function (MTF) of a camera lens Aline Vernier, Baptiste Perrin, Thierry Avignon, Jean Augereau,

### KODAK VISION Expression 500T Color Negative Film / 5284, 7284

TECHNICAL INFORMATION DATA SHEET TI2556 Issued 01-01 Copyright, Eastman Kodak Company, 2000 1) Description is a high-speed tungsten-balanced color negative camera film with color saturation and low contrast

### Intorduction to light sources, pinhole cameras, and lenses

Intorduction to light sources, pinhole cameras, and lenses Erik G. Learned-Miller Department of Computer Science University of Massachusetts, Amherst Amherst, MA 01003 October 26, 2011 Abstract 1 1 Analyzing

Technical Note CMOS, EMCCD AND CCD CAMERAS FOR LIFE SCIENCES Camera Test Protocol Introduction The detector is one of the most important components of any microscope system. Accurate detector readings

### Telescopes and their configurations. Quick review at the GO level

Telescopes and their configurations Quick review at the GO level Refraction & Reflection Light travels slower in denser material Speed depends on wavelength Image Formation real Focal Length (f) : Distance

### Image Enhancement Using Histogram Equalization and Histogram Specification on Different Color Spaces

Image Enhancement Using Histogram Equalization and Histogram Specification on Different Color Spaces Pankaj Kumar Roll. 109CS0596 A thesis submitted in partial fulfillment for the degree of Bachelor of

### Focus-Aid Signal for Super Hi-Vision Cameras

Focus-Aid Signal for Super Hi-Vision Cameras 1. Introduction Super Hi-Vision (SHV) is a next-generation broadcasting system with sixteen times (7,680x4,320) the number of pixels of Hi-Vision. Cameras for

### Geometry of Aerial Photographs

Geometry of Aerial Photographs Aerial Cameras Aerial cameras must be (details in lectures): Geometrically stable Have fast and efficient shutters Have high geometric and optical quality lenses They can

### New foveated wide angle lens with high resolving power and without brightness loss in the periphery

New foveated wide angle lens with high resolving power and without brightness loss in the periphery K. Wakamiya *a, T. Senga a, K. Isagi a, N. Yamamura a, Y. Ushio a and N. Kita b a Nikon Corp., 6-3,Nishi-ohi

### Basic principles of photography. David Capel 346B IST

Basic principles of photography David Capel 346B IST Latin Camera Obscura = Dark Room Light passing through a small hole produces an inverted image on the opposite wall Safely observing the solar eclipse

### The quality of the transmission signal The characteristics of the transmission medium. Some type of transmission medium is required for transmission:

Data Transmission The successful transmission of data depends upon two factors: The quality of the transmission signal The characteristics of the transmission medium Some type of transmission medium is

### Median Filter and Its

An Implementation of the Median Filter and Its Effectiveness on Different Kinds of Images Kevin Liu Thomas Jefferson High School for Science and Technology Computer Systems Lab 2006-2007 June 13, 2007

### Pixel Response Effects on CCD Camera Gain Calibration

1 of 7 1/21/2014 3:03 PM HO M E P R O D UC T S B R IE F S T E C H NO T E S S UP P O RT P UR C HA S E NE W S W E B T O O L S INF O C O NTA C T Pixel Response Effects on CCD Camera Gain Calibration Copyright

### Understanding Focal Length

JANUARY 19, 2018 BEGINNER Understanding Focal Length Featuring DIANE BERKENFELD, DAVE BLACK, MIKE CORRADO & LINDSAY SILVERMAN Focal length, usually represented in millimeters (mm), is the basic description

### IMAGE PROCESSING Vedat Tavşanoğlu

Vedat Tavşano anoğlu Image Processing A Revision of Basic Concepts An image is mathematically represented by: where I( x, y) x y is the vertical spatial distance; is the horizontal spatial distance, both

### The Use of Non-Local Means to Reduce Image Noise

The Use of Non-Local Means to Reduce Image Noise By Chimba Chundu, Danny Bin, and Jackelyn Ferman ABSTRACT Digital images, such as those produced from digital cameras, suffer from random noise that is

### Sampling Efficiency in Digital Camera Performance Standards

Copyright 2008 SPIE and IS&T. This paper was published in Proc. SPIE Vol. 6808, (2008). It is being made available as an electronic reprint with permission of SPIE and IS&T. One print or electronic copy

when it s too fast to see, and too important not to. NOTES/ALERTS For the most current version visit www.phantomhighspeed.com Subject to change Rev April 2016 Boosting Sensitivity In this series of articles,

### ( ) Deriving the Lens Transmittance Function. Thin lens transmission is given by a phase with unit magnitude.

Deriving the Lens Transmittance Function Thin lens transmission is given by a phase with unit magnitude. t(x, y) = exp[ jk o ]exp[ jk(n 1) (x, y) ] Find the thickness function for left half of the lens

### loss of detail in highlights and shadows (noise reduction)

Introduction Have you printed your images and felt they lacked a little extra punch? Have you worked on your images only to find that you have created strange little halos and lines, but you re not sure

### T I P S F O R I M P R O V I N G I M A G E Q U A L I T Y O N O Z O F O O T A G E

T I P S F O R I M P R O V I N G I M A G E Q U A L I T Y O N O Z O F O O T A G E Updated 20 th Jan. 2017 References Creator V1.4.0 2 Overview This document will concentrate on OZO Creator s Image Parameter

### EE-4022 Experiment 3 Frequency Modulation (FM)

EE-4022 MILWAUKEE SCHOOL OF ENGINEERING 2015 Page 3-1 Student Objectives: EE-4022 Experiment 3 Frequency Modulation (FM) In this experiment the student will use laboratory modules including a Voltage-Controlled

### Image acquisition. In both cases, the digital sensing element is one of the following: Line array Area array. Single sensor

Image acquisition Digital images are acquired by direct digital acquisition (digital still/video cameras), or scanning material acquired as analog signals (slides, photographs, etc.). In both cases, the

### Improving digital images with the GNU Image Manipulation Program PHOTO FIX

Improving digital images with the GNU Image Manipulation Program PHOTO FIX is great for fixing digital images. We ll show you how to correct washed-out or underexposed images and white balance. BY GAURAV

### Digital Image Processing

Digital Image Processing Digital Imaging Fundamentals Christophoros Nikou cnikou@cs.uoi.gr Images taken from: R. Gonzalez and R. Woods. Digital Image Processing, Prentice Hall, 2008. Digital Image Processing

### CHAPTER 3LENSES. 1.1 Basics. Convex Lens. Concave Lens. 1 Introduction to convex and concave lenses. Shape: Shape: Symbol: Symbol:

CHAPTER 3LENSES 1 Introduction to convex and concave lenses 1.1 Basics Convex Lens Shape: Concave Lens Shape: Symbol: Symbol: Effect to parallel rays: Effect to parallel rays: Explanation: Explanation:

### Getting Started. MSO/DPO Series Oscilloscopes. Basic Concepts

Getting Started MSO/DPO Series Oscilloscopes Basic Concepts 001-1523-00 Getting Started 1.1 Getting Started What is an oscilloscope? An oscilloscope is a device that draws a graph of an electrical signal.

### PhysFest. Holography. Overview

PhysFest Holography Holography (from the Greek, holos whole + graphe writing) is the science of producing holograms, an advanced form of photography that allows an image to be recorded in three dimensions.

### Digital Image Fundamentals. Digital Image Processing. Human Visual System. Contents. Structure Of The Human Eye (cont.) Structure Of The Human Eye

Digital Image Processing 2 Digital Image Fundamentals Digital Imaging Fundamentals Christophoros Nikou cnikou@cs.uoi.gr Those who wish to succeed must ask the right preliminary questions Aristotle Images

### Image and Video Processing

Image and Video Processing () Image Representation Dr. Miles Hansard miles.hansard@qmul.ac.uk Segmentation 2 Today s agenda Digital image representation Sampling Quantization Sub-sampling Pixel interpolation

### THREE DIMENSIONAL FLASH LADAR FOCAL PLANES AND TIME DEPENDENT IMAGING

THREE DIMENSIONAL FLASH LADAR FOCAL PLANES AND TIME DEPENDENT IMAGING ROGER STETTNER, HOWARD BAILEY AND STEVEN SILVERMAN Advanced Scientific Concepts, Inc. 305 E. Haley St. Santa Barbara, CA 93103 ASC@advancedscientificconcepts.com

How to use the RAW FILE CONVERTER EX powered by SILKYPIX The X-Pro1 comes with RAW FILE CONVERTER EX powered by SILKYPIX software for processing RAW images. This software lets users make precise adjustments

### Computational Photography and Video. Prof. Marc Pollefeys

Computational Photography and Video Prof. Marc Pollefeys Today s schedule Introduction of Computational Photography Course facts Syllabus Digital Photography What is computational photography Convergence

### Bar Code Labels. Introduction

Introduction to Bar Code Reading Technology Introduction Most people are familiar with bar codes. These are the bands of stripe lines which can be found on many grocery items and are used by scanning devices

### TO PLOT OR NOT TO PLOT?

Graphic Examples This document provides examples of a number of graphs that might be used in understanding or presenting data. Comments with each example are intended to help you understand why the data

### Basic Resolution Testing using Test Charts

Basic resolution Testing A resolution test chart is used to allow quick and easy testing of the ability of an optical system to produce images with fine detail. The patterns are in groups which progressively

Technical information about PhoToPlan The following pages shall give you a detailed overview of the possibilities using PhoToPlan. kubit GmbH Fiedlerstr. 36, 01307 Dresden, Germany Fon: +49 3 51/41 767

### RGB RESOLUTION CONSIDERATIONS IN A NEW CMOS SENSOR FOR CINE MOTION IMAGING

WHITE PAPER RGB RESOLUTION CONSIDERATIONS IN A NEW CMOS SENSOR FOR CINE MOTION IMAGING Written by Larry Thorpe Professional Engineering & Solutions Division, Canon U.S.A., Inc. For more info: cinemaeos.usa.canon.com

### Aperture & ƒ/stop Worksheet

Tools and Program Needed: Digital C. Computer USB Drive Bridge PhotoShop Name: Manipulating Depth-of-Field Aperture & stop Worksheet The aperture setting (AV on the dial) is a setting to control the amount

### Spatial Light Modulator (SLM) Workshop, BFY 2012 Conference Douglas Martin and Shannon O Leary Lawrence University and Lewis & Clark College

Spatial Light Modulator (SLM) Workshop, BFY 2012 Conference Douglas Martin and Shannon O Leary Lawrence University and Lewis & Clark College Briefly, a spatial light modulator (SLM) is a liquid crystal

### Fourier Theory & Practice, Part I: Theory (HP Product Note )

Fourier Theory & Practice, Part I: Theory (HP Product Note 54600-4) By: Robert Witte Hewlett-Packard Co. Introduction: This product note provides a brief review of Fourier theory, especially the unique

### Chapter 2: Digital Image Fundamentals. Digital image processing is based on. Mathematical and probabilistic models Human intuition and analysis

Chapter 2: Digital Image Fundamentals Digital image processing is based on Mathematical and probabilistic models Human intuition and analysis 2.1 Visual Perception How images are formed in the eye? Eye

### Applying Automated Optical Inspection Ben Dawson, DALSA Coreco Inc., ipd Group (987)

Applying Automated Optical Inspection Ben Dawson, DALSA Coreco Inc., ipd Group bdawson@goipd.com (987) 670-2050 Introduction Automated Optical Inspection (AOI) uses lighting, cameras, and vision computers

### Focusing and Metering

Focusing and Metering CS 478 Winter 2012 Slides mostly stolen by David Jacobs from Marc Levoy Focusing Outline Manual Focus Specialty Focus Autofocus Active AF Passive AF AF Modes Manual Focus - View Camera

### Image Formation by Lenses

Image Formation by Lenses Bởi: OpenStaxCollege Lenses are found in a huge array of optical instruments, ranging from a simple magnifying glass to the eye to a camera s zoom lens. In this section, we will

### Exploring 3D in Flash

1 Exploring 3D in Flash We live in a three-dimensional world. Objects and spaces have width, height, and depth. Various specialized immersive technologies such as special helmets, gloves, and 3D monitors

### Εισαγωγική στην Οπτική Απεικόνιση

Εισαγωγική στην Οπτική Απεικόνιση Δημήτριος Τζεράνης, Ph.D. Εμβιομηχανική και Βιοϊατρική Τεχνολογία Τμήμα Μηχανολόγων Μηχανικών Ε.Μ.Π. Χειμερινό Εξάμηνο 2015 Light: A type of EM Radiation EM radiation:

### 2013 LMIC Imaging Workshop. Sidney L. Shaw Technical Director. - Light and the Image - Detectors - Signal and Noise

2013 LMIC Imaging Workshop Sidney L. Shaw Technical Director - Light and the Image - Detectors - Signal and Noise The Anatomy of a Digital Image Representative Intensities Specimen: (molecular distribution)

### Supermacro Photography and Illuminance

Supermacro Photography and Illuminance Les Wilk/ReefNet April, 2009 There are three basic tools for capturing greater than life-size images with a 1:1 macro lens --- extension tubes, teleconverters, and

### Problem Solving with the Coordinate Plane

Grade 5 Module 6 Problem Solving with the Coordinate Plane OVERVIEW In this 40-day module, students develop a coordinate system for the first quadrant of the coordinate plane and use it to solve problems.

### Photo Scale The photo scale and representative fraction may be calculated as follows: PS = f / H Variables: PS - Photo Scale, f - camera focal

Scale Scale is the ratio of a distance on an aerial photograph to that same distance on the ground in the real world. It can be expressed in unit equivalents like 1 inch = 1,000 feet (or 12,000 inches)

### Activity 1: Make a Digital Camera

Hubble Sight/Insight Color The Universe Student's Guide Activity 1: Make a Digital Camera Astronomers love photons! Photons are the messengers of the cosmos carrying detailed information about our amazing