IMAGE PROCESSING (RRY025): USE OF FT IN IMAGE PROCESSING
USE OF FT IN IMAGE PROCESSING

Optics - the origin of imperfections in imaging systems (limited resolution/blurring, related to 2D FTs) - needs to be understood using the Continuous FT.

Sampling - capture of a continuous image onto a set of discrete pixel values; this can be arranged without loss of information if the image is Nyquist sampled, which requires understanding FTs.

IMAGING, SAMPLING - 2D DISCRETE FOURIER TRANSFORM, PART I

Filtering - having captured a digital image, we can take a Discrete FT (DFT) to the Fourier domain and then keep only low spatial frequencies (hence smoothing the image) or only high spatial frequencies (to sharpen the image).

Fast Convolution - if we take the sampled image to the Fourier domain, multiply it by a function and inverse FT, we achieve a fast convolution - using the DFT we can do the operations described in previous lectures for smoothing/sharpening/edge detection etc. much faster than doing everything in the image domain.

USE OF FT IN IMAGE PROCESSING (cont)

Image Restoration - the optical distortions described above under optics can be removed by DFT, filter multiplication and inverse DFT (see future lectures on image restoration).

Image Compression - DFT to the Fourier domain, then delete and do not transmit high spatial frequencies that are not visible to the eye - usable, but other transforms (e.g. cosine and wavelet) are better.

EXAMPLE OF CONTINUOUS FT - Diffraction-limited telescope/camera imaging

Consider a telescope or camera looking at a distant object at a single wavelength, and consider a point in the image, say a star or a point on a person's belt buckle, s(x, y) = δ(x - x_o, y - y_o). From optics, the electric field E(u, v) at the aperture is related to the continuous Fourier Transform of the source amplitude as a function of angular coordinates, that is E(u, v) = FT(s(x, y)).

Figure 1: Object s(x, y) -> (continuous FT) -> E(u, v) at the APERTURE A(u, v) -> E'(u, v) -> LENS -> (continuous FT) -> image s(-x, -y) on the CCD.

After being blocked by the aperture function, where A(u, v) = 1 over a circular aperture and A(u, v) = 0 elsewhere, then
E'(u, v) = E(u, v) A(u, v)

The lens does another FT, such that the electric field at the CCD is the forward Fourier transform of E'(u, v), so

FT(E'(u, v)) = s(-x, -y) * FT(A(u, v))
[where we use the fact that two forward Fourier transforms invert the image: FT(FT(s(x, y))) = s(-x, -y)]

In fact the CCD detects the incident power falling on it (the square of the electric field), so the effective convolving function (Point Spread Function, or PSF) is |FT(A(u, v))|^2. All points in the image are independent in their electric field properties, so it can be shown that when viewing a complicated image s(x, y), the image formed at the CCD is

s(-x, -y) * |FT(A(u, v))|^2

The final image formed by the lens is thus the true image convolved with a Point Spread Function (PSF). For a circular aperture this is an Airy function, or modified Bessel function squared. A cluster of stars as imaged by a telescope is then not seen as a set of points; instead each point is a little disk surrounded by weak ripples. The disk gets smaller as the lens gets bigger, and the resolving power of the telescope or camera gets larger.

LENS AS SPATIAL LOW-PASS FILTER

Consider a camera viewing corrugated-roof functions of different spatial frequency (cycles/radian). For low spatial frequency (large angular spacing θ between peaks) the light just passes through the lens and produces a ripple of wavelength Lθ on the CCD (where the lens focal length is L). But there is a maximum spatial frequency (minimum θ_min) that passes through; for higher spatial frequencies the FT of the input image (the electric field at the aperture) lies outside the range passed by the lens. This minimum is θ_min = λ/d, where d is the lens diameter and λ is the wavelength, corresponding to a maximum spatial frequency of 1/θ_min = d/λ. Hence the minimum spacing between ripple peaks on the CCD is Δx_min = Lλ/d.

The convolution in the image domain by |FT(A(u, v))|^2 is equivalent to multiplying the FT of the image by the autocorrelation of the aperture A(u, v) with itself - see the autocorrelation theorem in the last lecture. The radius within which this function is non-zero is equal to the lens diameter.
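The autocorrelation theorem invoked here can be checked numerically. The following is a minimal 1D pure-Python sketch (illustrative values only, not the course's MATLAB workflow): the inverse FT of the PSF |FT(A)|^2 equals the circular autocorrelation of the aperture A.

```python
import cmath

# 1-D sketch of the autocorrelation theorem: the PSF is |FT(A)|^2,
# so the inverse FT of the PSF is the circular autocorrelation of
# the aperture A with itself. A is an illustrative top-hat aperture.
N = 16
A = [1.0 if n < 4 else 0.0 for n in range(N)]

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(+2j * cmath.pi * k * n / N)
                for k in range(N)) / N for n in range(N)]

psf = [abs(E) ** 2 for E in dft(A)]   # |FT(A)|^2
back = idft(psf)                      # inverse FT of the PSF

# Circular autocorrelation of A computed directly in the image domain
autocorr = [sum(A[m] * A[(m + n) % N] for m in range(N)) for n in range(N)]

assert all(abs(b.real - a) < 1e-9 for b, a in zip(back, autocorr))
```

The same identity in 2D is what lets the lens be described equivalently as convolution by the PSF or multiplication by the aperture autocorrelation.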
Hence an aperture/lens acts as a low-pass spatial frequency filter, passing spatial frequencies smaller than d/λ.

NYQUIST SAMPLING

After passing through the lens, the image that falls on the CCD is a bandlimited function, i.e. it contains a maximum spatial frequency. If we have a CCD pixel spacing of Δx_min/2 (Nyquist sampling), then we can recover all information in the continuous image (Figure 2).

To see this, consider the sampled image and its FT. If f(x, y) is the bandlimited image falling on the CCD, then f(x, y) III_Δx(x, y) is the sampled image, where the second function is the bed-of-nails function introduced in the last lecture. The FT of this sampled image is (using the convolution theorem) F(u, v) * III_{1/Δx}(u, v). This is shown in the bottom panel of the next figure.

Figure 3: Top: the bed-of-nails sampling function, with δ functions separated by Δx. Bottom: the spectrum of the sampled image - the FT of the unsampled image, F(u, v), copied multiple times on a grid of separation 1/Δx.

If light passed through an elliptically shaped lens before falling on the CCD, the FT of the image before sampling, F(u, v), is non-zero only within the grey ellipses. In the special case of a circular lens the ellipses become circles
of radius d/(Lλ), which can be enclosed within squares of size 2W_u = 2W_v = 2d/(Lλ). If this is smaller than the grid spacing 1/Δx, as shown here, there is no overlap between the circles. This is the Nyquist sampling criterion, that Δx < Lλ/(2d) = Δx_min/2.

If the Nyquist criterion holds and there is no overlap between copies of F(u, v), then we can recover F(u, v) from the FT of the sampled image by multiplying by a top hat of width 2W_u = 2W_v. Once we know F(u, v) we can inverse FT to get f(x, y) - and hence recover the original pre-sampled image from the sampled data. If the sampling is less dense than Nyquist, the grid in the FT of the sampled image has smaller separation and the FTs will overlap; we get aliased frequencies and cannot recover the original unsampled image.

2D DISCRETE FOURIER TRANSFORMS

So far we have discussed the Continuous Fourier Transform, important for understanding the optics of telescopes/cameras etc. The course deals mostly with the Discrete Fourier Transform (DFT), calculated by computer on a discretely sampled image. If this image is bandlimited and Nyquist sampled, then the DFT of the image will be very close to the continuous image FT. We can use the DFT on a sampled image to filter, remove optical distortions, compress, etc.

For N x N sampled pixel arrays, the 2D Discrete Fourier Transform (DFT) and inverse DFT are defined to be (Gonzalez and Woods definition):

F(k, l) = (1/N) Σ_{m=0}^{N-1} Σ_{n=0}^{N-1} f(m, n) exp(-2πi(km + ln)/N)

f(m, n) = (1/N) Σ_{k=0}^{N-1} Σ_{l=0}^{N-1} F(k, l) exp(+2πi(km + ln)/N)

where normally m, n, k, l are all assumed to run from 0 to N-1. Note that the definition used by MATLAB instead has a factor of unity in front of the summation for the forward transform, and a factor of 1/N^2 for the inverse transform. Also, in MATLAB indices run from 1 to N, not 0 to N-1.

Some similarities and differences to the continuous FT:
- The DFT uses a sampled version f(m, n) of the continuous image f(x, y)
- It uses sums, not integrals
- The limits are 0 and N-1, not ±∞
- The DFT has a scale factor (1/N)

NON-CENTRED AND CENTRED DFTs

Using the normal definition of the DFT, we get the largest amplitudes in the corners. We can obtain a plot which is more consistent with continuous Fourier transforms if we centre the DFT (implemented with the fftshift command in MATLAB).

The DFT is evaluated in two steps: first a 1D transform of each row, then 1D transforms of each column of the result. It can be evaluated with a fast transform which takes of order 2N^2 log2(N) operations for an N x N image.

The DFT gives an accurate version of the continuous FT of the image falling on the CCD, provided that this is a bandlimited image that has been Nyquist sampled. If this is not true, the fact that the DFT works on a sampled image gives rise to aliased spatial frequencies in the FT.

The DFT can be evaluated at any value of the spatial frequencies k, l. The uncentred DFT plots k, l from 0 to N-1; we can instead evaluate and plot from -N/2 to N/2 - 1. This result uses the periodicity property of the DFT, i.e. that F(k + hN, l + jN) = F(k, l), where h, j are integers.
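The periodicity property, and the reindexing behind the centred DFT (what MATLAB's fftshift produces), can be illustrated with a short 1D sketch; the sample values and the pure-Python DFT here are illustrative only.

```python
import cmath

# Sketch of DFT periodicity: evaluating the DFT formula at k + N gives
# the same value as at k, so the range -N/2 .. N/2-1 is just a
# reordering of the range 0 .. N-1.
N = 8
f = [1.0, 2.0, 0.0, -1.0, 3.0, 0.5, -2.0, 1.5]  # arbitrary sample row

def dft_at(f, k):
    N = len(f)
    return sum(f[n] * cmath.exp(-2j * cmath.pi * k * n / N)
               for n in range(N)) / N  # 1/N scale, as in these notes

# Periodicity: F(k + N) = F(k)
for k in range(N):
    assert abs(dft_at(f, k) - dft_at(f, k + N)) < 1e-12

# Centred spectrum: evaluate from -N/2 to N/2-1; for even N this is the
# uncentred spectrum with its two halves swapped (fftshift behaviour).
uncentred = [dft_at(f, k) for k in range(N)]
centred = [dft_at(f, k) for k in range(-N // 2, N // 2)]
swapped = uncentred[N // 2:] + uncentred[:N // 2]
assert all(abs(c - u) < 1e-12 for c, u in zip(centred, swapped))
```

Note the zero-frequency term uncentred[0] is the mean of the samples (because of the 1/N scale factor), which is why it lands in the corner of an uncentred 2D plot.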
Given that

F(k, l) = (1/N) Σ_{m=0}^{N-1} Σ_{n=0}^{N-1} f(m, n) exp(-2πi(km + ln)/N),

what is the DFT at spatial frequencies k + hN, l + jN, where h and j are integers?

F(k + hN, l + jN) = (1/N) Σ_{m=0}^{N-1} Σ_{n=0}^{N-1} f(m, n) exp(-2πihm) exp(-2πijn) exp(-2πi(km + ln)/N)

Since h, m, j, n are all integers, exp(-2πihm) = 1 and exp(-2πijn) = 1. Hence

F(k + hN, l + jN) = F(k, l)

Figure 4: Object f(m, n); the non-centred DFT F(k, l) plotted for k, l = 0 to N-1; and the extended DFT calculated at all k, l. The centred DFT is the contents of the blue square - it can be considered the DFT calculated in the range k = -N/2 to N/2 - 1, l = -N/2 to N/2 - 1.

DFTs OF TYPICAL IMAGES

In general, transforms give a complex number at each sampled spatial frequency k, l. We can plot the real and imaginary parts of the transform or, more commonly, the amplitude A(k, l) and phase φ(k, l). The amplitude normally has a very large range, so we often plot log(1 + A(k, l)) instead. It is the phase that contains most of the information about the position of edges in the image; the amplitude tells us mainly how sharp these edges are (see last lecture).

Many images have spikes along the k, l axes. These can be explained in terms of the approximations used in doing DFTs. Other sources of spikes are regions of sharp edges or narrow rectangles within the image (e.g. the camera legs in the cameraman image).

Figure 5: The log amplitude and phase of the centred DFT of the cameraman image.
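The claim that the phase carries the position information can be illustrated with the shift theorem: circularly shifting a signal leaves the DFT amplitudes unchanged and alters only the phases. A minimal 1D pure-Python sketch (illustrative values; the unscaled forward transform is used for simplicity):

```python
import cmath

# Shift theorem sketch: moving a feature changes only the DFT phases,
# not the amplitudes - the position information lives in the phase.
N = 8
f = [0.0, 0.0, 1.0, 5.0, 1.0, 0.0, 0.0, 0.0]   # a localised "feature"
shift = 3
g = [f[(n - shift) % N] for n in range(N)]       # same feature, moved

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N)) for k in range(N)]

F, G = dft(f), dft(g)
amps_equal = all(abs(abs(Fk) - abs(Gk)) < 1e-9 for Fk, Gk in zip(F, G))
phases_differ = any(abs(cmath.phase(Fk) - cmath.phase(Gk)) > 0.1
                    for Fk, Gk in zip(F, G))
assert amps_equal and phases_differ
```

This is why swapping the phases of two images (keeping each image's own amplitudes) largely swaps their visible content.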
RELATIONSHIP BETWEEN CONTINUOUS FT AND DFT

We can consider implementing the DFT via continuous FTs: first take the sampled version of the image f(x, y), then repeat it periodically an infinite number of times, then do a continuous FT. The result in the centre is the DFT of f(x, y) (see the following figures).

Mathematically, form

III_{Nδx, Nδy}(x, y) * [III_{δx, δy}(x, y) f(x, y)]

Then take the continuous FT, which gives (after applying the convolution theorem twice)

III_{1/Nδx, 1/Nδy}(u, v) [III_{1/δx, 1/δy}(u, v) * F(u, v)]

Figure 6: How to implement the 2D DFT via the 2D continuous FT, illustrating the relationship between the two types of transform. Take the original N x N sampled image (tire), repeat it an infinite number of times, then take the 2D continuous transform. The N x N DFT of the tire is found within the central square.

Even if there are no sharp edges within the input image, there can be sharp discontinuities between the top and bottom (or left and right sides) of the repeated image, as shown by the slice at the top right. When we take the FT, these discontinuities can give strong vertical (and horizontal) spikes along the u and v axes.

EDGE EFFECTS/ORIGIN OF SPIKES

We can get spikes in the FT because of sharp-edged objects in the image; each spike is perpendicular to the direction of the edge. But we also get a large vertical spike when there is a large difference in brightness between the top and bottom of the picture, and a large horizontal spike when there is a difference between left and right. If we consider the DFT in terms of doing a continuous FT of a sampled, repeated version of the input image, we can understand these spikes in terms of the high spatial frequencies required to represent the discontinuities. This is one reason why the DFT is not optimal for image compression: it puts power at high spatial frequencies. JPEG, for instance, uses the cosine transform, which avoids this problem.
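The wrap-around-discontinuity argument can be checked numerically in 1D: a ramp, whose first and last samples differ, puts far more power at the highest spatial frequency than a signal whose ends wrap smoothly. A small pure-Python sketch (illustrative sizes, unscaled forward transform):

```python
import cmath
import math

# The DFT treats its input as periodic. A ramp has a hidden jump where
# the end wraps back to the start, so it needs high spatial frequencies;
# a sine whose period fits the window wraps smoothly and does not.
N = 16

def dft_power(x):
    N = len(x)
    return [abs(sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                    for n in range(N))) ** 2 for k in range(N)]

ramp = [n / N for n in range(N)]                            # ends differ
smooth = [math.sin(2 * math.pi * n / N) for n in range(N)]  # ends wrap

# Power at the highest spatial frequency (k = N/2) for each signal
assert dft_power(ramp)[N // 2] > 100 * dft_power(smooth)[N // 2]
```

The same effect in 2D is what produces the bright spikes along the u and v axes of many image spectra.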
Edge effects are much more significant for images than in 1D signal processing (where there is also an effect if the starting and ending samples are different). The reason is the much shorter length of each row/column compared to the length of a 1D signal (often 4096 samples or longer).
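As a closing illustration, the fast-convolution route from the overview (DFT, multiply by the kernel transform, inverse DFT) can be sketched end to end. This pure-Python sketch uses the unscaled forward transform (the MATLAB convention mentioned above), so the convolution theorem holds without extra factors; the sizes and values are illustrative, and a real implementation would use an FFT.

```python
import cmath

# Fast convolution sketch: DFT the image and the smoothing kernel,
# multiply in the Fourier domain, inverse DFT, then compare with
# direct circular convolution in the image domain.
N = 4
img = [[1, 2, 0, 1], [0, 3, 1, 0], [2, 1, 0, 1], [1, 0, 2, 3]]
ker = [[0.25, 0.25, 0, 0], [0.25, 0.25, 0, 0],
       [0, 0, 0, 0], [0, 0, 0, 0]]          # 2x2 averaging kernel

def dft2(f, sign):
    # sign = -1 for the forward transform, +1 for the (unscaled) inverse
    N = len(f)
    return [[sum(f[m][n] * cmath.exp(sign * 2j * cmath.pi
                                     * (k * m + l * n) / N)
                 for m in range(N) for n in range(N))
             for l in range(N)] for k in range(N)]

F = dft2(img, -1)
K = dft2(ker, -1)
prod = [[F[k][l] * K[k][l] for l in range(N)] for k in range(N)]
fast = [[v / N**2 for v in row] for row in dft2(prod, +1)]  # 1/N^2 on inverse

# Direct circular convolution for comparison
direct = [[sum(img[p][q] * ker[(m - p) % N][(n - q) % N]
               for p in range(N) for q in range(N))
           for n in range(N)] for m in range(N)]

assert all(abs(fast[m][n].real - direct[m][n]) < 1e-9
           for m in range(N) for n in range(N))
```

For an N x N image and a comparable kernel, the Fourier route costs of order N^2 log N with an FFT, versus N^2 multiplied by the kernel area for direct convolution - the speed-up claimed in the overview.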
Invited Paper Compressive Optical MONTAGE Photography David J. Brady a, Michael Feldman b, Nikos Pitsianis a, J. P. Guo a, Andrew Portnoy a, Michael Fiddy c a Fitzpatrick Center, Box 90291, Pratt School
More informationBiomedical Signals. Signals and Images in Medicine Dr Nabeel Anwar
Biomedical Signals Signals and Images in Medicine Dr Nabeel Anwar Noise Removal: Time Domain Techniques 1. Synchronized Averaging (covered in lecture 1) 2. Moving Average Filters (today s topic) 3. Derivative
More informationDigital Image Processing
Digital Image Processing Part 2: Image Enhancement Digital Image Processing Course Introduction in the Spatial Domain Lecture AASS Learning Systems Lab, Teknik Room T26 achim.lilienthal@tech.oru.se Course
More informationOption G 4:Diffraction
Name: Date: Option G 4:Diffraction 1. This question is about optical resolution. The two point sources shown in the diagram below (not to scale) emit light of the same frequency. The light is incident
More informationCEE598 - Visual Sensing for Civil Infrastructure Eng. & Mgmt.
CEE598 - Visual Sensing for Civil Infrastructure Eng. & Mgmt. Session 7 Pixels and Image Filtering Mani Golparvar-Fard Department of Civil and Environmental Engineering 329D, Newmark Civil Engineering
More informationPHYSICS. Chapter 35 Lecture FOR SCIENTISTS AND ENGINEERS A STRATEGIC APPROACH 4/E RANDALL D. KNIGHT
PHYSICS FOR SCIENTISTS AND ENGINEERS A STRATEGIC APPROACH 4/E Chapter 35 Lecture RANDALL D. KNIGHT Chapter 35 Optical Instruments IN THIS CHAPTER, you will learn about some common optical instruments and
More informationFilters. Materials from Prof. Klaus Mueller
Filters Materials from Prof. Klaus Mueller Think More about Pixels What exactly a pixel is in an image or on the screen? Solid square? This cannot be implemented A dot? Yes, but size matters Pixel Dots
More informationIntroduction to Imaging in CASA
Introduction to Imaging in CASA Mark Rawlings, Juergen Ott (NRAO) Atacama Large Millimeter/submillimeter Array Expanded Very Large Array Robert C. Byrd Green Bank Telescope Very Long Baseline Array Overview
More informationIntroduction to Interferometry. Michelson Interferometer. Fourier Transforms. Optics: holes in a mask. Two ways of understanding interferometry
Introduction to Interferometry P.J.Diamond MERLIN/VLBI National Facility Jodrell Bank Observatory University of Manchester ERIS: 5 Sept 005 Aim to lay the groundwork for following talks Discuss: General
More informationLecture 8. Lecture 8. r 1
Lecture 8 Achromat Design Design starts with desired Next choose your glass materials, i.e. Find P D P D, then get f D P D K K Choose radii (still some freedom left in choice of radii for minimization
More informationImage Enhancement in the Spatial Domain (Part 1)
Image Enhancement in the Spatial Domain (Part 1) Lecturer: Dr. Hossam Hassan Email : hossameldin.hassan@eng.asu.edu.eg Computers and Systems Engineering Principle Objective of Enhancement Process an image
More informationImage Filtering. Median Filtering
Image Filtering Image filtering is used to: Remove noise Sharpen contrast Highlight contours Detect edges Other uses? Image filters can be classified as linear or nonlinear. Linear filters are also know
More informationELECTRONIC HOLOGRAPHY
ELECTRONIC HOLOGRAPHY CCD-camera replaces film as the recording medium. Electronic holography is better suited than film-based holography to quantitative applications including: - phase microscopy - metrology
More informationDigital Image Processing. Image Enhancement: Filtering in the Frequency Domain
Digital Image Processing Image Enhancement: Filtering in the Frequency Domain 2 Contents In this lecture we will look at image enhancement in the frequency domain Jean Baptiste Joseph Fourier The Fourier
More informationImage Processing for Mechatronics Engineering For senior undergraduate students Academic Year 2017/2018, Winter Semester
Image Processing for Mechatronics Engineering For senior undergraduate students Academic Year 2017/2018, Winter Semester Lecture 6: Image Acquisition and Digitization 14.10.2017 Dr. Mohammed Abdel-Megeed
More informationPHYS 352. FFT Convolution. More Advanced Digital Signal Processing Techniques
PHYS 352 More Advanced Digital Signal Processing Techniques FFT Convolution take a chunk of your signal (say N=128 samples) apply FFT to it multiply the frequency domain signal by your desired transfer
More informationSampling and pixels. CS 178, Spring Marc Levoy Computer Science Department Stanford University. Begun 4/23, finished 4/25.
Sampling and pixels CS 178, Spring 2013 Begun 4/23, finished 4/25. Marc Levoy Computer Science Department Stanford University Why study sampling theory? Why do I sometimes get moiré artifacts in my images?
More informationLecture 2: Interference
Lecture 2: Interference λ S 1 d S 2 Lecture 2, p.1 Today Interference of sound waves Two-slit interference Lecture 2, p.2 Review: Wave Summary ( ) ( ) The formula y x,t = Acoskx ωt describes a harmonic
More information6.003: Signal Processing. Synthetic Aperture Optics
6.003: Signal Processing Synthetic Aperture Optics December 11, 2018 Subject Evaluations Your feedback is important to us! Please give feedback to the staff and future 6.003 students: http://registrar.mit.edu/subjectevaluation
More informationDiffraction of a Circular Aperture
DiffractionofaCircularAperture Diffraction can be understood by considering the wave nature of light. Huygen's principle, illustrated in the image below, states that each point on a propagating wavefront
More informationFRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION
FRAUNHOFER AND FRESNEL DIFFRACTION IN ONE DIMENSION Revised November 15, 2017 INTRODUCTION The simplest and most commonly described examples of diffraction and interference from two-dimensional apertures
More informationTDI2131 Digital Image Processing
TDI131 Digital Image Processing Frequency Domain Filtering Lecture 6 John See Faculty of Information Technology Multimedia University Some portions of content adapted from Zhu Liu, AT&T Labs. Most figures
More information