TSBB09 Image Sensors 2018-HT2. Image Formation Part 1


Basic physics: Electromagnetic radiation consists of electromagnetic waves, with energy, that propagate through space. The waves consist of transverse electric and magnetic fields that alternate with a temporal frequency ν (hertz) and a spatial wavelength λ (metres).

Frequency and wavelength: The relation between frequency and wavelength is c = νλ, where c is the speed of light. c depends on the medium, c ≤ c₀, where c₀ = the speed of light in vacuum ≈ 3·10⁸ m/s.

Particles and energy: Light can also be represented as particles, photons. The energy of a photon is E = hν = hc/λ. Energy increases with ν and decreases with λ. h is Planck's constant (≈ 6.626·10⁻³⁴ Js).
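As a sanity check on these relations, here is a minimal Python sketch computing the frequency and photon energy for green light (the 550 nm wavelength is an example assumption, not from the lecture):

```python
# Photon energy E = h*nu = h*c/lambda (values for vacuum).
h = 6.626e-34      # Planck's constant [Js]
c0 = 3.0e8         # speed of light in vacuum [m/s]

wavelength = 550e-9            # green light, 550 nm (assumed example)
nu = c0 / wavelength           # frequency [Hz], from c = nu*lambda
E = h * nu                     # photon energy [J]

print(f"nu = {nu:.3e} Hz")     # ~5.45e14 Hz
print(f"E  = {E:.3e} J")       # ~3.61e-19 J
```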

Particles and energy: Energy depends on the frequency, and energy is preserved. If light passes from a medium with speed c₁ into a medium with speed c₂ < c₁, the frequency is unchanged (ν₂ = ν₁), so c and λ must change by the same factor: λ₂ = λ₁·c₂/c₁ < λ₁. In short: if the speed of light changes from one medium to another, the frequency is constant (to keep the energy constant), so the wavelength must change.

Spectrum: In practice, light normally consists of photons with a range of energies, or waves with a range of frequencies. This mix of frequencies/wavelengths/energies is called the spectrum of the light. The spectrum is a function that gives the total amount of energy for each frequency or wavelength. Monochromatic light consists essentially of only one frequency/wavelength and can be produced by special light sources, e.g., lasers.

Spectrum (figure): natural light spreads its energy over many wavelengths (fewer photons per wavelength), while monochromatic light concentrates the same total energy at essentially one wavelength (more photons at that wavelength).

Classification of light spectrum (figure).

Polarization: The electromagnetic field has a direction, perpendicular to the direction of motion. The polarization of the light is defined as the direction of the electric field. Natural light is a mix of waves with polarization in all possible directions: unpolarized light. Special light sources or filters can produce polarized light of well-defined polarization.

Polarization: Plane polarization means that the electric field varies only in a single plane.

Polarization: In circular/elliptical polarization, the electric field vector rotates. It can be constructed as the sum of two plane-polarized waves with a 90° phase shift. Conversely: plane-polarized light can be decomposed as a sum of two circularly polarized waves that rotate in opposite directions.

Coherence: The phase of the light waves can either be random: incoherent light (natural light), or in a systematic relation: coherent light. Coherent light is usually related to monochromatic light sources (e.g., lasers). Compare a red LED and a red laser: both produce light within a narrow wavelength range, but the LED light is incoherent while the laser light is coherent.

Radiometry: Light radiation has energy. Each photon has a particular energy related to its frequency (E = hν). The number of photons of a particular frequency gives the amount of energy for this frequency, described by the spectrum. Unit: joule (or watt-second). Is usually not measured directly.

Radiometry: The power of the radiation, i.e., the energy per unit time, is the radiant flux. Since the energy depends on the frequency, so does the radiant flux. Unit: watt (joule per second). Is usually not measured directly.

Radiometry: The radiant flux per unit area is the flux density. Since the flux depends on the frequency, so does the flux density. Unit: watt per square metre. Can be measured directly, as the energy through a specific area during a specific time interval. Irradiance: flux density incident upon a surface. Exitance (or emittance): flux density emitted from a surface.

Radiometry: For point sources, or distant sources of small extent, the flux can also be measured per unit solid angle. The radiant intensity is the radiant flux per unit solid angle. Unit: watt per steradian.

Basic principle: Based on preservation of energy, a constant light source must produce the same amount of energy through a given solid angle regardless of the distance to the source. Hence the radiant intensity is constant w.r.t. distance, while the radiant flux density decreases with the square of the distance to the source.
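This preservation argument is easy to check numerically: for an isotropic point source the same flux passes through every sphere around it, so flux density falls as 1/r² while intensity per steradian stays constant. A minimal sketch, with an assumed 1 W source:

```python
import math

# An isotropic point source radiating a total flux of 1 W
# (assumed example value).
flux = 1.0  # radiant flux [W]

def flux_density(r):
    # flux spread over a sphere of radius r: W per square metre
    return flux / (4 * math.pi * r**2)

def radiant_intensity():
    # flux per unit solid angle: W per steradian, independent of r
    return flux / (4 * math.pi)

print(flux_density(1.0))    # density at 1 m
print(flux_density(2.0))    # 4x smaller at twice the distance
print(radiant_intensity())  # constant w.r.t. distance
```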

The radiometric chain (figures): light travels from the light source to the sensor via one or more surfaces; in general there can be several light sources, reflections between several surfaces, and a medium that the light passes through.

Interaction between light and matter: Most types of light-matter interaction can be represented by two parameters: n = the material's refractive index, and α = the material's absorption coefficient. Both parameters depend on λ. More complex interactions include polarization effects or non-linear effects.

BREAK

Light incident upon a surface: When light meets a surface, some part of it is transmitted through the new medium (possibly with another speed and direction), some part of it is absorbed by the new medium (usually the light energy is transformed to heat), and some part of it is reflected. For the same material, all three effects depend on the light's wavelength (equivalently, on the light's frequency).

Basic principle: Based on preservation of energy: E₀ = E₁ + E₂ + E₃, where E₀ = incoming energy, E₁ = transmitted energy, E₂ = reflected energy, E₃ = absorbed energy.

Refraction: The light that is transmitted into the new medium is refracted due to the change in light speed. Snell's law of refraction: sin α₂ / sin α₁ = n₁/n₂ = c₂/c₁, where α₁ is the angle of incidence and α₂ the angle of refraction.
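Snell's law translates directly into a few lines of Python. The sketch below refracts a ray from air into glass; the indices (air n = 1.0, glass n = 1.5) are standard textbook values, assumed here for illustration:

```python
import math

# Snell's law: sin(a2)/sin(a1) = n1/n2, so sin(a2) = sin(a1)*n1/n2.
def refract(a1, n1, n2):
    # a1: incidence angle [rad]; returns refraction angle a2 [rad],
    # or None on total internal reflection (|sin(a2)| > 1)
    s = math.sin(a1) * n1 / n2
    return math.asin(s) if abs(s) <= 1.0 else None

a1 = math.radians(30.0)
a2 = refract(a1, 1.0, 1.5)        # air -> glass
print(math.degrees(a2))           # ~19.47 deg: bends towards the normal
print(refract(math.radians(80.0), 1.5, 1.0))  # glass -> air at 80 deg:
                                              # None (total internal reflection)
```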

Absorption: Absorption implies attenuation of transmitted or reflected light. Materials get their colors as a result of different amounts of absorption for different wavelengths. Example: a green object attenuates wavelengths in the green band less than in other bands.

Absorption: The absorption of light in matter depends on the length that the light travels through the material: a = e^(−αx), where a = attenuation of the light (0 ≤ a ≤ 1), α = the material's absorption coefficient, and x = the length that the light travels in the material.
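A short sketch of this exponential attenuation law; the coefficient α = 2 m⁻¹ is an arbitrary example value. Note the multiplicative property: doubling the path length squares the attenuation factor.

```python
import math

# Exponential attenuation a = exp(-alpha*x)
# (alpha is an assumed example value).
def attenuation(alpha, x):
    # fraction of light remaining after travelling length x [m]
    return math.exp(-alpha * x)

alpha = 2.0                       # absorption coefficient [1/m]
print(attenuation(alpha, 0.0))    # 1.0: nothing absorbed at zero length
print(attenuation(alpha, 0.5))    # e^-1 ~ 0.368
print(attenuation(alpha, 1.0))    # (e^-1)^2 ~ 0.135: double path, squared factor
```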

Absorption spectrum: The spectrum of the reflected/transmitted light is given by s₂(λ) = s₁(λ)·a(λ), where s₁ = incident spectrum, s₂ = reflected/transmitted spectrum, and a = absorption spectrum (0 ≤ a(λ) ≤ 1).

Reflection: Highly dependent on the surface type. Mirror: the incident ray is reflected at the same angle α on the other side of the surface normal. Lambertian surface: light is reflected equally much in all directions, independent of α. A real surface is often a mix between the two cases.

Emission: Independent of its interaction with incident light (well, almost): any object, even one that is not considered a light source, emits electromagnetic radiation, primarily in the IR band, based on its temperature. More on this in the lecture on IR sensors.

Scattering: All media (other than vacuum) scatter light. Examples: air, water, glass. We can think of the medium as consisting of small particles that with some probability reflect the light, in any possible direction and with different probability for different directions. It is a weak effect, roughly proportional to λ⁻⁴. In general, the probability also depends on the distribution of particle sizes.


Scattering: Scattering is not an absorption. It rather means that the light ray does not travel along a straight line through the medium: there is a probability that a certain photon exits the medium in another direction than it entered. Examples: the sky is blue because of scattering of the sunlight; a strong laser beam becomes visible in air.

The plenoptic function: At a point x = (x₁, x₂, x₃) in space we can measure how much light energy travels in the direction n = (n₁, n₂, n₃), |n| = 1.

The plenoptic function: The plenoptic function is the corresponding radiance intensity function p(x, n) (5-dimensional, since x is 3-dimensional and n has 2 degrees of freedom). It can also be a function of frequency ν and time t: p(x, n, ν, t) (7-dimensional), and possibly of polarization.

A light camera: A (light) camera is a device that samples the plenoptic function in a particular way. Different types of cameras sample in different ways: pinhole camera, orthographic camera, push-broom camera, light-field camera.

The pinhole camera: The most common camera model is the pinhole camera (Swedish: hålkamera), an ideal model of the camera obscura.

The pinhole camera model: Each point in the image plane is illuminated by a single ray passing through the aperture, the opening in the camera front through which all light enters the camera. The image plane is where we measure the image. For an ideal pinhole camera, the aperture is a single point.

The pinhole camera model: Mathematically we only need to know the locations of the image plane and the aperture; the rest is physics + practical implementation. In fact, it suffices to know the aperture (why?). In the literature, the aperture point is also called the camera center or the camera focal point.

The pinhole camera model: The image plane and the camera center define a camera-centered coordinate system (x₁, x₂, x₃): x₁, x₂ are parallel to the image plane, and x₃ (the principal or optical axis) is perpendicular to the plane and defines the viewing direction of the camera. P = (x₁, x₂, x₃) is a point in 3D space, Q = (y₁, y₂) is the projection of P, and f = the focal distance, the distance between the image plane and the camera center.

The pinhole camera model: R is the point where the optical axis intersects the image plane, called the principal point or the image center. The (x₁, x₂) plane is the principal plane or focal plane. The line through P and the camera center is the projection line of point P: all points on this line are projected onto Q. Alternatively: it is the projection line of Q.

The pinhole camera model: If we look at the camera coordinate system along the x₂ axis, two similar triangles give: −y₁/f = x₁/x₃, or y₁ = −f·x₁/x₃.

The pinhole camera model: Looking along the x₁ axis gives a similar expression for y₂. This can be summarized as: (y₁, y₂) = −(f/x₃)·(x₁, x₂).
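The projection formula above is a one-liner in code. A minimal sketch (the focal distance and the 3D point are assumed example values); note the sign flip that produces the inverted image:

```python
# Pinhole projection (y1, y2) = -(f/x3)*(x1, x2), a direct
# transcription of the formula above (units arbitrary).
def project(point, f):
    x1, x2, x3 = point
    return (-f * x1 / x3, -f * x2 / x3)

f = 0.05                      # focal distance, 50 mm (assumed)
P = (0.2, 0.1, 2.0)           # a point 2 m in front of the camera (assumed)
Q = project(P, f)
print(Q)                      # (-0.005, -0.0025): inverted image coordinates
```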

The virtual image plane: The projected image is rotated 180° relative to how we see the 3D world (reflection in both the y₁ and y₂ coordinates = rotation). It must be de-rotated before we can view it. In a film-based camera, the image is rotated manually; in a digital camera this is taken care of by reading out the pixels in the rotated order. Mathematically, this is equivalent to placing the image plane in front of the focal point.

The virtual image plane: Projection lines work as before: from P through the focal point, intersecting the virtual image plane at Q. This defines the virtual image plane. It cannot be realized in practice, but it produces the same image as the rotated image from the real image plane, and it is easier to draw.

The virtual image plane: P = a point in 3D space, Q = the projection of P onto the virtual image plane, O = the camera focal point. Here (y₁, y₂) = (f/x₃)·(x₁, x₂).

Lenses vs. infinitesimal aperture: The pinhole camera model doesn't work in practice: if we make the aperture small, too little light enters the camera; if we make the aperture larger, the image becomes blurred. Solution: we replace the aperture with a lens, or a system of lenses.

Thin lenses: The thin lens is the simplest model of a lens. It focuses all points in an object plane, at distance a from the lens, onto the image plane at distance b.

The object plane: The object plane consists of all points that appear sharp when projected through the lens onto the image plane. The object plane is an ideal model of where the sharp points are located; in practice it may be non-planar, e.g., described by the surface of a sphere. The shape of the object plane depends on the quality of the lens (or lens system). For thin lenses the object plane can often be approximated as a plane.

Thin lenses: The thin lens is characterized by a single parameter, the focal length f_L: 1/a + 1/b = 1/f_L. To change a (the distance to the object plane), we need to change b, since f_L is constant. Note that a = ∞ for b = f_L!
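The thin lens equation can be solved for the image distance b given the object distance a. A minimal sketch (f_L = 50 mm is an assumed example value), including the a = ∞ special case:

```python
# Thin lens equation 1/a + 1/b = 1/f_L, solved for b.
def image_distance(a, f_L):
    # b = 1 / (1/f_L - 1/a); a = inf gives 1/a = 0, so b = f_L
    return 1.0 / (1.0 / f_L - 1.0 / a)

f_L = 0.05                                # 50 mm focal length (assumed)
print(image_distance(2.0, f_L))           # ~0.0513 m: just beyond f_L
print(image_distance(float("inf"), f_L))  # f_L: parallel rays focus at b = f_L
```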

TSBB09 Image Sensors. Image Formation Part 2

Diffraction limited systems: Due to the wave nature of light, even when various lens effects are eliminated, light from a single 3D point cannot be focused to an arbitrarily small point if it has passed an aperture. For coherent light, Huygens' principle applies: treat the incoming light as a set of point light sources. This gives a diffraction pattern at the image plane.

Diffraction limited systems: Assume an ideal lens with aperture size D. Because of diffraction, a point source infinitely far away (a plane wave) will not be focused onto a single point in the image plane.

Diffraction limited systems: Example in 1D: let x′ = vertical position in the aperture.

Diffraction limited systems: Each point in the aperture, at position x′, acts as a wave source. In the image plane, at position x, each point source contributes a wave that has a phase difference Δφ = 2π·x′·sin θ / λ relative to the contribution from the centre of the aperture (assuming x << f). θ is the angle from the image point x to the aperture, and assuming that θ is small it follows that sin θ ≈ tan θ = x/f. We get: Δφ ≈ 2π·x′·x / (λf).

Diffraction limited systems: The wave function is everywhere characterized by its magnitude A and phase φ. As is common with sinusoidal signals, we can represent the wave function mathematically by a complex number: Ψ = A·e^(iφ). A phase shift Δφ corresponds to multiplying by e^(iΔφ).

Diffraction limited systems: The principle of superposition means that the resulting wave function at the image plane is a sum/integral of the contributions from the different light sources. For the incoming plane wave, we set its amplitude to 1 and phase φ = 0. The resulting wave function is then Ψ(x) = ∫ Ψ(x′)·e^(iΔφ) dx′ = ∫ rect(x′/D)·e^(2πi·x′·x/(fλ)) dx′ (compare with the Fourier transform) = fλ·sin(πxD/(fλ)) / (πx).
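This superposition integral can be checked numerically: sum the point-source contributions across a slit of width D and verify that the pattern has its first zero at x = fλ/D. The lens parameters below are assumed example values, not from the lecture:

```python
import numpy as np

# Numerical evaluation of the 1D diffraction integral: superpose
# waves exp(i*dphi), dphi = 2*pi*x'*x/(lambda*f), from point
# sources across the aperture (assumed example parameters).
wavelength = 500e-9   # lambda = 500 nm
f = 0.05              # lens focal length, 50 mm
D = 1e-3              # aperture width, 1 mm

n = 200001
xp = np.linspace(-D / 2, D / 2, n)   # x' positions across the aperture
dxp = xp[1] - xp[0]

def psi(x):
    # Riemann-sum approximation of the superposition integral
    phase = 2 * np.pi * xp * x / (wavelength * f)
    return np.sum(np.exp(1j * phase)) * dxp

x0 = f * wavelength / D              # predicted first zero: 25 um here
print(abs(psi(0.0)))                 # ~D: all contributions in phase
print(abs(psi(x0)) / abs(psi(0.0)))  # ~0: destructive interference
```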

Diffraction limited systems: This phenomenon generalizes to 2D: the resulting wave function is the 2D Fourier transform of the incoming spatial amplitude (as a function of x′). Example: a circular aperture of diameter D, with the input amplitude normalized to 1/f, Ψ̃(r′) = (1/f)·rect(r′/D), gives Ψ(r) = J₁(πrD/(fλ)) / (πrD/(fλ)), where J₁ is the first-order Bessel function.

The Airy disk (figure): light from a single 3D point passes the camera front and the lens and produces, in the image plane, the Airy disk: the diffraction image of a point projected through a circular aperture.

The Airy disk: The smallest resolvable distance in the image plane, Δx, is given by the distance to the first zero point of Ψ: Δx = 1.22·fλ/D, where f = lens focal length, D = lens diameter, and λ = light wavelength.

The Airy disk: Conclusions: the image cannot have a better resolution than Δx, so there is no need to measure the image with a higher resolution than Δx! Be aware of cameras with high pixel resolution but high diffraction: image resolution is not defined by the number of pixels in the camera!

The point spread function: The Airy disk is also called the point spread function, blur disk, or circle of confusion (related: the modulation transfer function, MTF). In general, the point spread function can be related to several effects that make the image of a point appear blurred: diffraction, lens imperfections, and imperfections in the position of the image plane. It is often modeled as constant over the image, but can be variable for poor optical systems.

Depth of field: We have now placed a lens at the aperture. Points that are off the object plane become blurred, proportionally to the displacement from the object plane. Due to the point spread function, it makes sense to accept blur on the order of Δx; this blur will be there anyway due to diffraction. The depth of field d is the displacement along the optical axis from the object plane that gives blur Δx.

Depth of field (figure): with aperture D, object distance a and image distance b related by 1/a + 1/b = 1/f_L, insert a′ = a − d/2 to get the horizontal blur (b′ − b), and then relate the horizontal blur to the vertical blur Δx.

Depth of field: For a camera where a < ∞, an approximation (assuming d << a) for d is d ≈ 2Δx·a·(a − f_L) / (D·f_L), where a = distance from the lens to the object plane, f_L = lens focal length, D = lens diameter, Δx = required image plane resolution, and d = depth of field.

Depth of field: For a lens where a = ∞, points that are further away than d_min are blurred less than Δx, where d_min = f_L·D / (4Δx).

The F-number: f_L/D is the F-number of the lens or lens system. Example: a typical F-number of a camera is 8. For blue light of 420 nm wavelength, the Airy disk diameter is Δx = 1.22·F·λ ≈ 4 µm. For a lens with f_L = 15 mm we get d ≈ 0.6 m at a = 1.5 m, and d_min ≈ 1.8 m at a = ∞. This means that the depth of field is within a manageable range.
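The example's numbers can be reproduced from the formulas Δx = 1.22·F·λ, d ≈ 2Δx·a(a − f_L)/(D·f_L) and d_min = f_L·D/(4Δx). A sketch (exact values come out slightly different from the slide's rounded 0.6 m and 1.8 m):

```python
# Reproducing the F-number example: F = 8, lambda = 420 nm,
# f_L = 15 mm, object plane at a = 1.5 m.
F = 8.0
wavelength = 420e-9
f_L = 15e-3
D = f_L / F                    # aperture diameter from the F-number

dx = 1.22 * wavelength * F     # Airy disk size: ~4.1 um
a = 1.5                        # object plane distance [m]
d = 2 * dx * a * (a - f_L) / (D * f_L)   # depth of field: ~0.65 m
d_min = f_L * D / (4 * dx)               # for a = inf: ~1.72 m

print(dx, d, d_min)
```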

Lens distortion: A lens or a lens system can never map straight lines in the 3D scene exactly to straight lines in the image plane. Depending on the lens type, a square pattern will typically appear like a barrel or a pincushion.

Lens distortion (figure): barrel distortion, no distortion, pincushion distortion.

Radial lens distortion: This effect is called lens distortion (geometric distortion) and can, in the simplest case, be modeled as a radial distortion. If (y₁, y₂) = r·(cos φ, sin φ) is the correct image coordinate, the real (observed) image coordinate is (y₁′, y₂′) = h(r)·(cos φ, sin φ): the observed positions of points in the image are displaced in the radial direction, relative to the image center, compared to what the pinhole camera model describes.

Radial lens distortion: h is approximately a linear function with some non-linear deviation, e.g. a low-order polynomial in r. The deviation from a linear function usually grows with r. Once modeled, we can compensate for the distortion.
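As an illustration, the sketch below uses one common choice of h: the polynomial model h(r) = r·(1 + k₁r² + k₂r⁴). This particular model and the coefficient values are assumptions for the example, not calibrated parameters or necessarily the model used in the course:

```python
import math

# Radial distortion with an assumed polynomial model
# h(r) = r*(1 + k1*r**2 + k2*r**4); k1 < 0 gives barrel distortion.
k1, k2 = -0.12, 0.01   # made-up example coefficients

def distort(y1, y2):
    # map a correct image coordinate to its observed position:
    # same direction (cos phi, sin phi), radius r replaced by h(r)
    r = math.hypot(y1, y2)
    if r == 0.0:
        return (y1, y2)
    h = r * (1 + k1 * r**2 + k2 * r**4)
    return (h * y1 / r, h * y2 / r)

print(distort(0.0, 0.0))   # the image centre is unaffected
print(distort(1.0, 0.0))   # a point at r = 1 moves inwards (barrel)
```

Compensating for the distortion amounts to inverting this mapping, which is why invertibility matters when choosing h.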

Lens distortion: Which distortion function h is used depends on the type of lens and other practical considerations: the number of parameters, and invertibility. More complicated distortion models include angle-dependent distortion. Cheap lenses give significant distortion; almost no distortion requires expensive lenses.

Vignetting: Even if the light that enters the camera is constant in all directions, the image plane will receive different amounts of illumination. This effect is called vignetting.

Vignetting: Sometimes used as a photographic effect, but it is usually unwanted. It can be compensated for in digital cameras. (Figure: image from a digital camera with a very light lens.)

Mechanical vignetting (figure): light from a larger solid angle emitted from point A is focused at one image point, while light from a smaller solid angle emitted from point B is focused at another, so the image of B receives less light.

The cos⁴ law: We can see the aperture as a light source, in the form of a small area that illuminates the image plane. For an image point at angle α off the optical axis: the flux density decreases with the square of the distance to the light source, giving a factor cos²α; the effective area of the detector relative to the aperture varies as cos α; and the effective area of the aperture relative to the detector varies as cos α.

The cos⁴ law: This effect exists also in lens-based cameras. It means that, in general, there is an attenuation of the image towards the edges, approximately according to cos⁴α. It can be compensated for in a digital camera.
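The attenuation predicted by the cos⁴ law is easy to tabulate. A minimal sketch (the 30° field angle is just an example value):

```python
import math

# cos^4 law: relative irradiance at image field angle alpha,
# combining the cos^2 distance factor with the two cos(alpha)
# effective-area factors.
def falloff(alpha_deg):
    return math.cos(math.radians(alpha_deg)) ** 4

print(falloff(0.0))    # 1.0 at the image centre
print(falloff(30.0))   # ~0.56 towards the edge: a noticeable darkening
```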

Chromatic aberration: The refractive index of matter (lenses) is wavelength dependent. Example: a prism can decompose light into its spectrum. A ray of white light is decomposed into rays of different colors that intersect the image plane at different points.

Chromatic aberration: Sometimes clearly visible if you look close to the edges through a pair of glasses.