Dr F. Cuzzolin. September 29, 2015


P00407 Principles of Computer Vision
Department of Computing and Communication Technologies, Oxford Brookes University, UK
September 29, 2015

Outline of the Lecture
1. Introduction
2. Basics of perspective: effects of perspective; orthographic projection
3. Intrinsic camera parameters; extrinsic camera parameters
4. Lenses vs pinhole camera model: depth of field; a bit of physics of light; Lambertian surfaces and BRDF
5. Basics of camera optics: the thin lens model; aperture and exposure regulation
6. Sensors and images: range and plenoptic cameras

Recalling what computer vision is about (see our discussion in Week 1)
- human vision allows us to observe the environment and interact with it (e.g. explore a new place, grasp an object)
- the purpose of computer vision is to reproduce this behaviour in intelligent machines...
- ... in order to allow them to interact with the environment in the same successful way
- the first step in any computer vision process: acquire images and videos we can process and reason upon!

What are images then?
- images form the fundamental data on which vision operates
- they are (possibly continuous) collections of dots of light
- however, machines can only work with digital images: grids of "pixels" with a certain number of rows and columns
- each pixel is characterised by a level of brightness (usually in the range [0, 255])
- colour images have three separate colour channels
- images are acquired through cameras, using lenses which focus light rays onto sensors
- this process can be described by the perspective model

What you will learn today
- what digital images are, and the mechanism by which they are generated (image formation)
- how the two main models, perspective (or "pinhole camera") and thin lens, work
- what it means to calibrate a camera, and what its intrinsic and extrinsic parameters are
- how lenses work, and the physical principles of refraction and reflection
- how BRDFs describe the reflectance properties of surfaces
- how sensors work, and what their main working parameters are


Plane perspective: main elements
- perspective is derived from the physical construction of early cameras; the mathematics is very straightforward
- points in 3D space with coordinates (X, Y, Z) are mapped to an image plane
- origin of the coordinate axes: the optical centre

Plane perspective: how does it work
- the distance from the optical centre to the image plane is called the focal length
- each 3D (scene) point (X, Y, Z) is mapped to a point (x, y) on the image plane...
- ... by tracing a ray through the optical centre until it intersects the image plane

Plane perspective: image vs world coordinates
- the relation between coordinates in 3D and in the image plane comes from the geometry of the process
- it can be obtained from the similarity of the two triangles:

    y / f = Y / Z,   so that   y = f Y / Z

Plane perspective: alternative representation
- an alternative representation puts the image plane in front of the optical centre
- nevertheless, the geometry stays the same

Example of coordinate calculation
- if a camera has a focal length of 50 mm, how (vertically) big is the camera image of a man 2 m tall, standing 12 m away?
- using the similarity relation (in millimetres) we get:

    y = f Y / Z = 50 x 2000 / 12000 = 8.3 mm
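The worked example above is easy to check with a few lines of Python (the function name is illustrative, not from the slides):

```python
def projected_size_mm(focal_mm, object_size_mm, distance_mm):
    """Pinhole similarity relation: y = f * Y / Z (all lengths in mm)."""
    return focal_mm * object_size_mm / distance_mm

# a 2 m tall man, 12 m away, through a 50 mm lens
y = projected_size_mm(50, 2000, 12000)   # ~8.33 mm
```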

Effects of perspective: point mapping
- a 3D point maps to a single image point, BUT...
- ... the inverse mapping of an image point is an entire ray in 3D

Effects of perspective: mapping of lines
- a 3D line maps to an image line, BUT...
- ... the inverse mapping of an image line is an entire plane in 3D

Effects of perspective: midpoints
- midpoints are also not preserved in the image plane
- bottom line: perspective introduces deformations, therefore images are a deformed representation of reality!

Effects of perspective: mapping of circles
- circles are mapped to ellipses, with different eccentricities
- the rotation of a circle in the scene changes the eccentricity of the projected ellipse

Effects of perspective: vanishing points
- parallel lines in the 3D scene appear to disappear (in the image) at a point on the horizon: the vanishing point
- this mechanism was only understood in the 15th century
- example: Masaccio, Trinity, Santa Maria Novella, Florence, 1425-28
- http://webexhibits.org/hockneyoptics/post/tyler2.html

One vanishing point
- if in the 3D scene there is a single set of parallel lines...
- ... then there is only one vanishing point

Two vanishing points
- if in the 3D scene there are two sets of parallel lines...
- ... there are also two vanishing points

Some conclusions
- acquiring images is necessary to understand the surrounding environment and interact with it
- however, we need to keep in mind that the perspective mechanism is not a faithful representation of the external world
- information is lost when passing from a 3D space to a 2D one
- as we will see later, we can compensate for this by acquiring more images from different viewpoints, and/or in time!

Optical illusions
- perspective generates all sorts of optical illusions: http://www.moillusions.com/

Orthographic projection
- the geometry of perspective is complex, leading to non-linear equations
- the model can be simplified by assuming all rays are parallel: the resulting projection is orthographic
- e.g., a telephoto lens approximates orthographic projection


Placing the camera in the world
- problem: we need to place the camera in the wider world (with respect to its environment)
- to do this we need to be able to model:
  - translation t, a 3D vector
  - rotation R, a 3x3 matrix
  - (R and t together give the configuration of the camera in the world)
  - perspective (pinhole) projection (how world points are mapped to image points)

Pinhole camera in formulas: homogeneous coordinates
- recall the pinhole model: (X, Y, Z) -> (fX/Z, fY/Z)
- is this a linear transformation? No! The division by Z is nonlinear
- then it cannot be expressed via matrices (see Week 1, P00405)
- little trick: we can add one more coordinate (just a constant 1 at the bottom, really), obtaining homogeneous coordinates
- projection becomes a matrix multiplication in homogeneous coordinates:

    ( fX )   ( f 0 0 0 ) ( X )
    ( fY ) = ( 0 f 0 0 ) ( Y )
    ( Z  )   ( 0 0 1 0 ) ( Z )
                         ( 1 )

  or, x = PX
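The homogeneous-coordinates trick can be sketched in NumPy; the focal length and scene point below are arbitrary illustration values, not from the slides:

```python
import numpy as np

f = 50.0  # focal length (illustrative value, in mm)
P = np.array([[f, 0, 0, 0],
              [0, f, 0, 0],
              [0, 0, 1, 0]])  # pinhole projection matrix in homogeneous coordinates

X = np.array([100.0, 2000.0, 12000.0, 1.0])  # homogeneous scene point (X, Y, Z, 1)
x = P @ X                                    # = (f*X, f*Y, Z)
u, v = x[0] / x[2], x[1] / x[2]              # back to image coordinates: divide by Z
```

The nonlinear division by Z is deferred to the very last step, so the projection itself is a single linear map.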

The camera coordinate system
- the main elements of the pinhole model are:
  - principal axis: the line from the camera centre perpendicular to the image plane
  - (camera) coordinate system: the camera centre is at the origin, and the principal axis is the z-axis
  - principal point (p): the point where the principal axis intersects the image plane (origin of the normalized coordinate system)

The calibration matrix
- to take care of the fact that image coordinates are measured from the top-left corner, we need to offset by the principal point:

    (X, Y, Z) -> (fX/Z + p_x, fY/Z + p_y)

  where p = (p_x, p_y) are the coordinates of the principal point
- in matrix form:

    ( fX + Z p_x )   ( f 0 p_x 0 ) ( X )
    ( fY + Z p_y ) = ( 0 f p_y 0 ) ( Y )
    ( Z          )   ( 0 0 1   0 ) ( Z )
                                   ( 1 )

- the matrix P = K [I 0], where K is the 3x3 block formed by the first three columns of P, is called the calibration matrix

Pixel coordinates
- K can be expressed in pixels by multiplying by the appropriate number of pixels per unit length (e.g., per millimetre or per inch):

    ( m_x 0   0 ) ( f 0 p_x )   ( α_x 0   β_x )
    ( 0   m_y 0 ) ( 0 f p_y ) = ( 0   α_y β_y )
    ( 0   0   1 ) ( 0 0 1   )   ( 0   0   1   )

- this gives the complete picture of the intrinsic camera parameters:
  - principal point coordinates
  - focal length
  - pixel magnification factors
  - skew (non-rectangular pixels)
  - radial distortion

Rotation and translation
- the camera coordinate frame is related to the world coordinate frame by a rotation and a translation
- we can first write X_cam = R(X - C), where X_cam are the scene coordinates in the camera reference frame, X are the coordinates in the world reference frame, and C are the coordinates of the camera centre in the world frame

Extrinsic parameters in homogeneous coordinates
- in homogeneous coordinates this reads as

    X_cam = ( R  -RC ) ( X )
            ( 0   1  ) ( 1 )

- so that x = K [I 0] X_cam = K [R  -RC] X = PX, where P = K [R t], t = -RC
- R and t are the so-called extrinsic camera parameters
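Putting intrinsics and extrinsics together, a minimal sketch of the full camera matrix P = K [R | -RC]; the numeric values of K, R and C below are hypothetical, chosen only to illustrate the composition:

```python
import numpy as np

def camera_matrix(K, R, C):
    """P = K [R | -RC]: maps homogeneous world points to homogeneous image points."""
    t = -R @ C                                # translation from the camera centre C
    return K @ np.hstack([R, t.reshape(3, 1)])

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])           # hypothetical intrinsics (pixel units)
R = np.eye(3)                                 # camera axes aligned with the world frame
C = np.array([0.0, 0.0, -10.0])               # camera centre 10 units behind the origin

P = camera_matrix(K, R, C)
x = P @ np.array([0.0, 0.0, 0.0, 1.0])        # project the world origin
u, v = x[0] / x[2], x[1] / x[2]               # lands on the principal point (320, 240)
```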

Camera calibration
- camera calibration: given n points with known 3D coordinates X_i and known image projections x_i, estimate the camera parameters
- we cannot do this from one image! Depth is lost: the camera matrix can only tell us the ray through an image point x
- we will talk about it later, when discussing multiple view geometry


Pinhole camera model
- perspective projection is often called the pinhole camera model, for it models a very simple cardboard camera with a pinhole through which light can pass
- the pinhole model is often used to model imaging processes in vision

Using lenses
- however, a pinhole camera has a serious flaw: very little light is captured to form the image
- only photons coming from a single direction are collected at each image point
- lenses are thus used to improve image intensity, as they collect light from originally diverging directions

Focus
- for a given lens, the ratio of scene size X to image size x is fixed: X/x = const
- thus an object is in focus only if it is at a certain distance X in front of the lens
- for different distances X' != X the image goes out of focus

Depth of field
- the range of distances for which the focusing is reasonably good is called the depth of field
- distances are measured from the lens

Objects at infinity
- for objects at infinity (a long way off, really)...
- ... light rays are parallel to the lens axis...
- ... and are focussed at the shortest distance behind the lens

Circle of confusion: objects out of focus
- for all scene points far from the optimal distance X, light rays are not focussed onto the image plane, but a bit before or after
- this generates a circle on the image plane, rather than a point: the circle of confusion (blurring)
- the human eye detects blur for circles above about 0.33 mm in diameter

Field of view
- lenses can be made with a large or small field of view
- a typical camera has a field of view of 25 to 40 degrees
- a wide-angle lens has a short focal length and a wide field of view
- a telephoto lens has the reverse

How a lens works: refraction
- lenses work by changing the direction of incoming light rays, in order to focus them onto the image plane
- light rays are bent as they enter the lens by a physical process called refraction

Refraction
- refraction is described by the refractive index: µ = sin i / sin r (Snell's law), where µ = n2/n1 is a function of the nature of the two media (e.g. air and glass)
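Snell's law can be sketched numerically; the indices n1 = 1.0 (air) and n2 = 1.5 (glass) below are typical textbook values rather than figures from the slides:

```python
import math

def refraction_angle(incidence_deg, n1=1.0, n2=1.5):
    """Snell's law: n1 sin(i) = n2 sin(r). Default indices assume air -> glass."""
    sin_r = n1 * math.sin(math.radians(incidence_deg)) / n2
    return math.degrees(math.asin(sin_r))

r = refraction_angle(30.0)   # the ray is bent towards the normal, ~19.5 degrees
```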

Refractive index
- as a ray passes from a less dense to a more dense medium, it is bent towards the perpendicular
- the refractive index can be seen as the factor by which the speed and the wavelength of the radiation are reduced with respect to their vacuum values
- the refractive index varies with the wavelength of light ("dispersion")

Electromagnetic radiation
- light is an energy wave in the visible part of the electromagnetic spectrum
- it can be described as both a set of particles ("photons") and as an energy wave fluctuating in space
- the energy wave itself is invisible: light is only apparent when it hits something and is reflected back to hit our retina

Brightness, illumination and reflectance
- the intensity or brightness F(x, y) = I(x, y) R(x, y) of a point (x, y) of a surface S is determined by:
  - the illumination I(x, y) at the point (the amount of light hitting that spot)
  - the reflectance R(x, y) of the surface S at the point (x, y)
- light can be reflected in several ways depending on surface properties
- e.g. specular reflection is the mirror-like property of some surfaces, such as polished metals; the reflected light retains the colour of the original light source

Diffuse reflection
- the other case: diffuse reflection disperses the reflected light in many directions
- the surface absorbs some frequencies (colours) while reflecting others: the apparent colour of the surface is determined by the frequencies which are not absorbed

Chromatic aberration or dispersion
- different frequencies in the spectrum travel at different speeds
- the refractive index of two materials varies with the frequency (colour) of the incoming light
- light rays of different colours are bent by different angles
- http://www.vanwalree.com/optics/chromatic.html

Lambertian surfaces
- a particular point P on a Lambertian (perfectly matte) surface appears to have the same brightness no matter what angle it is viewed from
- examples: a piece of paper, matte paint
- the apparent brightness does not depend on the viewing angle
- what does this say about how they emit light?

Cosine law for Lambertian surfaces
- in Lambertian surfaces, the relative magnitude of light scattered in each direction is proportional to cos(θ), where θ is the angle of scatter
- this is called the cosine law
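The cosine law is a one-liner; a minimal sketch (function name and peak intensity I0 are illustrative):

```python
import math

def lambertian_intensity(i0, scatter_angle_deg):
    """Cosine law: intensity scattered at angle theta is I0 * cos(theta)."""
    return i0 * math.cos(math.radians(scatter_angle_deg))

lambertian_intensity(1.0, 0.0)    # maximum along the surface normal
lambertian_intensity(1.0, 60.0)   # half the peak intensity at 60 degrees
```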

Bidirectional Reflectance Distribution Function (BRDF)
- in general, how light is reflected at an opaque surface is described by a function of four variables called the bidirectional reflectance distribution function, or BRDF
- the function takes an incoming light direction ω_i and an outgoing direction ω_r (taken in a coordinate system where the surface normal n lies along the z-axis)...
- ... and returns the ratio of reflected radiance exiting along ω_r to the irradiance incident on the surface from direction ω_i

BRDF parameterization
- each direction ω is itself parameterized by an azimuth angle φ and a zenith angle θ, so the BRDF as a whole, ρ(θ_r, φ_r, θ_i, φ_i), is a function of four variables
- http://escience.anu.edu.au//cg/GlobalIllumination/BRDF.en.html
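For a perfectly matte (Lambertian) surface the four-variable BRDF collapses to a constant; a minimal sketch, where the albedo/π normalisation is the standard energy-conserving convention and not a figure from the slides:

```python
import math

def lambertian_brdf(theta_r, phi_r, theta_i, phi_i, albedo=1.0):
    """A Lambertian surface has the simplest possible BRDF: the constant
    albedo / pi, independent of all four angles (kept in the signature
    only to match the general form rho(theta_r, phi_r, theta_i, phi_i))."""
    return albedo / math.pi
```

The 1/π factor ensures that integrating the reflected radiance over the hemisphere returns exactly the albedo, so the surface never reflects more energy than it receives.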

MERL's BRDF database
- the MERL BRDF database contains reflectance functions of 100 different materials
- each reflectance function is stored as a densely measured BRDF


Basics of camera optics
- a pinhole camera will have everything in focus for every object in the field of view (there is no lens!)
- the smaller the area of the camera lens ("aperture"), the better the focus (i.e. it is less important how distant the object is)
- this however lets less light onto the sensor, so there is a relationship between aperture and shutter speed (how long you need to keep the hole open to acquire enough light)
- this in turn is related to depth of field
- these basic principles apply to both film and digital cameras
- the size of the hole letting light into a camera is controlled by the aperture: the smaller the aperture, the more similar the camera is to a pinhole camera, and the more of the image is in focus

- the smaller the aperture, the smaller the amount of light entering the camera, but the larger the depth of field
- the film then has to be exposed for longer: exposure time increases
- this creates problems with camera shake and subject movement
- aperture is indicated by an F-number: high numbers indicate a smaller aperture
- aperture, depth of field and shutter speed are all linked

Thin lens formula and camera constant
- thin lens formula (in "Gaussian form"):

    1/o + 1/i = 1/f,   i.e.   i = f o / (o - f)

  where o = distance of the object (scene) from the optical origin, i = distance of the image plane from the optical origin (the "camera constant"), f = focal length of the camera
- when o -> infinity (far away objects), then i -> f
- i = f means that objects at infinity are always in focus
- simple cameras have fixed i (that is why it is called a "constant")

Focussing
- when o changes, the image goes out of focus
- this can be partly avoided by having a small lens and aperture, allowing only a limited amount of light onto the film/sensor
- opening the shutter for longer will lead to camera-shake blurring
- in the old times, a fast (quick acting) film was used, but this had lower resolution (larger grains of the chemical silver nitrate compound)
- focussing is the process of altering i to accommodate scenes at different distances o from the optical origin

Numerical example: objects in focus/out of focus
- camera with 50 mm focal length, distance to object 20 m
- thin lens formula: 1/o + 1/i = 1/f, so i = f o / (o - f)

    i = 50 x 20000 / (20000 - 50) = 1000000 / 19950 = 50.13 mm

- for a distance to the object of 1 m:

    i = 50 x 1000 / (1000 - 50) = 50000 / 950 = 52.63 mm

- even when o changes by a large amount, i changes only by a small amount
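The thin lens computation above can be reproduced directly; a minimal Python sketch (function name illustrative):

```python
def image_distance(f_mm, object_distance_mm):
    """Thin lens formula 1/o + 1/i = 1/f, solved for i = f*o / (o - f)."""
    return f_mm * object_distance_mm / (object_distance_mm - f_mm)

image_distance(50, 20000)   # ~50.13 mm for an object 20 m away
image_distance(50, 1000)    # ~52.63 mm for an object 1 m away
```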

Aperture
- the camera can control how much light reaches the film by changing the area (aperture) of the lens, determined by the lens diameter
- aperture settings are described as F-numbers or F-stops: 22, 16, 11, 8, 5.6, 4, 2.8
- F-number = f/d, where f is the focal length and d the diameter of the aperture
- the larger the number, the smaller the aperture (size of the hole)
- opening up by one stop doubles the amount of light entering the camera
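F-numbers and the stop scale can be sketched as follows; the one-stop doubling comes out as only approximately 2 because standard F-stops are rounded values of powers of the square root of 2:

```python
def f_number(focal_mm, aperture_diameter_mm):
    """F-number = f / d: the larger the number, the smaller the aperture."""
    return focal_mm / aperture_diameter_mm

def relative_light(f_num):
    """Light gathered is proportional to the aperture area, i.e. to 1/F^2."""
    return 1.0 / f_num ** 2

# opening up one stop (F4 -> F2.8) roughly doubles the light entering the camera
ratio = relative_light(2.8) / relative_light(4.0)
```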

Manual regulation of aperture
- typical view of the top of a manual camera
- example: setting F = 5.6 corresponds to...
- ... a depth of field between 1.9 m and 3.5 m

Exposure
- the amount of light collected on a film (or sensor) depends on the intensity of the light and on the exposure time, or shutter speed
- a camera lets light in by exposing the film/sensor when a shutter (see the diagram on slide 39) is opened for a short period of time
- the amount of light energy ε falling onto an image is: ε (= EV) = E t, where E is the intensity of light and t is the shutter opening (exposure) duration

Exposure and aperture
- now, since F = f/d and the area A = π(d/2)^2 of the hole/aperture is proportional to the energy ε absorbed (quite intuitive)...
- ... we have that F is proportional to 1/sqrt(ε)
- conclusion: there is a constant relationship between aperture, exposure time and depth of field
- aperture (or exposure) can be fixed to allow exposure (or aperture) to be set to the photographer's preference, depending on the nature of the photograph
- you may consult http://www.nikonians.org/html/resources/guides/dof/hyperfocal1.html

Example: shrinking the aperture
Why not make the aperture as small as possible? Less light gets in, and diffraction effects appear. This is an example of what happens.

Example: the Pentax P30t
Graph of AE (auto-exposure) settings for the Pentax P30t camera: shutter/aperture metering range for a 50 mm, f/1.4 lens with ISO 100 film.

Example: the Pentax P30t
Regulation out of range: the aperture needs reducing.

Example: the Pentax P30t
The new regulation is inside the admissible range.

Outline of the Lecture
1 …
2 Basics of perspective; effects of perspective; orthographic projection
3 Intrinsic camera parameters; extrinsic camera parameters
4 Lenses vs the pinhole camera model; depth of field; a bit of the physics of light; Lambertian surfaces and BRDF
5 Basics of camera optics; the thin lens model; aperture and exposure regulation
6 Sensors; digital images; range and plenoptic cameras

Sensors: the basics
Machine vision equipment nowadays involves only digital cameras. Components of a digital camera:
- lens system
- aperture size control
- exposure speed control
- light-sensitive semiconductor sensor
- storage medium
CCD: Charge-Coupled Device. CMOS: Complementary Metal Oxide Semiconductor.

Sensors: Charge-Coupled Devices
CCDs are made up of discrete light-sensing elements - photo-sites - forming an array; the sites are arranged into a grid, or just a single line. Each site acts as an optoelectronic converter: it becomes electrically charged in direct proportion to the amount of light falling on it during the integration time. The charges from the sites are then linked, or coupled, for transfer out of the array. The details are outside the scope of this module: check U08884 Technology.

Charge-Coupled Devices: what do they look like?
Example: a 2.1 MP CCD from an HP camera (image licensed under the Creative Commons Attribution 3.0 License).

CCDs versus CMOS
CMOS devices are also very popular; the main difference is where the conversion from charge to voltage takes place. Courtesy of http://www.teledynedalsa.com/public/corp/CCD_vs_CMOS_Litwiller_2005.pdf

Digital images
Digital images are two-dimensional arrays of pixels. Pixel coordinates are denoted by (x, y), with x the row index and y the column index; e.g., 1280 rows × 768 columns. Grey-scale images: each pixel has a single brightness value in [0, 255]. Colour images: each pixel (x, y) is associated with three distinct values R(x, y), G(x, y), B(x, y); red, green and blue are the three basis colours of the RGB representation (there are others, such as LUV and CMYK; see http://www.cambridgeincolour.com/tutorials/color-spaces.htm).
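The two representations above can be sketched with NumPy arrays (NumPy is an assumption; the slides name no particular library):

```python
import numpy as np

rows, cols = 1280, 768  # row index x, column index y

# Grey-scale image: one brightness value per pixel, in [0, 255]
grey = np.zeros((rows, cols), dtype=np.uint8)
grey[0, 0] = 255  # brightest possible value at pixel (x=0, y=0)

# Colour image: three values R(x, y), G(x, y), B(x, y) per pixel
colour = np.zeros((rows, cols, 3), dtype=np.uint8)
colour[0, 0] = (255, 0, 0)  # a pure red pixel

print(grey.shape)    # (1280, 768)
print(colour.shape)  # (1280, 768, 3)
```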

Range cameras
Time-of-flight cameras work on a principle similar to radar: they measure the time of flight of a light signal between the camera and the subject, for each point of the image. The most famous example is the Kinect.
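A toy sketch of the time-of-flight principle (not any real sensor's pipeline): the light travels out and back, so depth = c · t / 2 for a measured round-trip time t.

```python
C = 299_792_458.0  # speed of light in m/s

def depth_from_tof(round_trip_seconds):
    """Depth from a round-trip time-of-flight measurement: c * t / 2."""
    return C * round_trip_seconds / 2

# A return after ~13.3 ns corresponds to a subject about 2 m away:
print(depth_from_tof(13.3e-9))  # ~1.99 (metres)
```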

Plenoptic cameras
Plenoptic cameras are also called light-field cameras, for they measure an entire light field: they capture information not only about the intensity of light in a scene, but also about the direction in which the light rays are travelling through space. They typically use an array of micro-lenses placed in front of an otherwise conventional image sensor; the micro-lenses separate the converging rays into an image on the photosensor behind them.

Plenoptic cameras: example image
A plenoptic image is a collection of tiny images, collected at different focal lengths. Check this amazing demo: http://blog.theincredibleholk.org/plenopticjs/