Dr F. Cuzzolin — September 29, 2015


1 P00407 Principles of Computer Vision — Department of Computing and Communication Technologies, Oxford Brookes University, UK — September 29, 2015

2 Outline of the Lecture
2. Basics of perspective; Effects of perspective; Orthographic projection
3. Intrinsic camera parameters; Extrinsic camera parameters
4. Lenses vs pinhole camera model; Depth of field; A bit of physics of light; Lambertian surfaces and BRDF
5. Basics of camera optics; The thin lens model; Aperture and exposure regulation
6. Sensors; Digital images; Range and plenoptic cameras

3 Recalling what computer vision is about (see our discussion in Week 1): human vision allows us to observe the environment and interact with it (e.g. explore a new place, grasp an object); the purpose of computer vision is to reproduce this behaviour in intelligent machines, in order to allow them to interact with the environment in the same successful way; the first step in any computer vision process is to acquire images and videos we can process and reason upon!

4 What are images then? Images form the fundamental data on which vision operates; they are (possibly continuous) collections of dots of light; however, machines can only work with digital images - grids of "pixels" with a certain number of rows and columns; each pixel is characterised by a level of brightness (usually in the range [0,255]); colour images have three separate colour channels; images are acquired through cameras, using lenses which focus light rays onto sensors; this process can be described by the perspective model.

5 What you will learn today: what digital images are, and the mechanism by which they are generated (image formation); how the two main models - perspective (or pinhole camera) and thin lens - work; what it means to calibrate a camera, and what its intrinsic and extrinsic parameters are; how lenses work, and the physical principles of refraction and reflection; how BRDFs describe the reflectance properties of surfaces; how sensors work, and what their main working parameters are.

6 Outline of the Lecture (next: Basics of perspective; Effects of perspective; Orthographic projection)

7 Basics of perspective - Plane perspective: the main elements are derived from the physical construction of early cameras - the mathematics is very straightforward; points in the 3D space with coordinates X, Y, Z are mapped to an image plane; the origin of the coordinate axes is the optical center.

8 Basics of perspective - Plane perspective - How does it work: the distance from the optical center to the image plane is called the focal length; each 3D (scene) point (X, Y, Z) is mapped to a point (x, y) on the image plane by tracing a ray through the optical center until it intersects the image plane.

9 Basics of perspective - Plane perspective - Image vs world coordinates: the relation between coordinates in 3D and in the image plane comes from the geometry of the process; it can be obtained from the similarity of the two triangles: y/f = Y/Z, so y = fY/Z.

10 Basics of perspective - Plane perspective - Alternative representation: puts the image plane in front of the (optical) center of projection; nevertheless, the geometry stays the same.

11 Basics of perspective - Example of coordinate calculation: if a camera has a focal length of 50mm, how (vertically) big is the camera image of a man 2m tall, standing 12m away? Using the similarity relation (in millimeters) we get y = fY/Z = (50 × 2000)/12000 ≈ 8.3mm.
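A minimal sketch of the similar-triangles calculation above, in Python (the function name and the use of millimetres throughout are my own choices, not from the slides):

```python
# Minimal sketch (not from the slides): the similar-triangles projection
# y = f * Y / Z, with all quantities expressed in millimetres.

def project_height(f_mm: float, Y_mm: float, Z_mm: float) -> float:
    """Image-plane size of an object of height Y at distance Z, focal length f."""
    return f_mm * Y_mm / Z_mm

# The example from the slide: 50 mm lens, 2 m tall person, 12 m away.
print(project_height(50.0, 2000.0, 12000.0))  # ~8.33 mm
```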

12 Effects of perspective - Point mapping: a 3D point maps to a single image point, BUT the inverse mapping of an image point is an entire ray in 3D.

13 Effects of perspective - Mapping of lines: a 3D line maps to an image line, BUT the inverse mapping of an image line is an entire plane in 3D.

14 Effects of perspective - Midpoints: midpoints are also not preserved in the image plane; bottom line: perspective introduces deformations, therefore images are a deformed representation of reality!

15 Effects of perspective - Mapping of circles: circles are mapped to ellipses, with different eccentricities; the rotation of a circle in the scene changes the eccentricity of the projected ellipse.

16 Effects of perspective - Vanishing points: parallel lines in the 3D scene appear to converge (in the image) to a point on the horizon - the vanishing point; this mechanism was only understood in the 15th century (Masaccio, Trinity, Santa Maria Novella, Florence).

17 Effects of perspective - One vanishing point: if in the 3D scene there is a single set of parallel lines, then there is only one vanishing point.

18 Effects of perspective - Two vanishing points: if in the 3D scene there are two sets of parallel lines, there are also two vanishing points.

19 Effects of perspective - Some conclusions: acquiring images is necessary to understand the surrounding environment and interact with it; however, we need to keep in mind that the perspective mechanism is not a faithful representation of the external world; information is lost when passing from a 3D space to a 2D one; as we will see later, we can compensate for this by acquiring more images from different viewpoints, and/or in time!

20 Effects of perspective - Optical illusions: perspective generates all sorts of optical illusions.

21 Orthographic projection: the geometry of perspective is complex, leading to non-linear equations; the model can be simplified by assuming all rays are parallel - the projection is then orthographic; e.g., a telephoto lens approximates an orthographic projection.

22 Outline of the Lecture (next: Intrinsic camera parameters; Extrinsic camera parameters)

23 Intrinsic and extrinsic camera parameters - the problem: we need to place the camera in the wider world (with respect to its environment); to do this we need to be able to model: the translation t, a 3D vector; the rotation R, a 3x3 matrix (R and t together give the configuration of the camera in the world); the perspective (pinhole) projection (how world points are mapped to image points).

24 Intrinsic camera parameters - Pinhole camera in formulas, homogeneous coordinates: recall the pinhole model: (X, Y, Z) -> (fX/Z, fY/Z). Is this a linear transformation? No! The division by Z is nonlinear, so it cannot be expressed via matrices (see Week 1, P00405). Little trick: we can add one more coordinate (just a constant 1 at the bottom, really), obtaining homogeneous coordinates; the mapping then becomes a matrix multiplication: (X, Y, Z, 1) -> (fX, fY, Z) = [f 0 0 0; 0 f 0 0; 0 0 1 0] (X, Y, Z, 1), or x = PX.
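As an illustration of the homogeneous-coordinate trick, a small numpy sketch (the focal length and the test point are made-up values, not from the slides):

```python
import numpy as np

f = 50.0
P = np.array([[f, 0, 0, 0],
              [0, f, 0, 0],
              [0, 0, 1, 0]])          # 3x4 pinhole projection matrix

X = np.array([1.0, 2.0, 10.0, 1.0])   # scene point (X, Y, Z, 1) in homogeneous coordinates
x = P @ X                              # = (fX, fY, Z)
x_img = x[:2] / x[2]                   # divide by the last coordinate to get (fX/Z, fY/Z)
print(x_img)                           # [ 5. 10.]
```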

25 Intrinsic camera parameters - Camera coordinate system: the main elements of the pinhole model are: the principal axis, the line from the camera center perpendicular to the image plane; the (camera) coordinate system, in which the camera center is at the origin and the principal axis is the z-axis; the principal point p, the point where the principal axis intersects the image plane (origin of the normalized coordinate system).

26 Intrinsic camera parameters - The camera matrix: to take care of the fact that image coordinates are measured from the top-left corner, we need to offset the principal point: (X, Y, Z) -> (fX/Z + p_x, fY/Z + p_y), where p = (p_x, p_y) are the coordinates of the principal point; in matrix form: (fX + Z p_x, fY + Z p_y, Z) = [f 0 p_x 0; 0 f p_y 0; 0 0 1 0] (X, Y, Z, 1); the matrix P decomposes as P = K [I 0], where K (its first three columns) is called the calibration matrix.

27 Intrinsic camera parameters - Pixel coordinates: K can be expressed in pixels by multiplying by the appropriate number of pixels per unit length (e.g., per millimeter or inch): K = [m_x 0 0; 0 m_y 0; 0 0 1] [f 0 p_x; 0 f p_y; 0 0 1] = [α_x 0 β_x; 0 α_y β_y; 0 0 1]. We now have a complete picture of the intrinsic camera parameters: principal point coordinates, focal length, pixel magnification factors, skew (non-rectangular pixels), radial distortion.
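A possible sketch of assembling K in pixel units as in the formula above; all numeric values are assumptions chosen for illustration only:

```python
import numpy as np

# Illustrative, assumed values: 50 mm focal length, 80 000 pixels per metre,
# principal point at the centre of a 1024 x 768 sensor.
f = 0.05
m_x, m_y = 80000, 80000
p_x, p_y = 0.0064, 0.0048

K = np.diag([m_x, m_y, 1]) @ np.array([[f, 0, p_x],
                                       [0, f, p_y],
                                       [0, 0, 1]])
# alpha_x = f*m_x, alpha_y = f*m_y, beta_x = m_x*p_x, beta_y = m_y*p_y
print(K)  # [[4000., 0., 512.], [0., 4000., 384.], [0., 0., 1.]]
```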

28 Extrinsic camera parameters - Rotation and translation: the camera coordinate frame is related to the world coordinate frame by a rotation and a translation; we can first write X_cam = R(X - C), where X_cam are the scene coordinates in the camera reference frame, X are the coordinates in the world reference frame, and C is the vector of coordinates of the camera center in the world frame.

29 Extrinsic camera parameters - In homogeneous coordinates: in homogeneous coordinates this reads as X_cam = [R -RC; 0 1] (X; 1), so that x = K [I 0] X_cam = K [R -RC] X = PX, where P = K [R t] and t = -RC; R and t are the so-called extrinsic camera parameters.
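To make the chain of formulas concrete, a hedged numpy sketch that builds P = K [R t] with t = -RC and projects a world point (the intrinsics, rotation and camera centre are invented for the example):

```python
import numpy as np

K = np.array([[1000, 0, 320],
              [0, 1000, 240],
              [0,    0,   1]], dtype=float)      # assumed intrinsics

theta = np.deg2rad(30)                           # camera rotated 30 degrees about the y-axis
R = np.array([[ np.cos(theta), 0, np.sin(theta)],
              [ 0,             1, 0            ],
              [-np.sin(theta), 0, np.cos(theta)]])
C = np.array([0.5, 0.0, -2.0])                   # assumed camera centre in world coordinates
t = -R @ C

P = K @ np.hstack([R, t[:, None]])               # 3x4 projection matrix P = K [R | t]

Xw = np.array([0.0, 0.0, 3.0, 1.0])              # a world point in homogeneous coordinates
x = P @ Xw
print(x[:2] / x[2])                              # pixel coordinates of its projection
```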

30 Camera calibration: given n points with known 3D coordinates X_i and known image projections x_i, estimate the camera parameters; we cannot do this from one image! Depth is lost: the camera matrix can only tell us the ray through an image point x; we will talk about it later when discussing multiple view geometry.

31 Outline of the Lecture (next: Lenses vs pinhole camera model; Depth of field; A bit of physics of light; Lambertian surfaces and BRDF)

32 Lenses vs pinhole camera model - Pinhole camera model: perspective projection is often called the pinhole camera model, for it models a very simple cardboard-box camera with a pinhole through which light can pass; the pinhole model is often used to model processes in vision.

33 Lenses vs pinhole camera model - Using lenses: however, a pinhole camera has a serious flaw: very little light is captured to form the image, as only photons coming from a single direction are collected at each image point; lenses are thus used to improve image intensity, as they collect light from originally diverging directions.

34 Depth of field - Focus: for a given lens the ratio of scene size X to image size x is fixed, X/x = const; thus an object is in focus only if it is at a certain distance X in front of the lens; for different distances X' ≠ X the image goes out of focus.

35 Depth of field: the range of distances for which the focusing is reasonably good is called the depth of field; distances are measured from the lens.

36 Depth of field - Objects at infinity: for objects at infinity (a long way off, really), light rays are parallel to the lens axis and are focussed at the shortest distance behind the lens.

37 Depth of field - Circle of confusion for objects out of focus: for all scene points far from the optimal distance X, light rays are not focussed onto the image plane, but a bit before or after it; this generates a circle on the image plane rather than a point: the circle of confusion (blurring); the human eye detects blur for circles above about 0.33mm in diameter.

38 Depth of field - Field of view: lenses can be made with a large or small field of view; a typical camera has a field of view of 25 to 40 degrees; a wide-angle lens has a short focal length and a wide field of view; a telephoto lens has the reverse.

39 A bit of physics of light - How a lens works: refraction: lenses work by changing the direction of incoming light rays, in order to focus them onto the image plane; light rays are bent as they enter the lens by a physical process called refraction.

40 A bit of physics of light - Refraction: refraction is described by the refractive index µ = sin i / sin r (Snell's law), where µ = n_2/n_1, the ratio of the refractive indices of the two media (e.g. air and glass), is a function of the nature of the two media.
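A small sketch of Snell's law in the form n_1 sin i = n_2 sin r, using standard textbook refractive indices rather than values from the lecture:

```python
import math

def refraction_angle(incidence_deg: float, n1: float = 1.0, n2: float = 1.5) -> float:
    """Angle of refraction (degrees) for a ray passing from medium 1 to medium 2."""
    i = math.radians(incidence_deg)
    return math.degrees(math.asin(n1 * math.sin(i) / n2))

# Air (n ~ 1.0) into glass (n ~ 1.5): the ray bends towards the normal.
print(refraction_angle(30.0))  # ~19.5 degrees
```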

41 A bit of physics of light - Refractive index: as a ray passes from a less dense to a more dense medium it is bent towards the perpendicular; the refractive index can be seen as the factor by which the speed and the wavelength of the radiation are reduced with respect to their vacuum values; the refractive index varies with the wavelength of light ("dispersion").

42 A bit of physics of light - Electromagnetic radiation: light is an energy wave in the visible part of the electromagnetic spectrum; it can be described both as a set of particles ("photons") and as an energy wave fluctuating in space; the energy wave itself is invisible - light is only apparent when it hits something and is reflected back to hit our retina.

43 A bit of physics of light - Brightness, illumination and reflectance: the intensity or brightness F(x, y) = I(x, y) R(x, y) of a point (x, y) of a surface S is determined by the illumination I(x, y) at the point (the amount of light hitting that spot) and the reflectance R(x, y) of the surface S at the point (x, y); light can be reflected in several ways depending on surface properties, e.g. specular reflection is the mirror-like behaviour of some surfaces such as polished metals, in which the reflected light retains the colour of the original light source.

44 A bit of physics of light - Diffuse reflection: the other case, diffuse reflection, disperses the reflected light in many directions; the surface absorbs some frequencies (colours) while reflecting others: the apparent colour of the surface is determined by the frequencies which are not absorbed.

45 A bit of physics of light - Chromatic aberration or dispersion: different frequencies in the spectrum travel at different speeds; the refractive index of two materials varies with the frequency (colour) of the incoming light, so light rays of different colour are bent by different angles.

46 Lambertian surfaces and BRDF - Lambertian surfaces: a particular point P on a Lambertian (perfectly matte) surface appears to have the same brightness no matter what angle it is viewed from; examples: a piece of paper, matte paint; the reflectance does not depend on the incident light angle; what does this say about how they emit light?

47 Lambertian surfaces and BRDF - Cosine law: for Lambertian surfaces the relative magnitude of light scattered in each direction is proportional to cos(θ), where θ is the angle of scatter; this is called the cosine law.
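A tiny sketch of the cosine law, assuming unit incident intensity (values are illustrative only):

```python
import math

def lambert_scatter(I0: float, theta_deg: float) -> float:
    """Relative magnitude of light scattered at angle theta from the surface normal."""
    return I0 * max(0.0, math.cos(math.radians(theta_deg)))

for theta in (0, 30, 60, 90):
    print(theta, round(lambert_scatter(1.0, theta), 3))  # 1.0, 0.866, 0.5, 0.0
```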

48 Lambertian surfaces and BRDF - Bidirectional Reflectance Distribution Function (BRDF): in general, how light is reflected at an opaque surface is described by a function of four variables called the bidirectional reflectance distribution function, or BRDF; the function takes an incoming light direction ω_i and an outgoing direction ω_r (taken in a coordinate system where the surface normal n lies along the z-axis), and returns the ratio of reflected radiance exiting along ω_r to the irradiance incident on the surface from direction ω_i.

49 Lambertian surfaces and BRDF - Bidirectional Reflectance Distribution Function (BRDF): each direction ω is itself parameterized by an azimuth angle φ and a zenith angle θ, so the BRDF as a whole, ρ(θ_r, φ_r, θ_i, φ_i), is a function of 4 variables.
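For illustration, a sketch of a function with the four-angle BRDF signature; for a Lambertian surface it reduces to the constant ρ_d/π, where the albedo ρ_d is an assumed parameter, not a value from the slides:

```python
import math

def brdf_lambertian(theta_r, phi_r, theta_i, phi_i, rho_d=0.8):
    """Ratio of reflected radiance along (theta_r, phi_r) to irradiance from (theta_i, phi_i)."""
    # For a perfectly matte surface the ratio is independent of all four angles.
    return rho_d / math.pi

print(brdf_lambertian(0.3, 1.0, 0.7, 2.0))  # same value for any pair of directions
```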

50 Lambertian surfaces and BRDF - MERL's BRDF database: the MERL BRDF database contains reflectance functions of 100 different materials; each reflectance function is stored as a densely measured Bidirectional Reflectance Distribution Function (BRDF).

51 Outline of the Lecture (next: Basics of camera optics; The thin lens model; Aperture and exposure regulation)

52 Basics of camera optics: a pinhole camera will have everything in focus for every object in the field of view (there is no lens!); the smaller the area of the camera lens ("aperture"), the better the focus (i.e. it is less important how distant the object is); this however lets less light onto the sensor, so there is a relationship between aperture and shutter speed (how long you need to keep the hole open to acquire enough light); this in turn is related to depth of field; the basic principles apply to both film and digital cameras; the size of the hole letting light into a camera is controlled by the aperture - the smaller the aperture, the more similar the camera is to a pinhole camera and the more of the image is in focus.

53 Basics of camera optics: the smaller the aperture, the smaller the amount of light entering the camera, but the larger the depth of field; the film then has to be exposed for longer - the exposure time increases - and you have problems with camera shake and subject movement; the aperture is indicated by an F-number - high numbers indicate a smaller aperture; aperture, depth of field and shutter speed are all linked.

54 The thin lens model - Thin lens formula and camera constant: the thin lens formula (in "Gaussian form") is 1/o + 1/i = 1/f, i.e. i = f·o/(o - f), where o = distance of the object (scene) from the optical origin, i = distance of the image plane from the optical origin (the "camera constant"), and f = focal length of the camera; when o tends to infinity (far-away objects), i tends to f; i = f means that objects at infinity are always in focus; simple cameras have a fixed i (that is why it is called a constant).

55 The thin lens model - Focussing: when o changes, the image goes out of focus; this can be partly avoided by having a small lens and aperture, allowing only a limited amount of light onto the film/sensor; opening the shutter for longer, however, will lead to camera-shake blurring; in the old times a fast (quick acting) film was used, but this had lower resolution (larger grains of the light-sensitive silver compound); focussing is the process of altering i to accommodate scenes at different distances o from the optical origin.

56 The thin lens model - Numerical example, objects in focus/out of focus: camera with 50mm focal length, distance to object 20m; by the thin lens formula 1/o + 1/i = 1/f, i = f·o/(o - f) = 50 × 20000/(20000 - 50) = 50.13mm; for a distance to the object of 1m, i = 50 × 1000/(1000 - 50) = 52.63mm; if o changes by a large amount, i changes only by a much smaller amount.
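A sketch reproducing the slide's numbers with the thin-lens formula (millimetres throughout; the helper name is my own):

```python
# Thin-lens formula: 1/o + 1/i = 1/f  =>  i = f*o / (o - f), all lengths in millimetres.
def image_distance(f_mm: float, o_mm: float) -> float:
    return f_mm * o_mm / (o_mm - f_mm)

print(round(image_distance(50.0, 20000.0), 2))  # object at 20 m -> i = 50.13 mm
print(round(image_distance(50.0, 1000.0), 2))   # object at 1 m  -> i = 52.63 mm
```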

57 Aperture and exposure - Aperture: a camera can control how much light reaches the film by changing the area (aperture) of the lens, determined by the lens diameter; aperture settings are described as F-numbers or F-stops: F-number = f/d, where f is the focal length and d the diameter of the aperture; the larger the number, the smaller the aperture (size of the hole); opening up by one stop doubles the amount of light entering the camera.
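A small sketch of the F-number relation and the one-stop rule; the 35.7 mm diameter is an assumed value chosen so that a 50 mm lens gives roughly F/1.4:

```python
import math

def f_number(focal_length_mm: float, aperture_diameter_mm: float) -> float:
    return focal_length_mm / aperture_diameter_mm

# Halving the aperture area (one stop less light) divides d by sqrt(2),
# i.e. multiplies the F-number by sqrt(2).
print(round(f_number(50.0, 35.7), 1))                  # ~1.4
print(round(f_number(50.0, 35.7 / math.sqrt(2)), 1))   # one stop down: ~2.0
```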

58 Aperture and exposure - Manual regulation of aperture: typical view of the top of a manual camera; example: setting F = 5.6 corresponds to a depth of field between 1.9m and 3.5m.

59 Aperture and exposure - Exposure: the amount of light collected on a film (or sensor) depends on the intensity of the light and on the exposure time, or shutter speed; a camera lets light in by exposing the film/sensor when a shutter (see the diagram on slide 39) is opened for a short period of time; the amount of light energy ε falling onto the image is ε (= EV) = E·t, where E is the intensity of light and t is the duration for which the shutter is open (the exposure time).

60 Aperture and exposure - Exposure and aperture: now, since F = f/d and the area A = π(d/2)² of the hole/aperture is proportional to the energy ε absorbed (quite intuitive), we have that F is proportional to 1/sqrt(ε); conclusion: there is a constant relationship between aperture, exposure time and depth of field; the aperture (or exposure) can be fixed to allow the exposure (or aperture) to be set to the photographer's preference, depending on the nature of the photograph.
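A sketch of the aperture/exposure trade-off implied above, taking the collected energy proportional to t/F² (the constant of proportionality and the example settings are illustrative assumptions):

```python
# Energy reaching the sensor: eps is proportional to A * t, and A is proportional
# to d^2 = (f/F)^2, so eps ~ t / F^2. Doubling F needs a 4x longer exposure.
def relative_exposure(F: float, t: float) -> float:
    return t / F**2

print(relative_exposure(4.0, 1 / 125))   # F/4 at 1/125 s
print(relative_exposure(8.0, 4 / 125))   # F/8 at 4/125 s -> same relative exposure
```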

61 Aperture and exposure regulation - Example: shrinking the aperture: why not make the aperture as small as possible? Less light gets in, and diffraction effects appear; the figure shows an example of what happens.

62 Aperture and exposure regulation - Example: the Pentax P30t, exposure and aperture: graph of the AE settings for the Pentax P30t camera; shutter/aperture metering range for a lens of f = 1.4 and diameter = 50mm with ISO 100 film.

63 Aperture and exposure regulation - Example: the Pentax P30t, exposure and aperture: here the regulation is out of range, so the aperture needs reducing.

64 Aperture and exposure regulation - Example: the Pentax P30t, exposure and aperture: the new regulation is inside the admissible range.

65 Outline of the Lecture (next: Sensors; Digital images; Range and plenoptic cameras)

66 Sensors - the basics: machine vision equipment nowadays involves only digital cameras; components of a digital camera: lens system, aperture size control, exposure speed control, light-sensitive semiconductor, storage medium; CCD: Charge-Coupled Device; CMOS: Complementary Metal Oxide Semiconductor.

67 Sensors - Charge-Coupled Devices: CCDs are made up of discrete light-sensing elements - photo-sites - forming an array; the sites are formed into a grid or just a single line; each site acts as an optoelectric converter: it becomes electrically charged in direct proportion to the amount of light falling on it during the integration time; the charges from the sites are then linked, or coupled, for transfer out of the array; the details are outside the scope of this module: check U08884 Technology.

68 Sensors - Charge-Coupled Devices - What do they look like: example: a 2.1 MP CCD from an HP camera (image licensed under the Creative Commons Attribution 3.0 License).

69 Sensors - CCDs versus CMOS: CMOS devices are also very popular; the main difference is where the conversion from charge to voltage takes place (courtesy CCD_vs_CMOS_Litwiller_2005.pdf).

70 Digital images: two-dimensional arrays of pixels; pixel coordinates are denoted by (x, y), with x the row index and y the column index, e.g. 1280 rows × 768 columns; grey-scale images: each pixel has a single brightness value in [0, 255]; colour images: each pixel (x, y) is associated with three distinct values R(x, y), G(x, y), B(x, y); red, green and blue are the three basis colours of the RGB representation (there are others, such as LUV, CMYK, ...).
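A minimal numpy sketch of these two layouts (dimensions and pixel values are made up):

```python
import numpy as np

grey = np.zeros((1280, 768), dtype=np.uint8)    # 1280 rows x 768 columns, values in [0, 255]
grey[100, 200] = 255                            # set one pixel to full brightness

rgb = np.zeros((1280, 768, 3), dtype=np.uint8)  # three channels: R, G, B
rgb[100, 200] = (255, 0, 0)                     # a pure red pixel
print(grey.shape, rgb.shape)
```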

71 Range and plenoptic cameras - Range cameras: time-of-flight cameras work on a principle similar to radar: they measure the time-of-flight of a light signal between the camera and the subject for each point of the image; most famous example: the Kinect.
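A sketch of the underlying arithmetic, assuming depth = c·t/2 for a measured round-trip time t (the 20 ns figure is illustrative, not from the slides):

```python
# Time-of-flight principle: the light signal travels to the subject and back,
# so the depth at a pixel is depth = c * t / 2.
C = 299_792_458.0  # speed of light, m/s

def depth_from_time_of_flight(round_trip_seconds: float) -> float:
    return C * round_trip_seconds / 2.0

print(depth_from_time_of_flight(20e-9))  # a 20 ns round trip -> ~3 m
```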

72 Range and plenoptic cameras - Plenoptic cameras: also called light-field cameras, for they measure an entire light field; they capture information about the intensity of light in a scene, but also about the direction in which the light rays are traveling in space; they typically use an array of micro-lenses placed in front of an otherwise conventional image sensor; the micro-lenses separate the converging rays into an image on the photosensor behind them.

73 Range and plenoptic cameras - Plenoptic cameras, example image: a plenoptic image is a collection of tiny images, collected at different focal lengths.


More information

Image formation - Cameras. Grading & Project. About the course. Tentative Schedule. Course Content. Students introduction

Image formation - Cameras. Grading & Project. About the course. Tentative Schedule. Course Content. Students introduction About the course Instructors: Haibin Ling (hbling@temple, Wachman 35) Hours Lecture: Tuesda 5:3-8:pm, TTLMAN 43B Office hour: Tuesda 3: - 5:pm, or b appointment Textbook Computer Vision: Models, Learning,

More information

Digital Image Processing COSC 6380/4393

Digital Image Processing COSC 6380/4393 Digital Image Processing COSC 6380/4393 Lecture 2 Aug 23 rd, 2018 Slides from Dr. Shishir K Shah, Rajesh Rao and Frank (Qingzhong) Liu 1 Instructor Digital Image Processing COSC 6380/4393 Pranav Mantini

More information

Lecture Notes 10 Image Sensor Optics. Imaging optics. Pixel optics. Microlens

Lecture Notes 10 Image Sensor Optics. Imaging optics. Pixel optics. Microlens Lecture Notes 10 Image Sensor Optics Imaging optics Space-invariant model Space-varying model Pixel optics Transmission Vignetting Microlens EE 392B: Image Sensor Optics 10-1 Image Sensor Optics Microlens

More information

Physics 1230 Homework 8 Due Friday June 24, 2016

Physics 1230 Homework 8 Due Friday June 24, 2016 At this point, you know lots about mirrors and lenses and can predict how they interact with light from objects to form images for observers. In the next part of the course, we consider applications of

More information

GEOMETRICAL OPTICS AND OPTICAL DESIGN

GEOMETRICAL OPTICS AND OPTICAL DESIGN GEOMETRICAL OPTICS AND OPTICAL DESIGN Pantazis Mouroulis Associate Professor Center for Imaging Science Rochester Institute of Technology John Macdonald Senior Lecturer Physics Department University of

More information

Lecture 19 (Geometric Optics I Plane and Spherical Optics) Physics Spring 2018 Douglas Fields

Lecture 19 (Geometric Optics I Plane and Spherical Optics) Physics Spring 2018 Douglas Fields Lecture 19 (Geometric Optics I Plane and Spherical Optics) Physics 262-01 Spring 2018 Douglas Fields Optics -Wikipedia Optics is the branch of physics which involves the behavior and properties of light,

More information

Geometry of Aerial Photographs

Geometry of Aerial Photographs Geometry of Aerial Photographs Aerial Cameras Aerial cameras must be (details in lectures): Geometrically stable Have fast and efficient shutters Have high geometric and optical quality lenses They can

More information

Cameras, lenses, and sensors

Cameras, lenses, and sensors Cameras, lenses, and sensors Reading: Chapter 1, Forsyth & Ponce Optional: Section 2.1, 2.3, Horn. 6.801/6.866 Profs. Bill Freeman and Trevor Darrell Sept. 10, 2002 Today s lecture How many people would

More information

CAMERA BASICS. Stops of light

CAMERA BASICS. Stops of light CAMERA BASICS Stops of light A stop of light isn t a quantifiable measurement it s a relative measurement. A stop of light is defined as a doubling or halving of any quantity of light. The word stop is

More information

Study guide for Graduate Computer Vision

Study guide for Graduate Computer Vision Study guide for Graduate Computer Vision Erik G. Learned-Miller Department of Computer Science University of Massachusetts, Amherst Amherst, MA 01003 November 23, 2011 Abstract 1 1. Know Bayes rule. What

More information

Digital Image Processing

Digital Image Processing Digital Image Processing Digital Imaging Fundamentals Christophoros Nikou cnikou@cs.uoi.gr Images taken from: R. Gonzalez and R. Woods. Digital Image Processing, Prentice Hall, 2008. Digital Image Processing

More information

1) An electromagnetic wave is a result of electric and magnetic fields acting together. T 1)

1) An electromagnetic wave is a result of electric and magnetic fields acting together. T 1) Exam 3 Review Name TRUE/FALSE. Write 'T' if the statement is true and 'F' if the statement is false. 1) An electromagnetic wave is a result of electric and magnetic fields acting together. T 1) 2) Electromagnetic

More information