Colorado School of Mines. Computer Vision. Professor William Hoff, Dept. of Electrical Engineering & Computer Science. http://inside.mines.edu/~whoff/


Sensors and Image Formation
- Imaging sensors and models of image formation
- Coordinate systems

Digital Images
Digital images are stored as arrays of numbers. The numbers can represent:
- Intensity (gray level, or each color band)
- Range
- X-ray absorption coefficient
- etc.
(Figure: an example grayscale image shown alongside its array of intensity values.)
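As a minimal sketch of this idea (NumPy assumed; the intensity values are arbitrary), a grayscale image is simply a 2D array of numbers:

```python
import numpy as np

# A toy 4x4 grayscale image: each entry is an 8-bit intensity value.
img = np.array([[164,   5,  47,  56],
                [181,  13, 162,  62],
                [186, 176, 172,  13],
                [158,   4,  78, 118]], dtype=np.uint8)

print(img.shape, img.dtype)   # (4, 4) uint8
print(img[0, 2])              # intensity at row 0, column 2
```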

Intensity Image Sensors
Basic elements of an imaging device:
- Aperture: an opening (or pupil) that limits the amount of light and the angle of incoming light rays
- Optical system: lenses, whose purpose is to focus light from a scene point to a single image point
- Imaging photosensitive surface: film or sensors, usually a plane
(Figure: image plane, optical system, aperture, and optical axis.)

Biological Vision: A Guide to Developing?
- Unfortunately, it is pre-attentive, and its detailed function is difficult to measure and study. Some information comes from experiments with animals (electrophysiology) and with people's perception (psychophysics).
- A great deal of processing is done right in the retina: data reduction from ~100 million receptors down to ~1 million optic nerve channels.
- Further processing is done in the visual cortex: specialized detectors for motion, shape, color, and binocular disparity, and evidence for maps in the cortex in correspondence to the image.
- Still, it is an existence proof that vision can be done, and done well.
Notes: the eyeball is about 20 mm in diameter; the retina contains both rods and cones; the fovea is about 1.5 mm in diameter and contains about 337,000 cones; the focal length is about 17 mm.

Digital Camera
The image plane is a 2D array of sensor elements.
- CCD type (charge-coupled device): charge accumulates during exposure; charges are transferred out to shift registers, digitized, and read out sequentially.
- CMOS type (complementary metal-oxide semiconductor): light affects the conductivity (or gain) of each photodetector; values are digitized and read out using a multiplexing scheme.
Main design factors: number and size of sensor elements, chip size, ADC resolution.

Thin Lens
- Rays parallel to the optical axis are deflected to go through the focus.
- Rays passing through the center are undeflected.
- A world point P is in focus at the image point p.
- Equation of a thin lens: 1/Z + 1/z' = 1/f, where Z is the distance to P and z' is the distance to p.
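A minimal sketch of using the thin lens equation: given the object distance Z and the focal length f, solve for the image distance z' at which the point comes into focus (the function name and example numbers are illustrative):

```python
def image_distance(Z_m, f_m):
    """Return z' (meters) from the thin lens equation 1/Z + 1/z' = 1/f."""
    return 1.0 / (1.0 / f_m - 1.0 / Z_m)

# Example with assumed values: a 50 mm lens and an object 5 m away.
print(image_distance(5.0, 0.050))   # ~0.0505 m, slightly beyond the focal length
```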

Pinhole Camera Model
- A good lens can be modeled by a pinhole camera; i.e., each ray from the scene passes undeflected to the image plane.
- Simple equations describe the projection of a scene point onto the image plane ("perspective projection").
- We will use the pinhole camera model exclusively, except a little later in the course where we model lens distortion in real cameras.
- The pinhole camera ("camera obscura") was used by Renaissance painters to help them understand perspective projection.

Perspective Projection Equations
- For convenience (to avoid an inverted image), we treat the image plane as if it were in front of the pinhole.
- We define the origin of the camera's coordinate system at the pinhole (note this is a 3D XYZ coordinate frame); the XYZ coordinates of a point are with respect to the camera origin, and f is the focal length.
- By similar triangles, a scene point P(X,Y,Z) projects to the image point p(x,y) with x = f X/Z, y = f Y/Z.
- Field of view θ: tan(θ/2) = (w/2)/f, where w is the width of the image plane.
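A small sketch of these equations in Python (function names are illustrative; use consistent units for X, Y, Z, f, and w):

```python
import math

def project(X, Y, Z, f):
    """Pinhole projection: camera-frame point (X, Y, Z) -> image-plane point (x, y)."""
    return f * X / Z, f * Y / Z

def field_of_view_deg(w, f):
    """Full field of view (degrees) for an image plane of width w and focal length f."""
    return 2.0 * math.degrees(math.atan2(w / 2.0, f))
```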

Camera vs. Image Plane Coordinates
Camera coordinate system {C}:
- A 3D coordinate system (X,Y,Z), with units say in meters
- Origin at the center of projection
- The Z axis points outward along the optical axis; X points right, Y points down
Image plane coordinate system {π}:
- A 2D coordinate system (x,y), with units in mm
- Origin at the intersection of the optical axis with the image plane
- In real systems, this is where the CCD or CMOS plane is
The projection x = f X/Z, y = f Y/Z maps a point P(X,Y,Z) in {C} to its image in {π}.

Examples
Assume focal length f = 5 mm.
- A scene point is located at (X,Y,Z) = (1 m, 2 m, 5 m). What are the image plane coordinates (x,y) in mm?
  x = f X/Z = (5 mm)(1 m)/(5 m) = 1 mm, y = f Y/Z = (5 mm)(2 m)/(5 m) = 2 mm
- If the image plane is 10 mm x 10 mm, what is the field of view?
  tan(θ/2) = (w/2)/f = 5/5 = 1, so θ/2 = 45 deg and the field of view is 90 x 90 deg
- A building is 100 m wide. How far away do we have to be in order that it fills the field of view?
  tan(45 deg) = (W/2)/Z = 50/Z, so Z = 50 m
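A quick numerical check of these examples, as a self-contained sketch (values in meters):

```python
import math

f = 0.005                                        # focal length: 5 mm
print(f * 1.0 / 5.0, f * 2.0 / 5.0)              # x = 0.001 m, y = 0.002 m -> (1 mm, 2 mm)

w = 0.010                                        # 10 mm image plane width
print(2 * math.degrees(math.atan2(w / 2, f)))    # 90.0 degrees field of view

# Building that just fills the 90-degree view: tan(45 deg) = (W/2)/Z with W = 100 m
print((100.0 / 2.0) / math.tan(math.radians(45.0)))   # Z = 50.0 m
```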

Image Buffer
Image plane {π}:
- The real image is formed on the CCD plane
- (x,y) coordinates, with units in mm
- Origin at the center (the principal point)
Image buffer {I}:
- The digital (or pixel) image, with (row, col) indices; we can also use (x_im, y_im)
- Origin (1,1) in the upper left, indexed by row (or y_im) and column (or x_im)
- Image center at (c_x, c_y)


Conversion Between Real Image and Pixel Image Coordinates
Assume:
- The image center (principal point) is located at pixel (c_x, c_y) in the pixel image
- The spacing of the pixels is (s_x, s_y) in millimeters
Then:
x = (x_im - c_x) s_x,  y = (y_im - c_y) s_y
x_im = x/s_x + c_x,  y_im = y/s_y + c_y
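A minimal sketch of these two conversions (parameter names follow the slide and are illustrative, not a library API):

```python
def pixel_to_mm(x_im, y_im, cx, cy, sx, sy):
    """Pixel coordinates -> image-plane coordinates in mm."""
    return (x_im - cx) * sx, (y_im - cy) * sy

def mm_to_pixel(x, y, cx, cy, sx, sy):
    """Image-plane coordinates in mm -> pixel coordinates."""
    return x / sx + cx, y / sy + cy
```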

Example
A star is located at pixel (r,c) = (0,200) in a telescope image. What is the 3D unit vector pointing at the star?
Assume:
- The image is 00x00 pixels
- The optical center is in the center of the pixel image
- The CCD plane is mm x mm
- The focal length is 1 m
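A sketch of how such a unit vector could be computed; the specific numbers used below (a 1000x1000-pixel image, a 10 mm x 10 mm CCD, the star at row 100, column 200, focal length 1 m) are assumed values for illustration only:

```python
import math

def pixel_to_unit_vector(row, col, n_rows, n_cols, ccd_w_mm, ccd_h_mm, f_mm):
    """Return the unit vector (camera frame) toward the scene point seen at (row, col)."""
    sx = ccd_w_mm / n_cols                 # mm per pixel, horizontal
    sy = ccd_h_mm / n_rows                 # mm per pixel, vertical
    cx, cy = n_cols / 2.0, n_rows / 2.0    # principal point assumed at the image center
    x = (col - cx) * sx                    # image-plane coordinates in mm
    y = (row - cy) * sy
    v = (x, y, f_mm)                       # ray direction in the camera frame
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

# Assumed example values, not the slide's exact numbers.
print(pixel_to_unit_vector(100, 200, 1000, 1000, 10.0, 10.0, 1000.0))
```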

Note on Focal Length
Recall x = (x_im - c_x) s_x and y = (y_im - c_y) s_y, or x_im = x/s_x + c_x and y_im = y/s_y + c_y, and x = f X/Z, y = f Y/Z.
So x_im = (f/s_x) X/Z + c_x and y_im = (f/s_y) Y/Z + c_y.
All we really need is f_x = f/s_x and f_y = f/s_y. We don't need to know the actual values of f and s_x, s_y, just their ratios; we can alternatively express the focal length in units of pixels.
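A sketch of projecting directly to pixel coordinates once the focal length is expressed in pixels (names and example values are illustrative):

```python
def project_to_pixels(X, Y, Z, fx, fy, cx, cy):
    """Camera-frame point -> pixel coordinates, using focal lengths in pixels."""
    return fx * X / Z + cx, fy * Y / Z + cy

# Example with assumed values: fx = fy = 1000 px, principal point at (320, 240).
print(project_to_pixels(1.0, 2.0, 5.0, 1000.0, 1000.0, 320.0, 240.0))  # (520.0, 640.0)
```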

Example
A camera observes a rectangle 1 m away. The rectangle is known to be 20cm x cm; in the image, the rectangle measures 200 x 0 pixels.
- What are the focal length in x and the focal length in y, in pixels?
- If the image size is 640x480 pixels, what is the field of view (horizontal, vertical)?
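A sketch of one way to work this example, assuming the rectangle is 20 cm x 10 cm and spans 200 x 100 pixels in the image (these sizes are assumptions):

```python
import math

Z = 1.0                                  # distance to the rectangle, meters
fx = 200 * Z / 0.20                      # width_px = f_x * W / Z  ->  f_x in pixels
fy = 100 * Z / 0.10                      # height_px = f_y * H / Z ->  f_y in pixels
print(fx, fy)                            # 1000.0 1000.0

fov_h = 2 * math.degrees(math.atan2(640 / 2, fx))   # ~35.5 degrees horizontal
fov_v = 2 * math.degrees(math.atan2(480 / 2, fy))   # ~27.0 degrees vertical
print(fov_h, fov_v)
```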

Camera Parameters
Intrinsic parameters:
- The parameters needed to relate an image point (in pixels) to a direction in the camera frame: f_x, f_y, c_x, c_y
- Also lens distortion parameters (discussed later)
Extrinsic parameters:
- Define the position and orientation (pose) of the camera in the world
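As a sketch, the intrinsic parameters can be grouped into a small structure together with the mapping they define, from a pixel to a viewing direction in the camera frame (the class and function names are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Intrinsics:
    fx: float   # focal length in pixels, x
    fy: float   # focal length in pixels, y
    cx: float   # principal point, x (pixels)
    cy: float   # principal point, y (pixels)

def pixel_direction(K: Intrinsics, x_im, y_im):
    """Direction (not normalized) in the camera frame for pixel (x_im, y_im)."""
    return (x_im - K.cx) / K.fx, (y_im - K.cy) / K.fy, 1.0
```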

Frames of Reference
- Image frames (the frame buffer or pixel image {I}, and the image plane or real image {π}) are 2D; the {Camera}, {Model}, and {World} frames are 3D.
- The pose (position and orientation) of a 3D rigid body has 6 degrees of freedom.
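A sketch of those 6 degrees of freedom: 3 for rotation and 3 for translation, commonly packed into a 4x4 homogeneous transform (NumPy assumed; the example values are arbitrary):

```python
import numpy as np

def pose_matrix(R, t):
    """Build a 4x4 transform from a 3x3 rotation R and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Example (assumed values): camera 2 m above the world origin, no rotation.
print(pose_matrix(np.eye(3), [0.0, 0.0, 2.0]))
```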