Image Formation. Dr. Gerhard Roth. COMP 4102A Winter 2015 Version 3


Image Formation
Two types of images:
- Intensity image: encodes light intensities (passive sensor).
- Range (depth) image: encodes shape and distance; created by processing passive images or by an active sensor.
An intensity image is a function of three things:
- Optical parameters of the lens: lens type, focal length, field of view, angular aperture.
- Photometric (radiometric) parameters: type, direction, and intensity of the illumination; reflectance properties of the viewed surface; characteristics of the image sensor.
- Geometric parameters: type of projection, position and orientation of the camera.

Elements of a real imaging device
- Light rays come from the outside world and fall on the photoreceptors (in the eye, the retina).
- The aperture lets in light; its size can vary.
- The screen represents any sensor that can capture light, such as a photographic plate, a film negative, or an electronic sensor (a 2D array of pixels).
- The aperture is usually opened for only a small amount of time.

Pinhole camera
- The pinhole is the aperture in this case.
- Changing the pinhole size changes the amount of light that is let in.

Perspective Projection
Draughtsman Drawing a Lute, Albrecht Dürer, 1525.

Camera Obscura
Camera Obscura, Reinerus Gemma Frisius, 1544.
Camera obscura is Latin for "dark chamber".

Camera Obscura
Contemporary artist Madison Cawein rented studio space in an old factory building where many of the windows were boarded up or painted over. A random small hole in one of those windows turned one room into a camera obscura.

Photographic Camera
Photographic camera: Joseph Nicéphore Niépce, 1816.

First Photograph
First photograph on record, La Table Servie, obtained by Niépce in 1822.

Why Lenses?
Lenses gather more light from each scene point and also reduce blurring.

Why Lenses?
- Pinhole too big: many directions are averaged, blurring the image.
- Pinhole too small: diffraction effects blur the image.
- Generally, pinhole cameras are dark, because only a very small set of rays from any particular point reaches the screen.
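To make this size tradeoff concrete, the sketch below uses a common rule of thumb for a diffraction-limited pinhole diameter, d ≈ sqrt(2.44 λ f). The constant, the 50 mm distance, and the 550 nm wavelength are assumptions for illustration, not values from the slides.

```python
import math

def optimal_pinhole_diameter(focal_length_m, wavelength_m=550e-9):
    """Rule-of-thumb pinhole diameter balancing geometric and diffraction blur.

    Uses d ~ sqrt(2.44 * lambda * f); the constant varies by author,
    so treat the result as an order-of-magnitude estimate.
    """
    return math.sqrt(2.44 * wavelength_m * focal_length_m)

if __name__ == "__main__":
    f = 0.050  # 50 mm pinhole-to-screen distance (assumed example value)
    d = optimal_pinhole_diameter(f)
    print(f"Suggested pinhole diameter: {d * 1e3:.2f} mm")  # roughly 0.26 mm
```

A larger hole than this averages too many ray directions; a smaller one is dominated by diffraction, which is why lenses are preferred.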

Adding a lens
- A lens focuses light onto the film.
- Thin lens model: rays passing through the center are not deviated (the pinhole projection model still holds).
- All parallel rays converge to one point on a plane located at the focal length f.
(Slide by Steve Seitz)

Adding a lens
(Figure: object, lens, film, and the circle of confusion.)
- A lens focuses light onto the film.
- There is a specific distance at which objects are in focus; points at other distances project to a circle of confusion in the image.
- Changing the shape of the lens changes this distance.

Camera with Lens: Thin Lens Model
- Lens thickness is small compared to the focal length.
Basic properties:
1. Any ray entering the lens parallel to the axis on one side goes through the focal point on the other side.
2. Any ray entering the lens from the focal point on one side emerges parallel to the axis on the other side.

Thin Lens (three figure-only slides)

Fundamental Equation of Thin Lenses
(Figure: an object at distance Z in front of the lens and the image sensor at distance z behind it; f is the focal length, and z is labelled the effective focal length for that object distance.)
1/Z + 1/z = 1/f
Any point satisfying this equation is in focus. The proof uses similar triangles (PSF_l ~ ORF_l and QOF_r ~ spF_r in the slide's figure) and the fact that PS = QO and sp = OR.
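The slide's proof refers to labels in its figure, which is not reproduced here. The derivation below is a generic version of the same similar-triangles argument, with h the object height at distance Z and h' the image height at distance z (these symbols are introduced here for illustration, not taken from the slide).

```latex
% Central ray (undeviated through the lens centre):   h / Z = h' / z
% Parallel ray, refracted through the rear focal point F_r:   h / f = h' / (z - f)
\[
\frac{h'}{h} = \frac{z}{Z}
\qquad\text{and}\qquad
\frac{h'}{h} = \frac{z - f}{f}
\]
\[
\frac{z}{Z} = \frac{z - f}{f}
\;\Longrightarrow\;
zf = Zz - Zf
\;\Longrightarrow\;
\frac{1}{Z} + \frac{1}{z} = \frac{1}{f}.
\]
```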

Effect of changing the value of Z on z
- Look at the thin lens equation: 1/Z + 1/z = 1/f.
- As you move farther from the lens, Z increases. This affects the value of z, which is where that point is in focus on the other side of the lens.
- If Z goes to infinity, then z goes to f; as Z decreases toward f, z goes to infinity (points closer than f produce no real image).
- Thin lens applet: http://www.phys.hawaii.edu/~teb/java/ntnujava/lens/lens_e.html
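A minimal numeric check of this behaviour, assuming a 50 mm lens (the focal length and object depths are illustrative choices, not values from the slides):

```python
def focus_distance(Z, f):
    """Solve the thin lens equation 1/Z + 1/z = 1/f for z, the distance
    behind the lens at which a point at depth Z is in focus."""
    if Z <= f:
        raise ValueError("No real image forms for Z <= f")
    return 1.0 / (1.0 / f - 1.0 / Z)

f = 0.050  # 50 mm focal length (assumed example)
for Z in [0.1, 0.5, 1.0, 10.0, 1e6]:  # object depths in metres
    print(f"Z = {Z:>9.1f} m  ->  z = {focus_distance(Z, f) * 1e3:.2f} mm")
# As Z grows, z approaches f = 50 mm; as Z shrinks toward f, z blows up.
```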

Thin Lenses
- As the point goes to infinity, the focus distance z approaches f, the value for a pinhole camera.
- For a lens we can adjust the focus ring to move the lens and the aperture ring to change the aperture.
- Both of these adjustments affect what is called the depth of field (explained by the thin lens model).

Depth of field
- A point is acceptably in focus over a range of distances Z; this range of Z is called the depth of field.
- Depth of field changes with the lens focal length f.
- The in-focus region is the range where a point's blur is less than one pixel.

Depth of field (figure-only slide)

Depth of field
- A pinhole camera has infinite depth of field.
- The thin lens model implies a finite depth of field.
- Depth of field can be changed by changing the lens or the aperture.

Aperture size also affects depth of field
- Changing the aperture size changes the depth of field.
- The blurriness of out-of-focus objects depends on the aperture size.
- A larger aperture means a smaller depth of field, but it also lets in more light.

Varying the aperture
- Large aperture = small depth of field.
- Small aperture = large depth of field.

Nice depth of field effect (example image)

Depth of field
(Figure: aperture and film, with flower photos taken at f/5.6 and f/32.)
- Changing the aperture size affects depth of field.
- A smaller aperture increases the range in which the object is approximately in focus.
- Flower images from Wikipedia: http://en.wikipedia.org/wiki/depth_of_field
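As an illustration of why a smaller aperture (larger f-number) increases depth of field, the sketch below uses the standard thin-lens blur-circle geometry; the formula, the 50 mm lens, and the 2 m / 3 m distances are assumptions for illustration, not numbers from the slides.

```python
def focus_distance(Z, f):
    """Thin lens: image-side distance z for an object at depth Z."""
    return 1.0 / (1.0 / f - 1.0 / Z)

def blur_circle(Z, Z_focus, f, f_number):
    """Approximate blur-circle diameter on the sensor (metres) for a point
    at depth Z when the lens is focused at Z_focus.

    Geometry: the ray cone converging at z has diameter D at the lens,
    so its cross-section at the sensor plane z_f is D * |z - z_f| / z.
    """
    D = f / f_number                  # aperture diameter
    z = focus_distance(Z, f)          # where this point would be sharp
    z_f = focus_distance(Z_focus, f)  # where the sensor actually sits
    return D * abs(z - z_f) / z

f = 0.050      # 50 mm lens (assumed)
Z_focus = 2.0  # lens focused on an object 2 m away (assumed)
for N in (5.6, 32):
    c = blur_circle(Z=3.0, Z_focus=Z_focus, f=f, f_number=N)
    print(f"f/{N}: blur circle for a point at 3 m = {c * 1e6:.1f} um")
# The f/32 blur circle is about 5.7x smaller than at f/5.6: larger depth of field.
```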

Field of View (Zoom) (two figure-only slides)

FOV depends on focal length f
- Smaller FOV = larger focal length.

Field of view / focal length
- Large FOV, small f: camera close to the car.
- Small FOV, large f: camera far from the car.
- A small focal length gives a wide angle of view, but more perspective distortion.

Effect of a change in focal length
- Small f is wide angle; large f is telephoto.
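The focal length / field of view relationship can be made concrete with the usual pinhole geometry, FOV = 2·arctan(sensor width / (2 f)). The 36 mm (full-frame) sensor width and the sample focal lengths below are assumptions for illustration.

```python
import math

def field_of_view_deg(focal_length_mm, sensor_width_mm=36.0):
    """Horizontal field of view in degrees: FOV = 2 * atan(w / (2 f))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_length_mm)))

for f in (18, 50, 200):  # wide angle, "normal", telephoto (example values)
    print(f"f = {f:3d} mm  ->  FOV = {field_of_view_deg(f):5.1f} deg")
# Small f gives a wide angle of view; large f gives a narrow (telephoto) view.
```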

Zoom Lens (figure-only slide)

Camera parameters
- Focus: shifts the depth that is in focus. Controlled by the focus ring, a ring on the lens body that moves the lens elements.
- Focal length: adjusts the zoom, i.e., wide angle versus telephoto. Internally a mechanical assembly of lens elements. A fixed focal length (prime) lens offers only a single focal length.
- Aperture: adjusts the depth of field and the amount of light reaching the sensor. Controlled by changing the f-stop.
- Exposure time: how long the image is exposed. A longer exposure gathers more light but can cause motion blur.
- ISO: adjusts the sensitivity of the film; essentially a gain setting on digital cameras. Increasing ISO also increases noise.
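A small sketch of how the f-stop ties together focal length, aperture diameter, and light gathering, using the standard definition N = f / D; the 50 mm lens and the list of stops are example values, not from the slides.

```python
def aperture_diameter_mm(focal_length_mm, f_number):
    """f-number is defined as N = f / D, so the aperture diameter is D = f / N."""
    return focal_length_mm / f_number

def relative_light(f_number):
    """Light reaching the sensor scales with aperture area, i.e. roughly 1 / N^2."""
    return 1.0 / f_number ** 2

f = 50.0  # mm (assumed example lens)
for N in (2.0, 2.8, 4.0, 5.6, 8.0):
    D = aperture_diameter_mm(f, N)
    print(f"f/{N}: diameter {D:4.1f} mm, light x{relative_light(N):.4f} of f/1.0")
# Each full stop (multiplying N by ~1.4) halves the light and increases depth of field.
```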

Autofocus
- Uses a sensor, a control system, and a motor to focus on a selected point or area.
- Can produce sharp images over large depth variation.
- Another definition: intelligently adjusting the camera lens to maintain focus on an object.
- Two approaches, active and passive:
  - Active: triangulation using an active sensor such as laser, ultrasound, or infrared light.
  - Passive: phase detection (similar to stereo) to find depth, or contrast detection, which uses blur (or the lack of it) to find depth.

Passive Autofocus
Basic technology:
- The camera lens projects an image onto the sensor.
- The AF module passes a portion of the image to the CPU, which processes the contrast information.
- The CPU controls the focus motor to move the lens (see the sketch below).
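A minimal sketch of the contrast-detection idea: score each candidate lens position with an image sharpness measure and drive the motor to the position with the highest score. The synthetic scene, the box-blur defocus model, and the gradient-energy score are illustrative choices, not the slide's algorithm.

```python
import numpy as np

def sharpness(image):
    """Gradient-energy focus measure: sharper images have stronger edges."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.mean(gx ** 2 + gy ** 2))

def simulate_defocus(image, blur_radius):
    """Crude defocus model: repeated box blurs stand in for the lens blur."""
    out = image.astype(float)
    for _ in range(blur_radius):
        out = (np.roll(out, 1, 0) + np.roll(out, -1, 0) +
               np.roll(out, 1, 1) + np.roll(out, -1, 1) + out) / 5.0
    return out

# Synthetic scene: a high-contrast striped pattern, sharp only at lens position 3.
scene = (np.indices((64, 64)).sum(axis=0) // 8 % 2) * 255.0
lens_positions = range(7)
scores = [sharpness(simulate_defocus(scene, abs(p - 3))) for p in lens_positions]
best = int(np.argmax(scores))
print("focus scores:", [round(s, 1) for s in scores])
print("best lens position:", best)  # the CPU would drive the focus motor here
```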

Autofocus
- Found on all high-end cameras, and now on many low-end cameras (webcams) and phones.
- In Android a camera can have fixed focus, autofocus (it focuses once), or continuous autofocus.
- Most sophisticated image processing applications require an in-focus image.
- Require autofocus: QRTag and OCR (including my chess application).
- Do not require autofocus: ARTag, a tag system with much less information; such applications work on a wider variety of devices.

QRTag versus ARTag
- QRTag: lots of information in small regions; can encode an entire URL.
- ARTag: less information in large regions; only 10 bits of encoding.

Basic radiometry
- Image irradiance: the power of the light, per unit area, at each point p of the image plane.
- Scene (surface) radiance: the power of the light, per unit area, ideally emitted by each point p of a surface in 3-D space in a given direction.

Surface Reflectance Model
- A model of the way in which a surface reflects incident light is called a surface reflectance model; there are a number of different types.
- Fix the lighting and the object, then move the camera while looking at a single surface point. The change in appearance of that surface point defines its specularity:
  - A plain sheet of paper is non-specular (no change).
  - A desktop is semi-specular (some change).
  - A mirror is very specular (a great deal of change).

Surface Reflectance for a Lambertian Surface
L = ρ I^T n
- ρ is called the surface albedo and depends on the surface material; I is the illumination vector, n the surface normal, and L the scene radiance (note there is no viewing-direction term d).
- Lambertian model: each surface point appears equally bright from all viewing directions (no term with d). This is a non-specular surface.
- Specular model: this is not true; the surface looks brighter from some viewing directions (mirrors are very specular). These models are much more complex than the Lambertian model (more parameters).
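A small numeric sketch of the Lambertian model L = ρ I^T n; the particular albedo, light vector, and normals are made-up values for illustration.

```python
import numpy as np

def lambertian_radiance(albedo, light, normal):
    """L = rho * I^T n, clamped at zero for surfaces facing away from the light."""
    return albedo * max(0.0, float(np.dot(light, normal)))

rho = 0.8                      # surface albedo (assumed)
I = np.array([0.0, 0.0, 1.0])  # illumination direction/intensity vector (assumed)
for n in ([0.0, 0.0, 1.0], [0.0, 0.707, 0.707], [0.0, 1.0, 0.0]):
    n = np.array(n)
    print(f"normal {n} -> L = {lambertian_radiance(rho, I, n):.3f}")
# Brightness depends only on the angle between n and the light, never on the
# viewing direction d: the defining property of a Lambertian (non-specular) surface.
```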

Human Eye (figure-only slide)

CCD (Charge-Coupled Device) Cameras
- Small solid-state cells convert light energy into electrical charge (the sensing elements are always rectangles and are usually square).
- The image plane acts as a digital memory that can be read row by row by a computer.

Image Digitization
- Sampling: measuring the value of an image at a finite number of points.
- Quantization: representing the measured value at each sampled point by an integer.
- Pixel: picture element, usually in the range [0, 255].

Grayscale Image
(Figure: an m-by-n grid of sample pixel values, e.g. 10, 5, 9, 100.)
- A digital image is represented by an m-by-n integer array E.
- E(i, j), a pixel, is an integer in the range [0, 255].
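A tiny illustration of sampling and quantization: a continuous brightness function is sampled on an m-by-n grid and each sample is quantized to an integer in [0, 255]. The particular brightness function and grid size are arbitrary examples.

```python
import numpy as np

def digitize(brightness, m, n):
    """Sample a continuous brightness function on an m-by-n grid and
    quantize each sample to an 8-bit integer pixel in [0, 255]."""
    i, j = np.meshgrid(np.arange(m), np.arange(n), indexing="ij")
    samples = brightness(i / m, j / n)                              # sampling
    E = np.clip(np.round(samples * 255), 0, 255).astype(np.uint8)   # quantization
    return E

# Arbitrary smooth brightness pattern with values in [0, 1], for illustration.
E = digitize(lambda u, v: 0.5 + 0.5 * np.cos(2 * np.pi * u) * np.sin(2 * np.pi * v),
             m=4, n=4)
print(E)        # a 4-by-4 array of pixels, each an integer in [0, 255]
print(E[1, 2])  # E(i, j): a single pixel value
```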

Color Image
(Figure: three stacked channel planes, B, G, R.)

Geometric Model of the Camera: Perspective Projection
(Figure: a 3-D point P(X, Y, Z) projects to the image point p(x, y); the optical center, principal point, principal axis, and image plane are labelled.)
x = f X / Z,  y = f Y / Z
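A direct implementation of the projection equations x = f X / Z, y = f Y / Z; the focal length and the sample points are illustrative values.

```python
def project(point, f):
    """Perspective projection of a camera-frame point P = (X, Y, Z) onto the image
    plane at distance f from the optical center: x = f X / Z, y = f Y / Z."""
    X, Y, Z = point
    if Z <= 0:
        raise ValueError("Point must be in front of the camera (Z > 0)")
    return (f * X / Z, f * Y / Z)

f = 0.050  # 50 mm focal length (assumed example)
# Two points with the same X, Y but different depths:
for P in [(1.0, 0.5, 2.0), (1.0, 0.5, 4.0)]:
    x, y = project(P, f)
    print(f"P = {P}  ->  p = ({x * 1e3:.1f} mm, {y * 1e3:.1f} mm)")
# The farther point projects closer to the principal point: distant objects look smaller.
```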

Funny things happen

Parallel lines aren't
(Figure by David Forsyth.)

Lengths can't be trusted...
(Figure by David Forsyth, with segments labelled A, B, C.)