Acquisition. Some slides from: Yung-Yu Chuang (DigiVfx) Jan Neumann, Pat Hanrahan, Alexei Efros


Image Acquisition Digital Camera Film

Outline Pinhole camera Lens Lens aberrations Exposure Sensors Noise Sensing color

Camera trial #1 scene film Put a piece of film in front of an object. source: Yung-Yu Chuang

Pinhole camera (scene, barrier, film): Add a barrier to block off most of the rays. It reduces blurring. The pinhole is known as the aperture. The image is inverted.

Modeling projection: The coordinate system. Put the optical center (Center Of Projection) at the origin. Put the image plane (Projection Plane) in front of the COP. The camera looks down the negative z axis (we need this if we want right-handed coordinates).

Modeling projection: Projection equations. Compute intersection with PP of ray from (x,y,z) to COP. Derived using similar triangles (on board): (x, y, z) maps to (-d x/z, -d y/z, -d). We get the projection by throwing out the last coordinate: (x, y, z) -> (-d x/z, -d y/z).

In Homogeneous Coordinates: Projection is a matrix multiply using homogeneous coordinates: [[1,0,0,0],[0,1,0,0],[0,0,-1/d,0]] (x, y, z, 1)^T = (x, y, -z/d)^T; dividing by the last coordinate gives (-d x/z, -d y/z).
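To make the matrix form concrete, here is a minimal numpy sketch of pinhole projection in homogeneous coordinates (the function name and the value d = 50 are illustrative, not from the slides):

```python
import numpy as np

# Pinhole projection as a matrix multiply in homogeneous coordinates.
# The camera looks down the negative z axis; d is the COP-to-image-plane
# distance, so a world point (x, y, z) maps to (-d*x/z, -d*y/z).
def project(points, d):
    """Project Nx3 world points to Nx2 image points."""
    P = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, -1.0 / d, 0.0]])      # third row gives w = -z/d
    pts_h = np.hstack([points, np.ones((len(points), 1))])  # lift to 4D
    proj = (P @ pts_h.T).T
    return proj[:, :2] / proj[:, 2:3]               # perspective divide

# A point at (1, 2, -10) with d = 50 projects to (5, 10)
print(project(np.array([[1.0, 2.0, -10.0]]), 50.0))
```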

Projection

Projection http://users.skynet.be/j.beever/pave.htm

Shrinking the Pinhole Aperture: Why not make the aperture as small as possible? Less light gets through, and diffraction effects begin to blur the image.

Shrinking the Pinhole Aperture: Sharpest image is obtained when pinhole diameter d = 2 sqrt(f λ). Example: If f = 50mm, λ = 600nm (red), d = 0.36mm
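As a quick numerical check of the d = 2 sqrt(f λ) rule, a short Python sketch (the function name is illustrative):

```python
from math import sqrt

# Optimal pinhole diameter d = 2*sqrt(f*lambda): a smaller hole increases
# diffraction blur, a larger one increases geometric blur.
def optimal_pinhole_mm(focal_length_mm, wavelength_nm):
    f_m = focal_length_mm * 1e-3            # focal length in meters
    lam_m = wavelength_nm * 1e-9            # wavelength in meters
    return 2.0 * sqrt(f_m * lam_m) * 1e3    # diameter back in millimeters

# Slide's example: f = 50 mm, lambda = 600 nm (red) gives ~0.35 mm,
# in line with the slide's rounded 0.36 mm.
print(optimal_pinhole_mm(50, 600))
```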

High-end commercial pinhole cameras ~$200

Pinhole Images Exposure 4 seconds Exposure 96 minutes Images copyright 2000 Zero Image Co.

Outline Pinhole camera Lens Lens aberrations Exposure Sensors Noise Sensing color

Adding a lens scene film

Adding a lens (scene, lens, film): A lens focuses light onto the film. There is a specific distance at which objects are in focus. Other points project to a circle of confusion in the image.

(Thin) Lens: Thin lens equation: 1/o + 1/i = 1/f. Any object point satisfying this equation is in focus.

Circle of Confusion: aperture diameter d, in-focus conjugates (o, i), defocused conjugates (o', i'). Blur circle diameter b, derived using similar triangles: b = (d / i') (i' - i).
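Putting the thin-lens equation and the blur-circle formula together, a small sketch (the numeric values are illustrative, not from the slides):

```python
# Thin-lens focusing and the blur circle, following the slides:
# 1/o + 1/i = 1/f gives the image distance i for an object at distance o;
# a point that focuses at i' while the sensor sits at i blurs to a circle
# of diameter b = (d / i') * |i' - i| for aperture diameter d.
def image_distance(o, f):
    """Image distance for object distance o and focal length f (thin lens)."""
    return 1.0 / (1.0 / f - 1.0 / o)

def blur_circle(d, i_focus, i_sensor):
    """Blur-circle diameter from similar triangles."""
    return d / i_focus * abs(i_focus - i_sensor)

f = 50.0                                  # focal length in mm
i_sensor = image_distance(2000.0, f)      # sensor placed to focus 2 m away
i_near = image_distance(1000.0, f)        # a point 1 m away focuses here
b = blur_circle(25.0, i_near, i_sensor)   # d = 25 mm, i.e. f/2
print(b)                                  # roughly 0.64 mm blur on the sensor
```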

Aperture controls Depth of Field (f/5.6 vs. f/32): Changing the aperture affects depth of field. Smaller aperture: better DOF, but increased exposure time is required.

Depth of Field http://www.cambridgeincolour.com/tutorials/depth-of-field.htm

Thick Lens: Corrects aberrations. Changes zoom.

Field of View (Zoom)

FOV depends on Focal Length: tan(φ/2) = (d/2) / f, so φ = 2 arctan(d / 2f), where d = image (sensor) size. Smaller FOV corresponds to larger focal length.
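The FOV relation is easy to evaluate; a short sketch (the 36 mm sensor width is an illustrative full-frame value, not from the slides):

```python
from math import atan, degrees

# Field of view phi = 2 * arctan(d / (2 f)) for sensor size d and focal length f.
def fov_degrees(sensor_mm, focal_mm):
    return degrees(2.0 * atan(sensor_mm / (2.0 * focal_mm)))

# Full-frame sensor (36 mm wide): a 50 mm lens gives ~39.6 degrees,
# a 200 mm lens narrows that to ~10.3 degrees. Smaller FOV, larger focal length.
print(fov_degrees(36, 50))
print(fov_degrees(36, 200))
```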

FOV depends on Focal Length: For closer objects, if the focal length is larger but the image distance and image size remain unchanged, the objects in focus are more distant.

Simplified Zoom Lens in Operation From wikipedia

Outline Pinhole camera Lens Lens aberrations Exposure Sensors Noise Sensing color

Radial Distortion: No distortion, Pincushion, Barrel. Radial distortion of the image is caused by imperfect lenses. Deviations are most noticeable for rays that pass through the edge of the lens.

Correcting radial distortion from Helmut Dersch

Radial Distortions: No Distortion, Barrel Distortion, Pincushion Distortion. Radial distance from image center: r_u = r_d + k_1 r_d^3, where r_u = undistorted radius and r_d = distorted radius.

Correcting Radial Distortions Before After http://www.grasshopperonline.com/barrel_distortion_correction_software.html
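Applying the single-term distortion model r_u = r_d + k_1 r_d^3 in code (the image center, the k1 value, and the sample point are illustrative):

```python
import numpy as np

# Undistort pixel coordinates with the slides' single-term radial model:
# r_u = r_d + k1 * r_d**3, applied radially about the image center.
def undistort_points(xy, center, k1):
    """xy: Nx2 distorted pixel coords; returns Nx2 undistorted coords."""
    v = xy - center                          # vectors from the image center
    r_d = np.linalg.norm(v, axis=1, keepdims=True)
    scale = 1.0 + k1 * r_d**2                # r_u / r_d = 1 + k1 * r_d^2
    return center + v * scale

center = np.array([320.0, 240.0])
pts = np.array([[420.0, 240.0]])             # r_d = 100 px right of center
print(undistort_points(pts, center, 1e-6))   # moves outward to (421, 240)
```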

Vignetting photo by Robert Johnes

Vignetting: More light passes through lens L3 for scene point A than for scene point B. Results in spatially non-uniform brightness (in the periphery of the image).

Chromatic Aberration longitudinal chromatic aberration (axial) transverse chromatic aberration (lateral)

Chromatic Aberration longitudinal chromatic aberration (axial) Canon EF 85/1.2 L USM transverse chromatic aberration (lateral) Cosina 3.5-4.5/19-35 @ 20 mm Good lens Carl Zeiss Distagon 2.8/21 http://www.vanwalree.com/optics/chromatic.html

Chromatic Aberration Near Lens Center Near Lens Outer Edge

Spherical aberration Rays parallel to the axis do not converge Outer portions of the lens yield smaller focal lengths

Spherical aberration: Spherical mirrors are free of chromatic aberration but do not focus well; parabolic mirrors do.

Spherical aberration

Astigmatism Different focal length for inclined rays

Astigmatism Change in size and shape of blur patches

Coma: A point off the axis is depicted as a comet-shaped blob.

Lens Glare Stray inter-reflections of light within the optical lens system Happens when very bright sources are present in the scene Reading: http://www.dpreview.com

Outline Pinhole camera Lens Lens aberrations Exposure Sensors Noise Sensing color

Exposure: Two main parameters: aperture (in f-stops) and shutter speed (in fractions of a second).

Shutter

Leaf Shutter: Advantages: uniform illumination, entire frame illuminated at once. Disadvantages: illumination not constant over time, limitations on shutter speed.

Focal Plane Shutter: Advantages: cost effective (one shutter needed for all lenses), can achieve very fast shutter speeds (~1/10000 sec). Disadvantages: may cause time distortion.

Aperture: Aperture is the diameter of the lens opening, usually specified by f-stop, f/d, a fraction of the focal length. f/2.0 on a 50mm lens means that the aperture is 25mm; f/2.0 on a 100mm lens means that the aperture is 50mm. When a change of one f-stop occurs, the light is either doubled or cut in half. Lower f-stop, more light (larger lens opening); higher f-stop, less light (smaller lens opening).
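The f-stop arithmetic above can be sketched directly (function names are illustrative):

```python
# f-stops: the f-number N = f / d, so the aperture diameter is d = f / N.
# One full stop scales N by sqrt(2), which doubles or halves the
# light-gathering area of the opening.
def aperture_diameter(focal_mm, f_number):
    return focal_mm / f_number

def relative_light(f_number_a, f_number_b):
    """How much more light f/a admits than f/b (area ratio)."""
    return (f_number_b / f_number_a) ** 2

print(aperture_diameter(50, 2.0))    # f/2.0 on a 50 mm lens: 25 mm
print(aperture_diameter(100, 2.0))   # f/2.0 on a 100 mm lens: 50 mm
print(relative_light(2.0, 2.8))      # one stop: about 2x the light
```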

Aperture: Series of images taken at constant shutter speed, varying only the aperture.

Shutter speed: Rule of thumb: 1 step in the shutter speed scale corresponds to 1 stop in the aperture scale. Handheld camera: shutter speed = 1/f. Stabilized gear: 2-3 shutter speeds slower. Typical speeds: 1/1000 s, 1/500 s, 1/250 s, 1/125 s, 1/60 s, 1/30 s, 1/15 s, 1/8 s, 1/4 s, 1/2 s, 1 s.

Aperture vs. Shutter, Depth of Field: f/22 (small aperture, low speed) vs. f/4 (large aperture, high speed).

Aperture vs. Shutter, Motion Blur: 1/30 sec @ f/22 (small aperture, low speed) vs. 1/6400 sec @ f/2.5 (large aperture, high speed).

Dynamic Range

Short exposure: Real-world radiance spans a dynamic range of roughly 10^-6 to 10^6; a short exposure maps only the bright end of that range onto pixel values 0 to 255.

Long exposure: With a long exposure, the dark end of the radiance range is mapped onto pixel values 0 to 255 instead.

Varying shutter speeds

HDR High Dynamic Range

Outline Pinhole camera Lens Lens aberrations Exposure Sensors Noise Sensing color

Spatial Sampling When a continuous scene is imaged on the sensor, the continuous image is divided into discrete elements - picture elements (pixels)

Spatial Sampling

Sampling: The density of the sampling determines the separation capability of the resulting image. Image resolution defines the finest details that are still visible in the image. We use a cyclic pattern to test the separation capability of an image.

Sampling Frequency

Sampling Frequency

Nyquist Frequency: Nyquist rule: To observe details at frequency f (wavelength d) one must sample at frequency > 2f (sampling intervals < d/2). The frequency 2f is the Nyquist frequency. Aliasing: If the pattern wavelength is less than twice the sampling interval, erroneous patterns may be produced.

Aliasing - Moiré Patterns
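A minimal numpy demonstration of the Nyquist rule: sampling a 9 Hz sinusoid at 10 Hz produces exactly the same samples as a 1 Hz alias (frequencies chosen for illustration):

```python
import numpy as np

# Aliasing demo: a sinusoid at frequency f must be sampled above 2f (Nyquist).
# Sampling 9 Hz at only 10 Hz makes it indistinguishable from a 1 Hz signal.
fs = 10.0                                # sampling frequency, Hz
t = np.arange(0, 1, 1 / fs)              # sample instants over one second
high = np.sin(2 * np.pi * 9 * t)         # 9 Hz: above Nyquist (5 Hz)
alias = np.sin(2 * np.pi * -1 * t)       # the alias at 9 - 10 = -1 Hz
print(np.allclose(high, alias))          # True: the samples coincide
```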

Quantization

Digitizers (Quantization)

Image Sensors CCD Charge Coupled Device CMOS Complementary Metal Oxide Semiconductor

MOS (Metal Oxide Semiconductor) Photosensitive element Charge acquired depends on the number of photons which reach the element CCD devices are arrays of this basic element

Photoelectric Effect: A photon with enough energy (above the 1.26 eV band gap) lifts an electron from the valence band to the conduction band, leaving a hole. Thermally generated electrons are indistinguishable from photo-generated electrons: dark current.

Quantum Efficiency: Not every photon hitting a pixel creates a free electron. Quantum Efficiency (QE) = electrons collected / photons hitting the pixel. QE depends heavily on the wavelength (separate curves for blue, green and red). QE < 100% degrades the SNR of a camera: SNR_e = sqrt(QE) SNR_p. Typical max QE values: 25% (CMOS), 60% (CCD).

CCD (Charge Coupled Device) Boyle and Smith, 1969, Bell Labs Converts light into electrical signal (pixels)

CCD Readout (Bucket Brigade): integration, charge shift and read-out, charge amplifier.

CMOS (Complementary Metal-Oxide Semiconductor) Each pixel owns its own charge-voltage conversion No need for external shutter (electronic shutter) The chip outputs digital bits Much faster than CCD devices

CCD vs. CMOS. CCD: mature technology, specific technology, high production cost, high power consumption, higher fill factor, blooming, sequential readout. CMOS: recent technology, standard IC technology, cheap, low power, less sensitive, per-pixel amplification, random pixel access, smart pixels, on-chip integration with other components.

Sensor Parameters: Fill factor: the area in the sensor that is truly sensitive to light; shift registers and other structures can reduce it by up to 30%. Well capacity: the quantity of charge that can be stored in each pixel; closely related to pixel dimensions. Integration time: exposure time required to excite the CCD elements; depends on the scene brightness. Acquisition time: time needed to transfer the information gathered by the CCD; depends on the number of pixels in the sensor.

Fill Factor: The ratio between the light-sensitive pixel area and the total pixel area. Example: total pixel area 5µm x 5µm; the photo-sensing area yields a fill factor of 40%.

Outline Pinhole camera Lens Lens aberrations Exposure Sensors Noise Sensing color

Sensor noise Noise Sources Photon noise / Shot Noise (Poisson) Dark Noise (Constant) Thermal noise (Poisson) Resetting (fixed) Read-out noise Blooming (After T. Lomheim, The Aerospace Corporation) http://www.stw.tu-ilmenau.de/~ff/beruf_cc/cmos/cmos_noise.pdf

Photon Shot Noise: Light is quantum in nature. Noise is due to the statistics of the detected photons themselves. The probability distribution for N photons to be counted in an observation time T is Poisson: P(N | F, T) = (FT)^N e^(-FT) / N!, where F = fixed average flux (photons/sec).

Poisson Distribution, FT = 5. The standard deviation of a Poisson distribution equals the square root of its mean: σ_shot = sqrt(FT) = sqrt(N) photons, for mean count N.

Poisson Distribution, FT = 10

Poisson Distribution, FT = 20

Poisson Distribution, FT = 50. As FT grows, the Poisson distribution approaches a Gaussian distribution. Signal-to-noise ratio (SNR) increases with the mean: SNR = N^2 / σ^2 = N^2 / N = N.
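The Poisson statistics can be verified numerically (the mean count and sample size are illustrative):

```python
import numpy as np

# Photon shot noise: counts are Poisson distributed, so the standard
# deviation is sqrt(mean) and SNR improves as the mean count grows.
rng = np.random.default_rng(0)
mean_photons = 400
counts = rng.poisson(mean_photons, size=100_000)   # simulated pixel reads
print(counts.mean())                 # close to 400
print(counts.std())                  # close to 20 = sqrt(400)
print(counts.mean() / counts.std())  # SNR close to sqrt(400) = 20
```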

Photon Noise: More noise in bright parts of the image. You can identify the white and black regions from the noise image.

Photon Noise: More noticeable in dark images, where low photon counts mean low SNR.

Dark Current Noise: Electrons are emitted even when no light arrives. Dark current noise is high for long exposures. To remove (some) of it: calibrate the camera (make the response linear), capture the image of the scene as usual, cover the lens with the lens cap and take another picture, then subtract the second image from the first.

Dark Current Noise Original image + Dark Current Noise Image with lens cap on Result of subtraction Copyright Timo Autiokari,
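The dark-frame procedure above can be sketched with synthetic data (a noise-free toy example; real frames would keep their shot and read noise):

```python
import numpy as np

# Dark-frame subtraction, as on the slide: shoot the scene, then shoot with
# the lens cap on, and subtract. The fixed dark-current pattern cancels.
rng = np.random.default_rng(1)
scene = rng.uniform(50, 200, size=(4, 4))        # hypothetical true signal
dark = rng.uniform(5, 15, size=(4, 4))           # fixed dark-current pattern
exposed = scene + dark                           # image of the scene
dark_frame = dark.copy()                         # lens-cap exposure
corrected = np.clip(exposed - dark_frame, 0, None)
print(np.allclose(corrected, scene))             # True in this noise-free sketch
```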

Sensor noise: The ideal relationship between electrons and impinging photons is linear, up to the CCD capacity limit. Plotted: light signal (QE = 50) and photon noise (QE = 50).

Outline Pinhole camera Lens Lens aberrations Exposure Sensors Noise Sensing color

Sensing Color: light through a beam splitter onto 3 CCDs, Bayer pattern, Foveon X3™.

Multi-Chip wavelength dependent

Field Sequential

Color Filter Array (CFA) Fuji Corporation

Color filter array Bayer pattern Color filter arrays (CFAs)/color filter mosaics

Bayer's pattern

Demosaicking CFAs
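A minimal demosaicking sketch for one channel, assuming an RGGB Bayer layout (the interpolation scheme, simple neighbor averaging, is one common choice, not necessarily any particular camera's):

```python
import numpy as np

# Demosaicking sketch for the green channel of an RGGB Bayer mosaic:
# where green was not sampled, average the available green neighbors.
def interp_green(mosaic):
    h, w = mosaic.shape
    green = np.zeros((h, w))
    for y in range(h):
        for x in range(w):
            if (y + x) % 2 == 1:             # green sites in an RGGB layout
                green[y, x] = mosaic[y, x]
            else:                            # red/blue site: average neighbors
                nbrs = [mosaic[yy, xx]
                        for yy, xx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                        if 0 <= yy < h and 0 <= xx < w]
                green[y, x] = sum(nbrs) / len(nbrs)
    return green

flat = np.full((4, 4), 100.0)        # a flat gray scene mosaics to flat green
print(interp_green(flat))            # every entry is 100.0
```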

Color filter array red green blue output

X3 technology red green blue output

Foveon X3 sensor Bayer CFA X3 sensor

Cameras with X3 Sigma SD14 Polaroid X530 Hanvision HVDUO 5M/10M Out of production

Color processing After color values are recorded, more color processing usually happens: White balance Non-linearity to approximate film response or match TV monitor gamma

White Balance warmer +3 automatic white balance
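One simple automatic white-balance heuristic is the gray-world assumption: scale each channel so its mean matches the overall mean (an illustrative method, not necessarily what the camera in the slide uses):

```python
import numpy as np

# Gray-world white balance: assume the scene averages to gray, and scale
# each channel so its mean matches the overall mean.
def gray_world(img):
    """img: HxWx3 float array; returns a white-balanced copy."""
    means = img.reshape(-1, 3).mean(axis=0)       # per-channel means
    gains = means.mean() / means                  # e.g. boost a weak channel
    return img * gains

warm = np.ones((2, 2, 3)) * np.array([1.2, 1.0, 0.8])   # reddish cast
balanced = gray_world(warm)
print(balanced.reshape(-1, 3).mean(axis=0))       # all channel means equal 1.0
```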

Gamma Correction: Gamma correction applied by the converter redistributes the pixel luminance values so that the limited brightness range captured by the sensor is mapped to match our eye's sensitivity. Gamma = 2.2 is a good match for distributing relative brightness in a print or on a video display.

Gamma =1 vs. Gamma = 2.2
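A small sketch of gamma encoding and decoding with gamma = 2.2 (function names are illustrative):

```python
import numpy as np

# Gamma correction: encode linear sensor values v in [0, 1] as v**(1/gamma)
# so that a display applying ~gamma = 2.2 reproduces the scene, and the
# quantized levels are spread to match our eye's sensitivity to dark tones.
def gamma_encode(v, gamma=2.2):
    return np.asarray(v) ** (1.0 / gamma)

def gamma_decode(v, gamma=2.2):
    return np.asarray(v) ** gamma

linear = np.array([0.0, 0.18, 0.5, 1.0])     # 0.18 is roughly middle gray
encoded = gamma_encode(linear)
print(encoded)                                # middle gray encodes near 0.46
print(np.allclose(gamma_decode(encoded), linear))  # round-trips: True
```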

Space of response curves

Outline Pinhole camera Lens Lens aberrations Exposure Sensors Noise Sensing color Summary

Camera pipeline

Sensor Response The response of a sensor is proportional to the radiance and the throughput

Measurement Equation: Scene radiance L(x,ω,t,λ), optics T(x,ω,λ), pixel response P(x,λ), shutter S(x,ω,t). The measured pixel value integrates their product over direction, time and wavelength: M(x) = ∫∫∫ L(x,ω,t,λ) T(x,ω,λ) P(x,λ) S(x,ω,t) dω dt dλ.

Degradation Due To Sampling Sampling in space Pixels Sampling in intensity Quantization Sampling in color Color Filter Array (CFA) Sampling in time Exposure Sampling in frequency Lens and pixel PSF (point-spread-function)