Image Formation and Capture


Image Formation and Capture COS 429: Computer Vision Figure credits: B. Curless, E. Hecht, W.J. Smith, B.K.P. Horn, A. Theuwissen, and J. Malik

Image Formation and Capture Real world Optics Sensor Devices Sources of Error

Optics Pinhole camera Lenses Focus, aperture, distortion

Pinhole Camera Camera obscura ("dark room"), known since antiquity

Pinhole Camera Each point on the image plane is illuminated by light from one direction [Figure: object, pinhole, image plane] Joseph Nicéphore Niépce: first recording, onto a pewter plate coated with bitumen

Perspective Projection Phenomena

Straight Lines Remain Straight

Parallel Lines Converge at Vanishing Points

Parallel Lines Converge at Vanishing Points Each family of parallel lines has its own vanishing point

Nearer Objects Appear Bigger size in image ~ 1/distance

Pinhole Camera Limitations Aperture too big: blurry image Aperture too small: requires long exposure or high intensity Aperture much too small: diffraction through the pinhole, again a blurry image

Lenses Focus a bundle of rays from a scene point onto a single point on the imager Result: can make clear images with a bigger aperture But only one distance is in focus

Ideal Thin Lens Law Relationship between focal distance and focal length of lens: 1/d o + 1/d i = 1/f
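
The thin-lens law above can be checked numerically. A minimal sketch (the function name and the 50 mm / 2 m example values are mine, for illustration):

```python
def image_distance(d_o, f):
    """Solve the thin-lens law 1/d_o + 1/d_i = 1/f for d_i."""
    return 1.0 / (1.0 / f - 1.0 / d_o)

# Illustrative values: a 50 mm lens focused on an object 2 m away
# (all distances in mm) needs the imager slightly beyond f:
d_i = image_distance(2000.0, 50.0)   # about 51.28 mm
```

As d_o grows toward infinity, d_i approaches f, which is why distant scenes focus at the focal plane.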


Focus and Depth of Field For a given d_i, perfect focus at only one d_o In practice, OK for some range of depths: circle of confusion smaller than a pixel Better depth of field with smaller apertures: better approximation to a pinhole camera Also better depth of field with wide-angle lenses
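
The depth-of-field behavior can be sketched with the thin-lens model: the circle-of-confusion diameter follows from similar triangles between the aperture and the point where the object actually focuses. A minimal sketch (function name and example values are mine, assuming a thin lens):

```python
def circle_of_confusion(f, aperture, d_focus, d_obj):
    """Blur-circle diameter on the sensor for a thin lens of focal
    length f and aperture diameter `aperture`, focused at d_focus,
    imaging an object at d_obj (all in the same units)."""
    d_i_focus = 1.0 / (1.0 / f - 1.0 / d_focus)  # sensor position
    d_i_obj = 1.0 / (1.0 / f - 1.0 / d_obj)      # where the object focuses
    # Similar triangles: the light cone has base `aperture` at the lens
    # and converges at d_i_obj; measure its width at the sensor plane.
    return aperture * abs(d_i_focus - d_i_obj) / d_i_obj
```

Note the blur is proportional to the aperture diameter, which is exactly the slide's claim that smaller apertures give better depth of field.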


Aperture Controls the amount of light Affects depth of field Affects distortion (the thin-lens approximation is better near the center of the lens; stay tuned) Examples: f/1.4, f/5.6, f/16

Aperture Aperture typically given as an f-number What is f/4? The aperture diameter is ¼ of the focal length One f-stop equals a change of f-number by a factor of √2 Equals a change in aperture area by a factor of 2 Equals a change in the amount of light by a factor of 2 Example: f/2 → f/2.8 → f/4 (each stop halves the light)

Camera Adjustments Focus? Changes d_i Iris? Changes aperture Zoom? Changes f and sometimes d_i

Zoom Lenses Varifocal

Zoom Lenses Parfocal

Field of View Q: What does field of view of camera depend on? Focal length of lens Size of imager Object distance?

Computing Field of View
1/d_o + 1/d_i = 1/f
tan(θ/2) = ½ x_o / d_o
x_o / d_o = x_i / d_i
⇒ θ = 2 tan⁻¹(½ x_i (1/f − 1/d_o))
Since typically d_o >> f: θ ≈ 2 tan⁻¹(½ x_i / f) ≈ x_i / f
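
The field-of-view derivation can be sketched directly (function name and the 36 mm / 50 mm example are mine; the exact form reduces to the d_o >> f approximation when the object distance is infinite):

```python
import math

def field_of_view(x_i, f, d_o=float('inf')):
    """Angular field of view (radians) for imager extent x_i and focal
    length f; d_o is the object distance (exact thin-lens form)."""
    return 2.0 * math.atan(0.5 * x_i * (1.0 / f - 1.0 / d_o))

# Illustrative: a 36 mm-wide imager behind a 50 mm lens sees
# roughly a 40-degree horizontal field of view.
fov_deg = math.degrees(field_of_view(36.0, 50.0))
```

Focusing on a nearby object moves the imager back (d_i grows), which slightly narrows the field of view, as the exact formula shows.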

Photoreceptors Human retina Vidicon CCD and CMOS imagers

Photoreceptors in Human Retina Two types of receptors: rods and cones Rods and cones Cones in fovea (central part of retina)

Rods and Cones Rods More sensitive in low light: scotopic vision More dense near the periphery Cones Only function at higher light levels: photopic vision Densely packed at the center of the eye: the fovea Different types of cones → color vision

Color Perception Spectral-response functions of the three types of cones (S, M, L), including absorption due to the cornea and lens [FvDFH]

Tristimulus Color Any distribution of light can be summarized by its effect on the 3 types of cones Therefore, human perception of color is a 3-dimensional space Metamerism: different spectra, same response Color blindness: fewer than 3 distinct types of cones Most commonly, the L cone's response matches the M cone's

Electronic Photoreceptors Analog technologies: coated plates, film Digital technologies: Vidicon, CCD, CMOS imagers Produce a regular grid of pixels Measure light power integrated over some time period, over some area on the image plane

Vidicon Best known of a family of photoconductive video cameras Basically television in reverse [Figure: lens system, photoconductive plate, scanning electron beam, electron gun]

MOS Capacitors MOS = Metal Oxide Semiconductor [Figure: gate (wire), SiO₂ (insulator), p-type silicon]

MOS Capacitors Voltage applied to the gate repels the positive holes in the semiconductor, leaving a depletion region (an "electron bucket")

MOS Capacitors A photon striking the material creates an electron-hole pair; the electron collects in the depletion region (the "electron bucket")

Charge Transfer CCDs (Charge-Coupled Devices) move charge from one bucket to another by manipulating voltages

CMOS Imagers Chips can now be manufactured that combine photosensitive elements and processing elements Benefits: Partial readout Signal processing Eliminates some supporting chips → low cost

Color Cameras CCD sensitivity does not match human eye Use band-pass color filters to adapt

3-Chip Color Cameras Use prisms and filters to split image across 3 sensors Expensive, hard to align

1-Chip Color Cameras Bayer grid Estimate missing components from neighboring values (demosaicing) Why more green? Green dominates the human luminance sensitivity function [Seitz]

Errors in Digital Images What are some sources of error in this image?

Sources of Error Geometric (focus, distortion) Color (1-chip artifacts, chromatic aberration) Radiometric (cosine falloff, vignetting) Bright areas (flare, bloom, clamping) Signal processing (gamma, compression) Noise

Monochromatic Aberrations Real lenses do not follow the thin-lens approximation because their surfaces are spherical (a manufacturing constraint) Result: the thin-lens approximation is valid only where sin φ ≈ φ

Spherical Aberration Results in blurring of the image; the focus shifts when the aperture is stopped down Can vary with the orientation of the lens

Distortion Pincushion or barrel radial distortion Straight lines in the world are no longer straight in the image

Distortion Varies with placement of the aperture

First-Order Radial Distortion Goal: a mathematical formula for distortion If small, distortion can be approximated by a first-order formula (like a Taylor series expansion): r′ = r (1 + κ r²) r = ideal distance to the center of the image, r′ = distorted distance to the center of the image Higher-order models are possible
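
The first-order model is straightforward to apply to image coordinates measured relative to the image center. A minimal sketch (function name is mine; the sign convention, κ < 0 for barrel and κ > 0 for pincushion, is one common choice):

```python
def distort(x, y, kappa):
    """First-order radial distortion r' = r (1 + kappa * r^2),
    applied to coordinates (x, y) relative to the image center."""
    r2 = x * x + y * y          # r^2, so no square root is needed
    s = 1.0 + kappa * r2        # radial scale factor
    return x * s, y * s

# The center is unmoved; points farther out move more, which is why
# straight lines not through the center appear curved.
```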

Chromatic Aberration Due to dispersion in glass (the focal length varies with the wavelength of light) Result: color fringes Worst at the edges of the image Corrected by building lens systems with multiple kinds of glass

Correcting for Aberrations High-quality compound lenses use multiple lens elements to cancel out distortion and aberration Often 5-10 elements, more for zooms

Other Limitations of Lenses Optical vignetting: less power per unit area for light arriving at an oblique angle φ Approximate falloff ~ cos⁴ φ Result: darkening toward the edges of the image Also mechanical vignetting, due to multiple apertures
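
The cos⁴ falloff is simple to evaluate; a minimal sketch (function name is mine, assuming the textbook cos⁴ approximation rather than a measured lens profile):

```python
import math

def vignetting_falloff(phi):
    """Approximate relative irradiance for light arriving at
    off-axis angle phi (radians), using the cos^4 law."""
    return math.cos(phi) ** 4

# At 30 degrees off-axis the image already receives only
# (sqrt(3)/2)^4 = 9/16 of the on-axis irradiance.
```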

Other Limitations of Lenses Flare: light reflecting (often multiple times) from glass-air interfaces Results in ghost images or haziness Worse in multi-lens systems Ameliorated by optical coatings (thin-film interference) Bloom: overflow of charge in CCD buckets Spills into adjacent buckets → streaks (usually vertical) next to bright areas Some cameras have anti-bloom circuitry

Flare and Bloom Tanaka

Dynamic Range Most common cameras have 8-bit (per color channel) dynamic range Can be nonlinear: more than 255:1 intensity range Too bright: clamped to the maximum Too dim: clamped to 0 Specialty cameras with higher dynamic range (usually 10, 12, or 16 bits)

High Dynamic Range (HDR) from Ordinary Cameras Take pictures of same scene with different shutter speeds Identify regions clamped to 0 or 255 Average other pixels, scaled by 1 / shutter speed Can extend dynamic range, but limitations of optics and imager (noise, flare, bloom) still apply
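
The merging recipe above can be sketched for a 1-D list of 8-bit pixels (function name is mine; this assumes a linear sensor response, so real pipelines must first undo gamma):

```python
def merge_hdr(exposures):
    """Merge (shutter_time, pixels) pairs into one radiance estimate.
    Pixels clamped to 0 or 255 are excluded; the remaining samples
    are averaged after scaling by 1 / shutter_time."""
    n = len(exposures[0][1])
    radiance = []
    for i in range(n):
        # Keep only unclamped samples of pixel i across exposures.
        samples = [pixels[i] / t for t, pixels in exposures
                   if 0 < pixels[i] < 255]
        radiance.append(sum(samples) / len(samples) if samples else 0.0)
    return radiance
```

A pixel saturated in the long exposure is still recoverable from the short one, and vice versa, which is how the dynamic range extends beyond any single shot.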

Gamma A Vidicon tube naturally produces a signal that varies with light intensity according to a power law: Signal = E^γ, γ ≈ 1/2.5 CRTs (televisions) naturally obey a power law with γ ≈ 2.3 to 2.5 Result: the video-signal standard has a gamma of 1/2.5 CCDs and CMOS are linear, but gamma correction (for a display gamma of about 2.2) is almost always applied
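
The power-law encoding and its inverse can be sketched in two lines (function names are mine; a pure power law is assumed, ignoring the linear toe that standards such as sRGB add near black):

```python
def gamma_encode(e, gamma=1 / 2.2):
    """Encode a linear intensity e in [0, 1] with a power law."""
    return e ** gamma

def gamma_decode(s, gamma=1 / 2.2):
    """Invert the encoding to recover the linear intensity."""
    return s ** (1.0 / gamma)

# Encoding with gamma < 1 brightens midtones so that a display
# with gamma ~2.2 reproduces the original linear intensities.
```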

Consequences for Vision The output of most camera systems is not linear Know what it is! (Sometimes the system "automagically" applies gamma correction) Necessary to correct raw pixel values for: Reflectance measurements Shape from shading Photometric stereo Recognition under variable lighting

Consequences for Vision What about, e.g., edge detection? Often want perceptually significant edges The standard nonlinear signal is close to (the inverse of) the human response Using the nonlinear signal is often the right thing

Noise Thermal noise: present in all electronics Noise at all frequencies Proportional to temperature Special cooled cameras are available for low noise Shot noise: discrete photons / electrons Shows up at extremely low intensities CCDs / CMOS can have high quantum efficiency, approaching 1 electron per photon

Noise 1/f noise: inversely proportional to frequency Amount depends on quality and manufacturing techniques Can be the dominant source of noise All of the above apply to both imager and amplifier

Filtering Noise Most common method: simple blur, e.g., convolution with a Gaussian Adaptive filters to prevent bleeding across intensity edges Other filters for specialized situations, e.g., despeckling (median filters) for dead pixels Next time!
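
Both filters named above are easy to sketch in 1-D (function names are mine; real pipelines operate on 2-D images, but the logic is the same per row and column):

```python
import math

def gaussian_kernel(sigma, radius=None):
    """Sampled, normalized 1-D Gaussian (radius defaults to 3*sigma)."""
    if radius is None:
        radius = int(3 * sigma)
    k = [math.exp(-(i * i) / (2.0 * sigma * sigma))
         for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def convolve(signal, kernel):
    """1-D convolution with edge replication at the borders."""
    r = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - r, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

def median_filter(signal, radius=1):
    """Despeckling median filter: removes isolated dead pixels
    without blurring step edges the way a Gaussian does."""
    out = []
    for i in range(len(signal)):
        lo, hi = max(i - radius, 0), min(i + radius + 1, len(signal))
        window = sorted(signal[lo:hi])
        out.append(window[len(window) // 2])
    return out
```

On a signal with a single stuck pixel, the median filter restores the neighborhood value exactly, while the Gaussian only spreads the error around.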

David Macaulay Great Moments in Architecture Plate XV: Locating the Vanishing Point (June 8, 1874)