Lenses, exposure, and (de)focus


1 Lenses, exposure, and (de)focus. 15-463, 15-663, 15-862, Computational Photography, Fall 2017, Lecture 15.

2 Course announcements. Homework 4 is out. - Due October 26th. - The bilateral filter will take a very long time to run. - Final teams are on the sign-up spreadsheet. - Drop by Yannis's office to pick up cameras any time. Yannis has extra office hours on Wednesday, 2-4 pm. - You can come to ask questions about HW4 (e.g., "how do I use a DSLR camera?"). - You can come to ask questions about the final project. Project ideas are due on Piazza on Friday the 20th.

3 Overview of today's lecture. Motivation for using lenses. The thin lens model. Real lenses and aberrations. Field of view. Lens designations. Exposure control. Lens camera and pinhole camera. Telecentric lenses.

4 Slide credits. Most of these slides were adapted from: Kris Kitani (15-463, Fall 2016) and Frédo Durand (MIT). Some slides borrowed from: Gordon Wetzstein (Stanford).

5 Motivation for using lenses

6 Pinhole camera. Small (ideal) pinhole: 1. Image is sharp. 2. Signal-to-noise ratio is low.

7 Pinhole camera Large pinhole: 1. Image is blurry. 2. Signal-to-noise ratio is high. Can we get best of both worlds?

8 Almost, by using lenses Lenses map bundles of rays from points on the scene to the sensor. How does this mapping work exactly?

9 The thin lens model

11 Thin lens model. Simplification of geometric optics for well-designed lenses. Two assumptions: 1. Rays passing through the lens center are unaffected. 2. Parallel rays converge to a single point located on the focal plane (at focal length f).

14 Tracing rays through a thin lens. Consider an object emitting a bundle of rays. How do they propagate through the lens? (Diagram labels: object distance D, focal length f.)

18 Tracing rays through a thin lens. Consider an object emitting a bundle of rays. How do they propagate through the lens? 1. Trace rays through the lens center. 2. For all other rays: a. trace their parallel through the lens center; b. connect on the focal plane. (Diagram labels: object distance D, focal length f.)

27 Tracing rays through a thin lens. Focusing property: 1. Rays emitted from a point on one side converge to a point on the other side. 2. Bundles emitted from a plane parallel to the lens converge on a common plane.

28-32 Thin lens formula. How can we relate scene-space (D, y) and image-space (D', y') quantities? Use similar triangles. From the ray through the lens center: y'/y = D'/D. (Diagram labels: object height y, image height y', object distance D, focus distance D', focal length f.)
33 Thin lens formula. From the ray parallel to the axis, which crosses the axis at the focal point: y'/y = (D' - f)/f.

34 Thin lens formula. Equating the two expressions, D'/D = (D' - f)/f, gives the thin lens formula: 1/D + 1/D' = 1/f. We call m = y'/y the magnification: m = D'/D = f/(D - f).

37 Special focus distances. From 1/D + 1/D' = 1/f: focusing at D = infinity gives D' = f (infinity focus, parallel rays); focusing at D = 2f gives D' = 2f and m = 1, i.e., the object is reproduced at real-life size.
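The relations on these slides can be sanity-checked numerically. A minimal Python sketch (the function name thin_lens is mine, not from the lecture):

```python
def thin_lens(D, f):
    """Thin lens formula: given object distance D and focal length f
    (same units), return the focus distance D' from 1/D + 1/D' = 1/f,
    and the magnification m = y'/y = f / (D - f)."""
    assert D > f, "object must lie beyond the focal length to form a real image"
    D_prime = 1.0 / (1.0 / f - 1.0 / D)
    m = f / (D - f)  # equivalently m = D_prime / D
    return D_prime, m

# Special case from the slides: D = 2f gives D' = 2f and m = 1
print(thin_lens(100.0, 50.0))  # D' ~ 100, m ~ 1
```

Pushing D toward infinity sends D' to f (the infinity-focus case), and pushing D toward f sends D' to infinity (rays exit parallel).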

38 Are all our problems solved?

40 Are all our problems solved? Objects at one depth are in focus; objects at all other depths are out of focus.

41 Are all our problems solved? Circle of confusion (i.e., blur kernel): objects at one depth are in focus; objects at all other depths are out of focus. Is the circle of confusion constant?

42 Are all our problems solved? Circle of confusion (i.e., blur kernel): objects at one depth are in focus; objects at all other depths are out of focus. How do we change the depth where objects are in focus?

43 Real lenses and aberrations

44 Thin lenses are a fiction. The thin lens model assumes that the lens has no thickness, but this is never true. To make real lenses behave like ideal thin lenses, we have to use combinations of multiple lens elements (compound lenses).

46 Aberrations. Deviations from ideal thin lens behavior (i.e., imperfect focus). Example: chromatic aberration. Glass has dispersion (its refractive index changes with wavelength), so the focal length shifts with wavelength. Using a doublet (a two-element compound lens made of glasses of different refractive index), we can reduce chromatic aberration: one lens cancels out the dispersion of the other.

47 Aberrations. Deviations from ideal thin lens behavior (i.e., imperfect focus). Example: chromatic aberration. Many other types (coma, spherical, astigmatism, ...). Why do we wear glasses?

48 Aberrations. Deviations from ideal thin lens behavior (i.e., imperfect focus). Example: chromatic aberration. Many other types (coma, spherical, astigmatism, ...). Why do we wear glasses? We turn our eye into a compound lens to correct aberrations!

49 Field of view

50 What happens as you take a closer look?

51 Field of view, also known as angle of view, φ.

52 Field of view: change in focus. What happens to field of view when we focus closer?

53 Field of view: change in focus. What happens to field of view when we focus closer? It decreases.

54 Field of view: change in focal length. Note: zooming means changing the focal length, which is different from refocusing. What happens to field of view when we increase focal length?

55 Field of view: change in focal length (move the sensor to keep focus at the same distance). What happens to field of view when we increase focal length? It decreases.

60 Field of view. Increasing the focal length is similar to cropping: f = 25 mm, f = 50 mm, f = 135 mm. Is this effect identical to cropping?
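The f = 25/50/135 mm comparison can be made quantitative with the standard angle-of-view formula φ = 2 arctan(d / 2f), where d is the sensor width. A minimal sketch, assuming a 36 mm wide full-frame sensor and focus at infinity (the function name is mine):

```python
import math

def angle_of_view(f_mm, sensor_mm=36.0):
    """Horizontal angle of view (degrees) for focal length f_mm and a
    sensor of width sensor_mm, focused at infinity:
    phi = 2 * arctan(sensor / (2 * f))."""
    return math.degrees(2.0 * math.atan(sensor_mm / (2.0 * f_mm)))

for f in (25, 50, 135):
    print(f"f = {f:3d} mm -> {angle_of_view(f):5.1f} degrees")
```

This gives roughly 72, 40, and 15 degrees for 25, 50, and 135 mm: the field of view shrinks as the focal length grows, as on the slides.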

61 Perspective distortion. Different focal lengths introduce different perspective distortion at the same magnification: short focal length, mid focal length, long focal length.

62 Field of view also depends on sensor size. What happens to field of view when we reduce sensor size?

63 Field of view also depends on sensor size. What happens to field of view when we reduce sensor size? It decreases.

64 Field of view also depends on sensor size. Full frame corresponds to standard film size. Digital sensors come in smaller formats due to manufacturing limitations (now mostly overcome). Lenses are often described in terms of their field of view on film instead of focal length. These descriptions are invalid when not using a full-frame sensor.

65 Crop factor. How much the field of view is cropped when using a sensor smaller than full frame.
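The crop factor is usually computed as the ratio of sensor diagonals. A sketch of that definition (the APS-C dimensions below are an approximation of a typical sensor, and the function name is mine):

```python
import math

def crop_factor(sensor_w_mm, sensor_h_mm):
    """Crop factor = full-frame diagonal (36 x 24 mm) / sensor diagonal."""
    return math.hypot(36.0, 24.0) / math.hypot(sensor_w_mm, sensor_h_mm)

# An APS-C sensor (approximated here as 23.6 x 15.7 mm) has a crop factor
# of about 1.5, so a 50 mm lens gives the field of view of a ~76 mm lens
# on full frame:
cf = crop_factor(23.6, 15.7)
print(round(cf, 2), round(50 * cf, 1))
```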

66 Lens designations

67 Designation based on field of view. What focal lengths go to what category depends on sensor size. Here we assume a full-frame sensor (same as 35 mm film). Even then, there are no well-defined ranges for each category. Wide-angle: f = 25 mm; mid-range: f = 50 mm; telephoto: f = 135 mm.

68 Wide-angle lenses. Lenses with focal length 35 mm or smaller. They tend to have large, curved front elements.

69 Wide-angle lenses. Ultra-wide lenses can get impractically wide. Fish-eye lens: can produce a (near) hemispherical field of view.

70 Telephoto lenses. Lenses with focal length 85 mm or larger. Technically speaking, "telephoto" refers to a specific lens design, not a focal length range. But that design is mostly useful for long focal lengths, so it has also come to mean any lens with such a focal length. Telephoto lenses can get very big.

71 Telephoto lenses. What is this? What is its focal length?

72 Prime vs zoom lenses. Prime lens: fixed (single) focal length; a focus ring changes the focus distance. Zoom lens: variable focal length within an available range; a focus ring changes the focus distance, and a zoom ring changes the focal length. Why use prime lenses and not always use the more versatile zoom lenses?

73 Prime vs zoom lenses. Why use prime lenses and not always use the more versatile zoom lenses? Zoom lenses have larger aberrations due to the need to cover multiple focal lengths.

74 Other kinds of lens designations Macro lens: can achieve very large magnifications (typically at least 1:1). Macro photography: extremely close-up photography. Achromatic or apochromatic lens: corrected for chromatic aberration. Achromatic: two wavelengths have same focus. Apochromatic (better): three wavelengths have same focus. Aspherical lens: manufactured to have special (non-spherical) shape that reduces aberrations. Expensive, often only 1-2 elements in a compound lens are aspherical.

75 Exposure control

76 Exposure controls the brightness of the image. Its three components: aperture, shutter, ISO.

78 Shutter speed. Controls the length of time that the shutter remains open. (Diagram: incoming light, shutter, sensor; shutter closed.)

79 Shutter speed. Controls the length of time that the shutter remains open. (Diagram: incoming light, shutter, sensor; shutter open.)

81 Shutter speed. Controls the period of time that the shutter remains open. What happens to the image as we increase the shutter speed?

82 Side-effects of shutter speed Moving scene elements appear blurry. How can we simulate decreasing the shutter speed?
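One common answer to the question above: if the per-frame pixel values are linear (RAW-like) and the camera is static, averaging a burst of short exposures approximates a single long exposure. A toy numpy sketch (mine, not from the lecture):

```python
import numpy as np

def synthetic_long_exposure(frames):
    """Average a burst of short-exposure frames to mimic one long
    exposure. Assumes linear (RAW-like) pixel values and a static
    camera; moving objects smear, just as with a slow shutter."""
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    return stack.mean(axis=0)

# Toy example: a bright point moving one pixel per frame gets smeared
frames = [np.zeros((1, 4)) for _ in range(4)]
for t, frame in enumerate(frames):
    frame[0, t] = 1.0
print(synthetic_long_exposure(frames))  # [[0.25 0.25 0.25 0.25]]
```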

83 Motion deblurring. Shan et al., "High-quality Motion Deblurring from a Single Image", SIGGRAPH 2008.

84 Exposure controls brightness of image Aperture Exposure Shutter ISO

85 Aperture. Controls the area of the lens that receives light.

88 Aperture size

90 Circle of confusion. The aperture also controls the size of the circle of confusion for out-of-focus objects. Take off your glasses and squint.

91 Depth of field. The range of depths for which the circle of confusion is acceptable.
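The slides do not derive a formula, but the standard textbook expressions for the near and far limits of acceptable focus, via the hyperfocal distance H = f^2/(N c) + f, can be sketched as follows (f: focal length, N: f-number, c: acceptable circle of confusion; these formulas are conventional, not from the lecture):

```python
def depth_of_field(f, N, c, D):
    """Approximate near/far limits of acceptable focus for focal
    length f, f-number N, acceptable circle of confusion c, and
    focus distance D (all in mm). Standard textbook formulas."""
    H = f * f / (N * c) + f                      # hyperfocal distance
    near = D * (H - f) / (H + D - 2 * f)
    far = D * (H - f) / (H - D) if D < H else float("inf")
    return near, far

# 50 mm lens focused at 3 m, c = 0.03 mm (full frame):
print(depth_of_field(50, 2, 0.03, 3000))   # f/2: narrow depth of field
print(depth_of_field(50, 16, 0.03, 3000))  # f/16: much wider depth of field
```

Stopping down (larger N) shrinks the circle of confusion and widens the depth of field; focusing at or beyond H makes everything out to infinity acceptably sharp.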

94 Depth of field. Shallow depth of field ("bokeh") is often desirable.

95 Depth of field. Shallow depth of field ("bokeh") is often desirable... and not just for campaigning reasons.

97 Depth of field. The form of the bokeh is determined by the shape of the aperture.

98 Lens speed. A "fast" lens is one that has a very large maximum aperture. Fast lenses tend to be bulky and expensive. Leica Noctilux 50 mm f/0.95 (price tag: > $10,000).

99 How can you simulate bokeh?

100 How can you simulate bokeh? Infer per-pixel depth, then blur with a depth-dependent kernel. Example: the Google camera "lens blur" feature. Barron et al., "Fast Bilateral-Space Stereo for Synthetic Defocus", CVPR 2015.
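A toy version of this idea, given a depth map: blur each pixel with a kernel whose radius grows with its distance from the focus plane. This is a crude gather-style sketch for illustration only, nothing like the bilateral-space solver of Barron et al.; all names here are mine:

```python
import numpy as np

def naive_lens_blur(image, depth, focus_depth, strength=2.0):
    """Toy synthetic defocus: blur each pixel of a grayscale image with
    a box kernel whose radius grows with |depth - focus_depth|,
    mimicking a depth-dependent circle of confusion."""
    h, w = image.shape
    out = np.empty((h, w), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            r = int(strength * abs(depth[y, x] - focus_depth))  # CoC radius
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = image[y0:y1, x0:x1].mean()
    return out

img = np.eye(8)                  # toy image
depth = np.full((8, 8), 5.0)     # left half lies on the focus plane...
depth[:, 4:] = 1.0               # ...right half is far from it
res = naive_lens_blur(img, depth, focus_depth=5.0)
print(res[0, 0], res[5, 5])      # in-focus pixel kept, out-of-focus blurred
```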

101 Exposure controls the brightness of the image: aperture, shutter, ISO.

102 The (in-camera) image processing pipeline. The sequence of image processing operations applied by the camera's image signal processor (ISP) to convert a RAW image into a conventional image: analog front-end -> RAW image (mosaiced, linear, 12-bit) -> white balance -> CFA demosaicing -> denoising -> color transforms -> tone reproduction -> compression -> final RGB image (non-linear, 8-bit).

103 Analog front-end. Analog amplifier (gain): gets the voltage into the range needed by the A/D converter; accommodates ISO settings; accounts for vignetting. Analog-to-digital converter (ADC): depending on the sensor, the output has a varying number of bits, most often 12 bits. Look-up table (LUT): corrects non-linearities in the sensor's response function (within proper exposure); corrects defective pixels. (Signal at each stage: analog voltage -> amplifier -> analog voltage -> ADC -> discrete signal -> LUT -> discrete signal.) ISO is an initialism for the International Organization for Standardization.

104 Side-effects of increasing ISO Image becomes very grainy because noise is amplified.

105 Camera modes. Aperture priority ("A"): you set the aperture, the camera sets everything else. Pros: direct depth of field control. Cons: can require an impossible shutter speed (e.g., with f/1.4 for a bright scene). Shutter speed priority ("S"): you set the shutter speed, the camera sets everything else. Pros: direct motion blur control. Cons: can require an impossible aperture (e.g., when requesting a 1/1000 s speed for a dark scene). Automatic ("AUTO"): the camera sets everything. Pros: very fast, requires no experience. Cons: no control. Manual ("M"): you set everything. Pros: full control. Cons: very slow, requires a lot of experience. (Figure: generic camera mode dial.)
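Behind all of these modes is the same trade-off: at a fixed ISO, aperture and shutter settings with the same exposure value EV = log2(N^2 / t) admit the same amount of light. A small sketch of this standard definition (the function name is mine):

```python
import math

def exposure_value(N, t):
    """EV = log2(N^2 / t) for f-number N and shutter time t in seconds.
    Settings with equal EV admit the same amount of light."""
    return math.log2(N * N / t)

# f/2.8 at 1/100 s and f/5.6 at 1/25 s are equivalent exposures:
# two stops less light through the aperture, two stops more time.
print(exposure_value(2.8, 1 / 100))  # ~9.6
print(exposure_value(5.6, 1 / 25))   # ~9.6
```

Each +1 step in EV ("one stop") halves the light, which is why aperture priority must sometimes ask for an impossible shutter speed and vice versa.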

106 Lens camera and pinhole camera

107 The pinhole camera. (Diagram: real-world object, image plane, camera center, focal length f.)

108 The (rearranged) pinhole camera. (Diagram: real-world object, image plane, focal length f, camera center.)

109 The (rearranged) pinhole camera. (Diagram: image plane, camera center, principal axis.) Is this model valid for a camera using a lens?

110 Telecentric lenses

111 Orthographic vs pinhole camera. With an orthographic camera, magnification does not change with depth; with a pinhole camera, magnification changes with depth. What lens do we use for an orthographic camera?
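The contrast on this slide can be written in one line each: pinhole (perspective) projection scales image coordinates by f/z, while orthographic projection ignores depth. A minimal sketch, not from the lecture:

```python
def perspective_project(x, z, f):
    """Pinhole projection: the image coordinate scales as f/z,
    so magnification changes with depth z."""
    return f * x / z

def orthographic_project(x, z):
    """Orthographic (telecentric-style) projection: depth z is
    ignored, so magnification is constant."""
    return x

# The same 1-unit-tall object at depths 2 and 4 (f = 1):
print(perspective_project(1.0, 2.0, 1.0), perspective_project(1.0, 4.0, 1.0))  # 0.5 0.25
print(orthographic_project(1.0, 2.0), orthographic_project(1.0, 4.0))          # 1.0 1.0
```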

112 Telecentric lens. Place a pinhole (aperture stop) at the focal length, so that only rays parallel to the primary ray pass through. (Diagram labels: object distance D, focal length f, focus distance D'.)

113 Regular vs telecentric lens regular lens telecentric lens

114 References. Basic reading: Szeliski textbook, Section. Additional reading: London and Upton, "Photography", Pearson: a great book on photography, discussing in detail many of the issues addressed in this lecture. Ray, "Applied Photographic Optics", Focal Press: another nice book covering everything about photographic optics. Shan et al., "High-quality Motion Deblurring from a Single Image", SIGGRAPH 2008, and Fergus et al., "Removing Camera Shake from a Single Image", SIGGRAPH 2006: two standard papers on motion deblurring for dealing with long shutter speeds. Barron et al., "Fast Bilateral-Space Stereo for Synthetic Defocus", CVPR 2015: the lens blur paper.


9/19/16. A Closer Look. Danae Wolfe. What We ll Cover. Basics of photography & your camera. Technical. Macro & close-up techniques.

A Closer Look Danae Wolfe What We ll Cover Basics of photography & your camera Technical Macro & close-up techniques Creative 1 What is Photography? Photography: the art, science, & practice of creating

Lecture 4: Geometrical Optics 2. Optical Systems. Images and Pupils. Rays. Wavefronts. Aberrations. Outline

Lecture 4: Geometrical Optics 2 Outline 1 Optical Systems 2 Images and Pupils 3 Rays 4 Wavefronts 5 Aberrations Christoph U. Keller, Leiden University, keller@strw.leidenuniv.nl Lecture 4: Geometrical

Chapter 29/30. Wave Fronts and Rays. Refraction of Sound. Dispersion in a Prism. Index of Refraction. Refraction and Lenses

Chapter 29/30 Refraction and Lenses Refraction Refraction the bending of waves as they pass from one medium into another. Caused by a change in the average speed of light. Analogy A car that drives off

CS 443: Imaging and Multimedia Cameras and Lenses

CS 443: Imaging and Multimedia Cameras and Lenses Spring 2008 Ahmed Elgammal Dept of Computer Science Rutgers University Outlines Cameras and lenses! 1 They are formed by the projection of 3D objects.

Image Formation: Camera Model

Image Formation: Camera Model Ruigang Yang COMP 684 Fall 2005, CS684-IBMR Outline Camera Models Pinhole Perspective Projection Affine Projection Camera with Lenses Digital Image Formation The Human Eye

Camera Mechanics & camera function. Daily independent reading:pgs. 1-5 Silently read for 10 min. Note taking led by Mr. Hiller

Camera Mechanics & camera function Daily independent reading:pgs. 1-5 Silently read for 10 min. Note taking led by Mr. Hiller Focused Learning Target: We will be able to identify the various parts of the

Chapter 25 Optical Instruments

Chapter 25 Optical Instruments Units of Chapter 25 Cameras, Film, and Digital The Human Eye; Corrective Lenses Magnifying Glass Telescopes Compound Microscope Aberrations of Lenses and Mirrors Limits of

CPSC 425: Computer Vision

1 / 55 CPSC 425: Computer Vision Instructor: Fred Tung ftung@cs.ubc.ca Department of Computer Science University of British Columbia Lecture Notes 2015/2016 Term 2 2 / 55 Menu January 7, 2016 Topics: Image

Projection. Readings. Szeliski 2.1. Wednesday, October 23, 13

Projection Readings Szeliski 2.1 Projection Readings Szeliski 2.1 Müller-Lyer Illusion by Pravin Bhat Müller-Lyer Illusion by Pravin Bhat http://www.michaelbach.de/ot/sze_muelue/index.html Müller-Lyer

Image formation - Cameras. Grading & Project. About the course. Tentative Schedule. Course Content. Students introduction

About the course Instructors: Haibin Ling (hbling@temple, Wachman 35) Hours Lecture: Tuesda 5:3-8:pm, TTLMAN 43B Office hour: Tuesda 3: - 5:pm, or b appointment Textbook Computer Vision: Models, Learning,

Applied Optics. , Physics Department (Room #36-401) , ,

Applied Optics Professor, Physics Department (Room #36-401) 2290-0923, 019-539-0923, shsong@hanyang.ac.kr Office Hours Mondays 15:00-16:30, Wednesdays 15:00-16:30 TA (Ph.D. student, Room #36-415) 2290-0921,

How do we see the world?

The Camera 1 How do we see the world? Let s design a camera Idea 1: put a piece of film in front of an object Do we get a reasonable image? Credit: Steve Seitz 2 Pinhole camera Idea 2: Add a barrier to

ECEN 4606, UNDERGRADUATE OPTICS LAB Lab 2: Imaging 1 the Telescope Original Version: Prof. McLeod SUMMARY: In this lab you will become familiar with the use of one or more lenses to create images of distant

DSLR Cameras have a wide variety of lenses that can be used.

Chapter 8-Lenses DSLR Cameras have a wide variety of lenses that can be used. The camera lens is very important in making great photographs. It controls what the sensor sees, how much of the scene is included,

Coded Computational Photography!

Coded Computational Photography! EE367/CS448I: Computational Imaging and Display! stanford.edu/class/ee367! Lecture 9! Gordon Wetzstein! Stanford University! Coded Computational Photography - Overview!!

Overview. Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image

Camera & Color Overview Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image Book: Hartley 6.1, Szeliski 2.1.5, 2.2, 2.3 The trip

Aperture and Digi scoping. Thoughts on the value of the aperture of a scope digital camera combination.

Aperture and Digi scoping. Thoughts on the value of the aperture of a scope digital camera combination. Before entering the heart of the matter, let s do a few reminders. 1. Entrance pupil. It is the image

Performance Factors. Technical Assistance. Fundamental Optics

Performance Factors After paraxial formulas have been used to select values for component focal length(s) and diameter(s), the final step is to select actual lenses. As in any engineering problem, this

OPTICAL SYSTEMS OBJECTIVES

101 L7 OPTICAL SYSTEMS OBJECTIVES Aims Your aim here should be to acquire a working knowledge of the basic components of optical systems and understand their purpose, function and limitations in terms

Mastering Y our Your Digital Camera

Mastering Your Digital Camera The Exposure Triangle The ISO setting on your camera defines how sensitive it is to light. Normally ISO 100 is the least sensitive setting on your camera and as the ISO numbers

Computational Cameras. Rahul Raguram COMP

Computational Cameras Rahul Raguram COMP 790-090 What is a computational camera? Camera optics Camera sensor 3D scene Traditional camera Final image Modified optics Camera sensor Image Compute 3D scene

Cameras. Outline. Pinhole camera. Camera trial #1. Pinhole camera Film camera Digital camera Video camera

Outline Cameras Pinhole camera Film camera Digital camera Video camera Digital Visual Effects, Spring 2007 Yung-Yu Chuang 2007/3/6 with slides by Fredo Durand, Brian Curless, Steve Seitz and Alexei Efros

Introduction. Related Work

Introduction Depth of field is a natural phenomenon when it comes to both sight and photography. The basic ray tracing camera model is insufficient at representing this essential visual element and will

IMAGE FORMATION. Light source properties. Sensor characteristics Surface. Surface reflectance properties. Optics

IMAGE FORMATION Light source properties Sensor characteristics Surface Exposure shape Optics Surface reflectance properties ANALOG IMAGES An image can be understood as a 2D light intensity function f(x,y)

Topic 6 - Optics Depth of Field and Circle Of Confusion

Topic 6 - Optics Depth of Field and Circle Of Confusion Learning Outcomes In this lesson, we will learn all about depth of field and a concept known as the Circle of Confusion. By the end of this lesson,

TAKING GREAT PICTURES. A Modest Introduction

TAKING GREAT PICTURES A Modest Introduction HOW TO CHOOSE THE RIGHT CAMERA EQUIPMENT WE ARE NOW LIVING THROUGH THE GOLDEN AGE OF PHOTOGRAPHY Rapid innovation gives us much better cameras and photo software...

Projection. Projection. Image formation. Müller-Lyer Illusion. Readings. Readings. Let s design a camera. Szeliski 2.1. Szeliski 2.

Projection Projection Readings Szeliski 2.1 Readings Szeliski 2.1 Müller-Lyer Illusion Image formation object film by Pravin Bhat http://www.michaelbach.de/ot/sze_muelue/index.html Let s design a camera

Why learn about photography in this course?

Why learn about photography in this course? Geri's Game: Note the background is blurred. - photography: model of image formation - Many computer graphics methods use existing photographs e.g. texture &

lecture 24 image capture - photography: model of image formation - image blur - camera settings (f-number, shutter speed) - exposure - camera response

lecture 24 image capture - photography: model of image formation - image blur - camera settings (f-number, shutter speed) - exposure - camera response - application: high dynamic range imaging Why learn

Dr F. Cuzzolin 1. September 29, 2015

P00407 Principles of Computer Vision 1 1 Department of Computing and Communication Technologies Oxford Brookes University, UK September 29, 2015 September 29, 2015 1 / 73 Outline of the Lecture 1 2 Basics

GEOMETRICAL OPTICS AND OPTICAL DESIGN

GEOMETRICAL OPTICS AND OPTICAL DESIGN Pantazis Mouroulis Associate Professor Center for Imaging Science Rochester Institute of Technology John Macdonald Senior Lecturer Physics Department University of

Optical Design with Zemax

Optical Design with Zemax Lecture : Correction II 3--9 Herbert Gross Summer term www.iap.uni-jena.de Correction II Preliminary time schedule 6.. Introduction Introduction, Zemax interface, menues, file

Chapter 18 Optical Elements

Chapter 18 Optical Elements GOALS When you have mastered the content of this chapter, you will be able to achieve the following goals: Definitions Define each of the following terms and use it in an operational

To start there are three key properties that you need to understand: ISO (sensitivity)

Some Photo Fundamentals Photography is at once relatively simple and technically confusing at the same time. The camera is basically a black box with a hole in its side camera comes from camera obscura,

Get the Shot! Photography + Instagram Workshop September 21, 2013 BlogPodium. Saturday, 21 September, 13

Get the Shot! Photography + Instagram Workshop September 21, 2013 BlogPodium Part One: Taking your camera off manual Technical details Common problems and how to fix them Practice Ways to make your photos

Lens Design I. Lecture 3: Properties of optical systems II Herbert Gross. Summer term

Lens Design I Lecture 3: Properties of optical systems II 205-04-8 Herbert Gross Summer term 206 www.iap.uni-jena.de 2 Preliminary Schedule 04.04. Basics 2.04. Properties of optical systrems I 3 8.04.

Lens Design I. Lecture 3: Properties of optical systems II Herbert Gross. Summer term

Lens Design I Lecture 3: Properties of optical systems II 207-04-20 Herbert Gross Summer term 207 www.iap.uni-jena.de 2 Preliminary Schedule - Lens Design I 207 06.04. Basics 2 3.04. Properties of optical

CH. 23 Mirrors and Lenses HW# 6, 7, 9, 11, 13, 21, 25, 31, 33, 35

CH. 23 Mirrors and Lenses HW# 6, 7, 9, 11, 13, 21, 25, 31, 33, 35 Mirrors Rays of light reflect off of mirrors, and where the reflected rays either intersect or appear to originate from, will be the location

Macro and Close-up Photography

Photo by Daniel Schwen Macro and Close-up Photography Digital Photography DeCal 2010 Nathan Yan Kellen Freeman Some slides adapted from Zexi Eric Yan What Is Macro Photography? Macro commonly refers to

30 Lenses. Lenses change the paths of light.

Lenses change the paths of light. A light ray bends as it enters glass and bends again as it leaves. Light passing through glass of a certain shape can form an image that appears larger, smaller, closer,

IMAGE SENSOR SOLUTIONS. KAC-96-1/5" Lens Kit. KODAK KAC-96-1/5" Lens Kit. for use with the KODAK CMOS Image Sensors. November 2004 Revision 2

KODAK for use with the KODAK CMOS Image Sensors November 2004 Revision 2 1.1 Introduction Choosing the right lens is a critical aspect of designing an imaging system. Typically the trade off between image

Modeling and Synthesis of Aperture Effects in Cameras

Modeling and Synthesis of Aperture Effects in Cameras Douglas Lanman, Ramesh Raskar, and Gabriel Taubin Computational Aesthetics 2008 20 June, 2008 1 Outline Introduction and Related Work Modeling Vignetting

Intro to Digital SLR and ILC Photography Week 1 The Camera Body

Intro to Digital SLR and ILC Photography Week 1 The Camera Body Instructor: Roger Buchanan Class notes are available at www.thenerdworks.com Course Outline: Week 1 Camera Body; Week 2 Lenses; Week 3 Accessories,