Coded Computational Imaging: Light Fields and Applications


1 Coded Computational Imaging: Light Fields and Applications Ankit Mohan MIT Media Lab

2 Coded Computational Imaging (Agrawal, Veeraraghavan, Narasimhan & Mohan) Schedule:
  Introduction (Srinivasa, 10 mins)
  Assorted Pixels (Srinivasa, 20 mins)
  Coding and Modulation in Cameras (Amit, 45 mins)
  Break (10 min)
  Light Fields and Applications (Ankit, 60 mins)
  Break (10 min)
  Computational Illumination (Srinivasa, 45 mins)
  Future Trends Discussion (Amit, 15 mins)

3 Light Field Basics

4 The Plenoptic Function Figure by Leonard McMillan Q: What is the set of all things that we can ever see? A: The Plenoptic Function [Adelson & Bergen] Let's start with a stationary person and try to parameterize everything that she can see Slide adapted from Michael Cohen and Rick Szeliski

5 Grayscale Snapshot P(θ,φ) is intensity of light seen from a single viewpoint, at a single time, averaged over the wavelengths of the visible spectrum Figure by Leonard McMillan Slide adapted from Michael Cohen and Rick Szeliski

6 Color Snapshot P(θ,φ,λ) is intensity of light seen from a single viewpoint, at a single time, as a function of wavelength Figure by Leonard McMillan Slide adapted from Michael Cohen and Rick Szeliski

7 A Movie P(θ,φ,λ,t) is intensity of light seen from a single viewpoint, over time, as a function of wavelength Figure by Leonard McMillan Slide adapted from Michael Cohen and Rick Szeliski

8 A Holographic Movie P(θ,φ,λ,t,V_X,V_Y,V_Z) is intensity of light seen from ANY viewpoint, over time, as a function of wavelength Figure by Leonard McMillan Slide adapted from Michael Cohen and Rick Szeliski

9 The Plenoptic Function P(θ,φ,λ,t,V_X,V_Y,V_Z) can reconstruct every possible view, at every moment, from every position, at every wavelength Figure by Leonard McMillan Slide adapted from Michael Cohen and Rick Szeliski

10 Ray of Light Let's ignore time and color: 5D = 3D position + 2D direction, P(θ,φ,V_X,V_Y,V_Z) Slide adapted from Michael Cohen and Rick Szeliski

11 Ray of Light in Free Space No occluding objects, so radiance is constant along a ray: 4D = 2D position + 2D direction Slide adapted from Michael Cohen and Rick Szeliski

12 Light Field [Levoy & Hanrahan 1996] Radiance as a function of position and direction: 4D in 3D free space (u, v, s, t), 2D in flatland. Position-angle parameterization (x, θ); two-plane parameterization (u, s)
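
The two parameterizations named on this slide can be sketched in flatland; a minimal sketch, where the plane spacing d and the sample ray are illustrative assumptions, not values from the slide:

```python
import math

# Flatland light field coordinates: the same ray can be stored as
# (position x, angle theta) at one plane, or as its intercepts (u, s)
# with two parallel planes a distance d apart.

def ray_to_two_plane(x, theta, d=1.0):
    """Position-angle (x, theta) -> two-plane (u, s); planes at z = 0 and z = d."""
    u = x                          # intercept with the first plane
    s = x + d * math.tan(theta)    # intercept with the second plane
    return u, s

def two_plane_to_ray(u, s, d=1.0):
    """Two-plane (u, s) -> position-angle (x, theta); exact inverse of the above."""
    return u, math.atan2(s - u, d)

u, s = ray_to_two_plane(0.5, math.radians(30), d=2.0)
x, theta = two_plane_to_ray(u, s, d=2.0)
roundtrip_ok = abs(x - 0.5) < 1e-12 and abs(theta - math.radians(30)) < 1e-12
```

The round trip is lossless, which is why the two parameterizations are interchangeable bookkeeping for the same 2D (flatland) light field.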

13 Light Field Generation

14

15

16 Bokode visible light barcodes in space [UPC Code, QR Code, Data Matrix Code, Shot Code, Microsoft Tag, ...], time [IR remote, Sony ID CAM], and angle [Bokode] + standard camera focused at infinity Bokode: Imperceptible Visual Tags for Camera Based Interaction from a Distance, Ankit Mohan, Grace Woo, Shinsaku Hiura, Quinn Smithwick and Ramesh Raskar, in SIGGRAPH 2009.

17 camera Bokode (angle) sensor

18 camera Bokode (angle) sensor ahh circle of confusion circle of information - Kurt Akeley

19 generate directionally encoded information Bokode f b

20 capture directionally encoded information camera Bokode f b f c

21 infinity-corrected microscope camera Bokode f_b f_c magnification = f_c/f_b (microscope); focus always at infinity
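
A small numeric sketch of the relation on this slide: the Bokode lenslet (focal length f_b) collimates the pattern, and a camera focused at infinity (focal length f_c) images it with magnification f_c/f_b, independent of the camera-to-Bokode distance. Here f_b = 8 mm and the 15 µm feature come from the prototype slide; the 85 mm camera lens is an assumed example:

```python
# Infinity-corrected microscope geometry: magnification depends only on
# the two focal lengths, not on the distance between camera and Bokode.
f_b = 8e-3                       # Bokode lenslet focal length (m), prototype slide
f_c = 85e-3                      # camera lens focal length (m), assumed example

magnification = f_c / f_b        # ~10.6x, distance-independent

feature = 15e-6                  # 15 um pattern feature size (prototype slide)
feature_on_sensor = magnification * feature   # projected feature size (m)
```

This is why a tiny millimetre-scale tag can still deliver readable bits to a distant camera with a large-aperture lens.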

22 camera Bokode f b less distance more of Bokode imaged f c

23 camera Bokode f b f c less distance more of Bokode imaged

24 MIT Media Lab Camera Culture Bokode image depends on camera angle camera Bokode fb

25 Bokode image depends on camera angle camera Bokode fb fc

26 x id=42,x=8,y=5 y id=42 x=7 y=5 camera id=42 x=9 y=5 id=42 x=7 y=6 id=42 x=9 y=6 id=42 x=7 y=7 id=42 x=9 y=7 id=42,x=8,y=7

27 digital angle from Bokode id = (42,10,7)

28 prototype assembled

29 prototype exploded LED: 120° view angle, 1350 mcd; pattern: 15 µm resolution; lenslet: f=8 mm, Φ=3 mm; cost: ~$5

30 capturing Bokodes focus blur (85mm f/1.8; infinity focus) motion blur (50mm f/8; ~2cm motion)

31 street-view tagging

32 capturing Bokodes cell-phone camera close to the Bokode (10,000+ bytes of data)

33

34 traditional AR markers ARToolKit [Kato and Billinghurst 1999] ARTag [Fiala 2005] skew of marker

35 angle estimation robustness

36 wide field of view Bokode via Krill eye compound superposition optics Krill-eye: Superposition Compound Eye for Wide-Angle Imaging via GRIN Lenses, Shinsaku Hiura, Ankit Mohan, Ramesh Raskar, in OMNIVIS 2009.

37 barcode vs. RFID vs. Bokode
                 barcode     RFID              Bokode
encoding         spatial     rf modulation     angular
decoder          camera      dedicated reader  camera
geometry         no          no                yes
physical size    ~ cm        ~ cm              ~ mm
cost             ~ free      ~ $0.05           ~ $0.05 (currently $5)
range            ~ cm        ~ cm              ~ m (with large aperture lens)
line of sight    yes         no                yes

38 tabletop/surface interaction stylus based interaction identity position angle

39 multi-user interaction Bokode laser pointers

40 Bokode next to the eye eye Bokode f e

41 multiple Bokodes next to the eye Bokode A eye Bokode B f e Bokode images overlap

42 relaxed perfect eye focused at infinity Bokode A eye virtual point at infinity A B image points overlap Bokode B f e

43 relaxed eye with myopia Bokode A eye virtual point at infinity A B distinct image points Bokode B f e eye unable to focus at infinity

44 relaxed eye with myopia Bokode A eye virtual point at finite distance A B Bokode B image points overlap f e move points towards each other

45 relaxed eye with hyperopia Bokode A eye virtual point at infinity A B distinct image points Bokode B f e

46 eye with hyperopia Bokode A eye virtual point beyond infinity A B image points overlap Bokode B f e move points away from each other

47 NETRA: interactively measure prescription pinhole or microlens array eye patterns on an LCD f b array of Bokodes f e

48 NETRA: interactively measure prescription pinhole or microlens array eye patterns on an LCD f b array of Bokodes f e user interaction

49 Shack-Hartmann wavefront sensor laser wavefront aberrometer expensive; requires trained professionals

50 interactive self-evaluation of eye s refractive error NETRA: Interactive Display for Estimating Refractive Errors and Focal Range, Vitor Pamplona, Ankit Mohan, Manuel Oliveira, and Ramesh Raskar, in SIGGRAPH 2010.

51 cell phone prototype LCD: 180 dpi; pinhole: a=3 mm, Φ~100 µm; lenslet: f=20 mm, a=3 mm; resolution: 0.71 D; cost: ~$2 (pinhole) controls, display patterns, audio feedback; pinhole or microlens array with spacer

52 NETRA vs. trial lenses / Snellen chart / phoropter: smaller, less bulky, easier to carry; little to no training required; avoids cycloplegic eye drops; allows self-evaluation; cheaper (if phone already exists)

53 interaction session with camera displayed patterns camera/subject view

54 patterns: displayed pattern vs. subject view, for alignment and accommodation

55 astigmatism: radially non-symmetric error; a cross or points may never meet with a 1D search

56 astigmatism: lines reduce the problem to a 1D search

57 jittered pinholes reduce crosstalk jittered pinholes -5D 0D +5D display patterns viewmaster inspired prototype +3D to -5D with accommodation

58 interactive self-evaluation of eye s refractive error DEMO at 5PM TODAY in Workshop on cameras for Visually Impaired (Pacific Concourse) NETRA: Interactive Display for Estimating Refractive Errors and Focal Range, Vitor Pamplona, Ankit Mohan, Manuel Oliveira, and Ramesh Raskar, in SIGGRAPH 2010.

59 Light Field Capture

60 Camera Arrays [Wilburn 2005] High Performance Imaging Using Large Camera Arrays, Bennett Wilburn, Neel Joshi, Vaibhav Vaish, Eino-Ville Talvala, Emilio Antunez, Adam Barth, Andrew Adams, Mark Horowitz, Marc Levoy, in SIGGRAPH 2005.

61 Synthetic aperture photography Camera array is far away from these bushes, yet it sees through them

62 Focus Adjustment: Sum of Bundles [Vaish et al. 2004]
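
The sum-of-bundles idea on this slide can be sketched as shift-and-add refocusing; a toy version in which the array geometry and the 2 px/camera disparity are illustrative assumptions:

```python
import numpy as np

# Shift-and-add refocusing: translate each camera's image in proportion to
# its position in the array, then average the bundle. Rays from the chosen
# focal plane realign and reinforce; everything else blurs out.

def synthetic_aperture_refocus(views, positions, shift_per_unit):
    """shift_per_unit selects the focal plane (0 = focus at infinity)."""
    acc = np.zeros_like(views[0], dtype=float)
    for img, (py, px) in zip(views, positions):
        sy = int(round(py * shift_per_unit))
        sx = int(round(px * shift_per_unit))
        acc += np.roll(img, (sy, sx), axis=(0, 1))
    return acc / len(views)

# Toy 3x3 array: a point on the target plane shows 2 px disparity per camera step.
views, positions = [], []
for py in (-1, 0, 1):
    for px in (-1, 0, 1):
        img = np.zeros((9, 9))
        img[4 - 2 * py, 4 - 2 * px] = 1.0
        views.append(img)
        positions.append((py, px))

refocused = synthetic_aperture_refocus(views, positions, shift_per_unit=2)
unfocused = synthetic_aperture_refocus(views, positions, shift_per_unit=0)
# refocused concentrates all energy at one pixel; unfocused spreads it 9 ways.
```

Matching `shift_per_unit` to a point's disparity is exactly "choosing the focal plane"; occluders in front of that plane get averaged down, which is why the array sees through the bushes.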

63 Light Field Inside a Camera u s Lenslet-based Light Field camera / Integral Photography s u [Lippmann 1908; Adelson and Wang 1992; Ng et al. 2005]

64 Stanford Plenoptic Camera [Ng et al. 2005] Contax medium format camera; Kodak 16-megapixel sensor; Adaptive Optics microlens array; 125 µm square-sided microlenses; pixels ÷ lenses = pixels per lens

65 Captured Image Behind Microlens 4/2/2010 Slide from Ren Ng and Marc Levoy

66 Digital Refocusing [Ng et al 2005] Light Field Photography with a Hand-Held Plenoptic Camera, Ren Ng, Marc Levoy, Mathieu Bredif, Gene Duval, Mark Horowitz, Pat Hanrahan, in Stanford Tech Report 2005.

67 Extended Depth of Field: light field → focal stack → extended DOF

68 Light Field Capture with a Programmable Aperture Nikon D70, liquid crystal array Programmable Aperture Photography: Multiplexed Light Field Acquisition, Chia-Kai Liang, Tai-Hsu Lin, Bing-Yi Wong, Chi Liu, Homer Chen, in SIGGRAPH 2008.

69 Multiplexing to improve SNR 9 aperture patterns for capturing a light field with 3x3 angular resolution; 9 multiplexed aperture patterns; O(N) SNR improvement, comparable to previous single-shot light field cameras; SNR is a function of W (the aperture patterns) and the camera noise characteristics
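
A hedged 1D sketch (not the paper's exact codes) of what multiplexed aperture capture means: instead of opening one aperture block per shot (W = identity), each shot opens several blocks (rows of a 0/1 matrix W), and the per-view images are recovered by inverting W. Opening more blocks collects more light per shot, which is the source of the SNR gain. The 7-block cyclic code below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 7                                    # 7 aperture blocks (illustrative)
# A cyclic 0/1 code built from quadratic residues mod 7; each of the 7 shots
# opens 3 of the 7 blocks, and the resulting circulant matrix W is invertible.
qr = {(i * i) % n for i in range(1, n)}
first_row = np.array([1.0 if i in qr else 0.0 for i in range(n)])
W = np.stack([np.roll(first_row, k) for k in range(n)])

l_true = rng.uniform(0.0, 1.0, n)        # true per-block (angular) intensities
y = W @ l_true                           # one multiplexed measurement per shot
l_rec = np.linalg.solve(W, y)            # demultiplex by inverting W

err = float(np.max(np.abs(l_rec - l_true)))
light_per_shot = W.sum(axis=1)           # 3x the light of a single open block
```

The recovery is exact in the noise-free case; with sensor noise, the extra light per shot is what improves the recovered SNR relative to scanning one block at a time.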

70 Can we capture the Light-Field using a static mask?

71 Mask? Mask in the aperture + sensor: full-resolution digital refocusing (Coded Aperture Camera). Mask near the sensor: 4D light field from a 2D photo (Heterodyne Light Field Camera).

72 mask based light-field camera lens mask sensor IR filter mask digital back camera body

73 Fourier Slice Theorem l(x,θ), 2D FFT → L(f_x,f_θ): a line integral in the spatial domain corresponds to a central slice (1D FFT) in the Fourier domain
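
The theorem as stated on this slide can be checked numerically: integrating a 2D light field l(x, θ) over θ (the line integral recorded by the sensor) and taking the 1D FFT gives exactly the f_θ = 0 slice of the 2D FFT. The random light field below is an arbitrary stand-in:

```python
import numpy as np

rng = np.random.default_rng(1)
L = rng.standard_normal((16, 16))        # toy 2D light field l(x, theta)

projection = L.sum(axis=1)               # integrate over theta: the sensor signal
slice_1d = np.fft.fft(projection)        # 1D FFT of the sensor signal

F = np.fft.fft2(L)                       # full 2D FFT of the light field
central_slice = F[:, 0]                  # the f_theta = 0 (central) slice

match = bool(np.allclose(slice_1d, central_slice))   # Fourier slice theorem
```

This identity is why an in-focus sensor only ever sees one slice of the light field spectrum, which motivates the modulation trick on the following slides.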

74 Two-Plane Parameterization of Light Field Object Main Lens 1D Sensor θ-plane x-plane [Levoy and Hanrahan 1996; Gortler et al. 1996] θ_0 x_0 θ x

75 Optical Heterodyning Radio analogy: incoming signal on a 100 MHz high-frequency carrier; the receiver demodulates with a 99 MHz reference carrier to recover the baseband audio signal. Optical version: object, main lens, mask, sensor; the mask acts as the carrier for the photographic signal (the light field), the sensor records the incident modulated signal, and software demodulation with the reference carrier recovers the light field.

76 How to Capture a 2D Light Field with a 1D Sensor? Band-limited light field (extent f_x0, f_θ0) in Fourier light field space; the sensor captures a slice (Fourier Slice Theorem)

77 Extra sensor bandwidth alone cannot capture the extra dimension of the light field; the sensor slice only spans f_x

78 ???

79 Solution: Modulation Theorem Make spectral copies of light field f θ f θ0 f x0 f x Modulation Function
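
The modulation theorem invoked here can be seen in a 1D flatland sketch: multiplying a band-limited signal by a cosine carrier places shifted copies of its spectrum at plus and minus the carrier frequency. The carrier frequency f0 and test signal are illustrative assumptions, not the paper's mask parameters:

```python
import numpy as np

N = 256
n = np.arange(N)
f0 = 32                                    # carrier ("mask") frequency, assumed
signal = np.cos(2 * np.pi * 3 * n / N)     # band-limited content at bin 3

base_peaks = set(np.flatnonzero(np.abs(np.fft.fft(signal)) > N / 8))
modulated = signal * np.cos(2 * np.pi * f0 * n / N)
mod_peaks = set(np.flatnonzero(np.abs(np.fft.fft(modulated)) > N / 8))
# base_peaks sits at bins {3, 253}; mod_peaks is the same content shifted to
# f0 +/- 3, i.e. bins {29, 35} (and their mirror images {221, 227}).
```

Making spectral copies at different carrier offsets is what lets the sensor slice pick up every angular band of the light field, as the next slide shows.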

80 Sensor Slice captures entire Light Field f θ f θ0 f x0 f x Modulation Function Modulated Light Field

81 Demodulation to recover Light Field 1D Fourier Transform of Sensor Signal f θ f x Rearrange 1D Fourier Transform into 2D

82 Narrowband Cosine Mask Used Mask Tile: period 1/f_0

83 Where to place the Mask? Mask at the sensor vs. mask away from the sensor: each placement gives a different mask modulation function in (f_x, f_θ)

84 Where to place the Mask? Mask Sensor f θ f x Mask Modulation Function

85 Where to place the Mask? Mask Sensor v d Mask Modulation Function α = (d/v)(π/2)

86 MERL, Northwestern Univ. Captured 2D Photo Mask-Enhanced Cameras: Heterodyned Light Fields & Coded Aperture Veeraraghavan, Raskar, Agrawal, Mohan & Tumblin Encoding due to Cosine Mask

87 Computing 4D Light Field: 2D sensor photo (1800×1800) → 2D FFT (1800×1800, containing 9×9 = 81 spectral copies) → rearrange 2D tiles into 4D planes (200×200×9×9) → 4D IFFT → 4D light field (200×200×9×9)
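
The rearrangement step in this pipeline is a pure reshape/transpose; a reduced-size sketch where the slide's sizes (1800 = 9 × 200) are scaled down to 180 = 9 × 20 for speed, and the FFT values are stand-ins:

```python
import numpy as np

# The 2D FFT of the sensor photo holds an n_theta x n_theta grid of spectral
# copies; each n_x x n_x tile is one angular slice of the 4D light field
# spectrum. Cutting the grid into tiles is a reshape plus axis transpose.
n_theta, n_x = 9, 20
sensor_fft = np.arange((n_theta * n_x) ** 2, dtype=float).reshape(
    n_theta * n_x, n_theta * n_x)          # stand-in for the 180x180 2D FFT

# Cut into a 9x9 grid of 20x20 tiles and move the tile indices up front:
tiles = sensor_fft.reshape(n_theta, n_x, n_theta, n_x).transpose(0, 2, 1, 3)
shape4d = tiles.shape                      # (9, 9, 20, 20): the 4D layout

# Spot-check one tile against direct slicing of the big array:
tile_ok = bool(np.array_equal(
    tiles[2, 3], sensor_fft[2 * n_x:3 * n_x, 3 * n_x:4 * n_x]))
```

At full scale the same two lines turn the 1800×1800 spectrum into the 9×9 grid of 200×200 tiles that the 4D IFFT then converts into the light field.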

88 4d light-field capture results Captured Photo Refocusing Changing Views Dappled Photography: Mask Enhanced Cameras for Heterodyned Light Fields and Coded Aperture Refocusing, Ashok Veeraraghavan, Ramesh Raskar, Amit Agrawal, Ankit Mohan, and Jack Tumblin, in SIGGRAPH 2007. Non-refractive modulators for encoding and capturing scene appearance and depth, Ashok Veeraraghavan, Ramesh Raskar, Amit Agrawal, Ankit Mohan, and Jack Tumblin, in IEEE CVPR 2008.

89

90 Which Heterodyne Mask to Use? Conditions for heterodyne light field detection: mask spectrum must be a (windowed) 2D impulse train. Can be achieved (approximately) with a pinhole array m_pinhole(x,y), M_pinhole(f_x,f_y)

91 Which Heterodyne Mask to Use? Conditions for heterodyne light field detection: mask spectrum must be a (windowed) 2D impulse train. Can be achieved exactly by Sum-of-Sinusoids (SoS) [Veeraraghavan et al. 2007] m_SoS(x,y), M_SoS(f_x,f_y)

92 Which Heterodyne Mask to Use? Both pinhole array and SoS are periodic functions. What other tiles lead to impulse trains? m_general(x,y), M_general(f_x,f_y)

93 General Tiled-Broadband Masks Conditions for heterodyne light field detection: (almost) any 2D tile can be used (tiling yields an impulse train); amplitudes of the impulses are given by the Fourier series of the tile m_general(x,y), M_general(f_x,f_y)
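
The tiling property stated here can be verified directly in 1D: repeating (almost) any tile K times gives a spectrum that is an impulse train, nonzero only at multiples of K, with amplitudes set by the DFT (Fourier series) of a single tile. The tile contents and sizes below are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
T, K = 8, 16                               # tile length, number of repeats
tile = rng.uniform(0.0, 1.0, T)            # "(almost) any" tile, here 1D
mask = np.tile(tile, K)                    # the tiled-broadband mask

M = np.fft.fft(mask)
on_train = M[::K]                          # spectrum samples at multiples of K
off_train = np.delete(M, np.arange(0, T * K, K))   # everything in between

impulse_train = bool(np.max(np.abs(off_train)) < 1e-9)       # zeros off-train
amplitudes_match = bool(np.allclose(on_train, K * np.fft.fft(tile)))
```

This is the 1D analogue of the slide's claim: tiling controls *where* the impulses sit, while the tile's own Fourier series controls *how strong* each impulse is, so the tile can be optimized (as with MURA, next slide) without breaking the impulse-train condition.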

94 Specific Choice: Tiled-MURA Conditions for optimal heterodyne light field detection: Modified Uniformly Redundant Array (MURA) m_MURA(x,y), M_MURA(f_x,f_y) Shield Fields: Modeling and Capturing 3D Occluders, Douglas Lanman, Ramesh Raskar, Amit Agrawal, Gabriel Taubin, in SIGGRAPH Asia 2008.

95 Benefits of New Heterodyne Codes Average transmission (%) vs. angular resolution (11x11, 23x23, 43x43) for pinholes, Sum-of-Sinusoids, and tiled-MURA. Benefits and limitations: Sum-of-Sinusoids converges to 18% transmission; tiled-MURA is near 50% (but only for prime lengths); binary vs. continuous-tone process (quantization).
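
A hedged 1D sketch of the transmission bookkeeping behind this chart, not the paper's exact 2D MURA: a pinhole tile opens 1 of p cells, while a MURA-style tile built from quadratic residues modulo a prime p opens (p − 1)/2 of p cells, i.e. just under 50%:

```python
# Compare open fractions (average transmission) of two 1D mask tiles.
p = 23                                     # prime tile length (23x23 in the chart)
qr = {(i * i) % p for i in range(1, p)}    # quadratic residues mod p
mura_1d = [1 if i in qr else 0 for i in range(p)]   # MURA-style binary tile

pinhole_transmission = 1.0 / p             # one open cell: ~4%
mura_transmission = sum(mura_1d) / p       # (p - 1)/2 open cells: ~48%
```

The near-50% figure is why the tiled-MURA masks collect roughly an order of magnitude more light than pinholes at the same angular resolution, and why the construction only works for prime lengths.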

96 Prototype Implementation LED Array Mask + Diffuser Camera Subject Components 8.0 megapixel Canon EOS Digital Rebel XT 6x6 array of Philips Luxeon Rebel LEDs [1.2x1.2 m] 5,080 DPI mask and a paper vellum diffuser [75x55 cm]

97 Tiled-MURA Results: Sensor Image High-Resolution Sensor Image (0.25 sec. Exposure)

98 Tiled-MURA Results: Shadowgrams Light Field Reconstruction

99 Reconstruction Light Field Reconstruction Visual Hull Reconstruction

100 Tiled-MURA Results: Dynamic Scene Components and Limitations 1600x fps Point Grey Grasshopper camera 6x6 array of Philips Luxeon Rebel LEDs [1.2x1.2 m] 5,080 DPI mask and a paper vellum diffuser [75x55 cm] Light Field Reconstruction Individual shadowgrams only 75x50 pixels

101 Bi-Di Screen: Light Field capture with a flat display for User Interaction BiDi Screen: A Thin, Depth-Sensing LCD for 3D Interaction using Light Fields, Matthew Hirsch, Douglas Lanman, Henry Holtzman, Ramesh Raskar, in SIGGRAPH Asia 2009.

102 BiDi Screen Sharp Microelectronics Optical Multi-touch Prototype Display with embedded optical sensors

103 BiDi Screen: Design Overview ~50 cm ~2.5 cm Display with embedded optical sensors Optical sensor array LCD, displaying mask

104 BiDi Screen: Design Vision Spatial Light Modulator Bare Sensor Object Collocated Capture and Display

105 Reinterpretable Imaging Coded Aperture Static Aperture Mask Sensor Optical Heterodyning Static Mask Sensor Veeraraghavan et al. SIGGRAPH 2007 Digital Refocusing Veeraraghavan et al. SIGGRAPH 2007 Light Field Capture

106 Reinterpretable Imager Coded Aperture Static Aperture Mask Sensor Optical Heterodyning Static Mask Sensor Reinterpretable Imager Dynamic Aperture Mask Static Mask Sensor Veeraraghavan et al. SIGGRAPH 2007 Veeraraghavan et al. SIGGRAPH 2007 Agrawal et al. Eurographics 2010 Digital Refocusing Light Field Capture Post-Capture Resolution Control Reinterpretable Imager: Towards Variable Post-Capture Space, Angle and Time Resolution in Photography, Amit Agrawal, Ashok Veeraraghavan, Ramesh Raskar, in Eurographics 2010.

107 Temporally changing mask in Aperture

108 Captured 2D Photo
  static scene parts, in focus: high resolution 2D image
  static scene parts, out of focus: 4D light field
  dynamic scene parts, in focus: video
  dynamic scene parts, out of focus: 1D parallax + motion

109 Captured Photo

110 Video from Single-Shot (Temporal Frames)

111 Reconstructed Sub-Aperture Views (3 by 3 Light Field)

112 Time slices for rotating doll

113 Angle slices for static scene parts

114 Light Field Modulation

115 Agile Spectrum Imaging I A B C x I x 400nm 550nm 700nm λ Arbitrary white 1D signal Agile Spectrum Imaging: Programmable Wavelength Modulation for Cameras and Projectors, Ankit Mohan, Ramesh Raskar and Jack Tumblin, in Eurographics 2008.

116 Pinhole Camera x C A I B B I A Object C Image x Pinhole

117 x C A B B I A Object Lens L 1 C Prism Pinhole

118 Spectral Light-Field Prism A λ x B λ λ C λ x

119 Prism I A λ B λ λ C λ x Screen I x p

120 Prism t = I A λ p B λ λ x C λ x I t

121 I Prism A λ Lens L 2 C B λ B x C λ x A t S

122 C B A

123 Sensor plane (t=t s ) C B λ A x t t S I p

124 Rainbow plane (t=t R ) C p B λ A x I t t R

125 Rainbow plane (t=t R ) C B A λ position t t R

126 Mask in the Rainbow plane C 0 B λ 0 A x t t R t S I p

127 Rainbow plane (t=t_R) Control the spectral sensitivity of the sensor by placing an appropriate grayscale mask in the R-plane. t t_R t_S

128 Lens L 1 Diffraction R-plane Grating Lens L 2 mask Sensor

129 m(λ ) 400nm 550nm 700nm λ

130 m(λ ) 400nm 550nm 700nm λ

131 Pinhole multi-spectral camera Pinhole x Scene C B A θ 2 θ 0 θ 1 x Lens L 1 x A B C Prism λ λ λ Light field camera

132 Mask based multi-spectral camera x Scene C A Lens L 1 x θ x Mask, m(θ) Prism Light field camera

133 Glare separation camera Glare Aware Photography: 4D Ray Sampling for Reducing Glare Effects of Camera Lenses, Ramesh Raskar, Amit Agrawal, Cyrus Wilson and Ashok Veeraraghavan, in SIGGRAPH 2008.

134 Effects of Glare on Image Glare is hard to model and low frequency in 2D, but reflection glare is an outlier in 4D ray-space; glare coherence is used to recover the full-resolution image. Sensor pixels a, b; angular variation at pixel a.

135

136

137 Reducing Glare Conventional Photo After removing outliers Glare Reduced Image

138 Enhancing Glare Conventional Photo Glare Enhanced Image

139 Conclusions Light Field Capture Heterodyne Camera Shield Fields BiDi Screen Reinterpretable Imager Light Field Modulation Spectral Light Fields Glare Camera Light Field Generation Bokode NETRA

140

141 camera barcode (spatial) sensor

142 camera barcode (space) sensor

143 camera barcode (space) sensor image much smaller; refocus if distance changes

144 limitations: overlapping Bokodes; auto-exposure / motion blur; angular range (±20°); thickness (holographic Bokode)

145 retro-reflector for passive Bokode

146 focusing range and refractive errors perfect vision eye ~25mm myopia hyperopia presbyopia infinity ~10cm cornea (~40D) crystalline lens (0~10D)

147 accommodation eye ~25mm infinity ~10cm focusing at infinity cornea (~40D) crystalline lens (~0D)

148 accommodation eye ~25mm infinity ~10cm focusing close to eye cornea (~40D) crystalline lens (~10D)
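
The arithmetic behind these accommodation slides is simple diopter bookkeeping, sketched below: lens power in diopters is the reciprocal of focal distance in metres, so adding ~10 D of crystalline-lens power to the relaxed eye moves the focus from infinity to a near point of about 10 cm, matching the range shown:

```python
# Diopters <-> focus distance: D = 1 / distance_in_metres.
max_accommodation_d = 10.0                  # crystalline lens range (0-10 D, slide)
near_point_m = 1.0 / max_accommodation_d    # nearest focus distance, metres
near_point_cm = near_point_m * 100.0        # ~10 cm, as on the slide
```

The same reciprocal relation is what NETRA exploits in reverse: the shift needed to make the patterns overlap measures the eye's unaccounted-for diopters.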

149 Impact / PerfectSight sending 4 prototypes over the summer MIT IDEAS award to deploy in Mwanda, Malawi local testing


More information

Vision. The eye. Image formation. Eye defects & corrective lenses. Visual acuity. Colour vision. Lecture 3.5

Vision. The eye. Image formation. Eye defects & corrective lenses. Visual acuity. Colour vision. Lecture 3.5 Lecture 3.5 Vision The eye Image formation Eye defects & corrective lenses Visual acuity Colour vision Vision http://www.wired.com/wiredscience/2009/04/schizoillusion/ Perception of light--- eye-brain

More information

Synthetic aperture photography and illumination using arrays of cameras and projectors

Synthetic aperture photography and illumination using arrays of cameras and projectors Synthetic aperture photography and illumination using arrays of cameras and projectors technologies large camera arrays large projector arrays camera projector arrays Outline optical effects synthetic

More information

Camera Simulation. References. Photography, B. London and J. Upton Optics in Photography, R. Kingslake The Camera, The Negative, The Print, A.

Camera Simulation. References. Photography, B. London and J. Upton Optics in Photography, R. Kingslake The Camera, The Negative, The Print, A. Camera Simulation Effect Cause Field of view Film size, focal length Depth of field Aperture, focal length Exposure Film speed, aperture, shutter Motion blur Shutter References Photography, B. London and

More information

Dr F. Cuzzolin 1. September 29, 2015

Dr F. Cuzzolin 1. September 29, 2015 P00407 Principles of Computer Vision 1 1 Department of Computing and Communication Technologies Oxford Brookes University, UK September 29, 2015 September 29, 2015 1 / 73 Outline of the Lecture 1 2 Basics

More information

What will be on the midterm?

What will be on the midterm? What will be on the midterm? CS 178, Spring 2014 Marc Levoy Computer Science Department Stanford University General information 2 Monday, 7-9pm, Cubberly Auditorium (School of Edu) closed book, no notes

More information

ECEN 4606, UNDERGRADUATE OPTICS LAB

ECEN 4606, UNDERGRADUATE OPTICS LAB ECEN 4606, UNDERGRADUATE OPTICS LAB Lab 2: Imaging 1 the Telescope Original Version: Prof. McLeod SUMMARY: In this lab you will become familiar with the use of one or more lenses to create images of distant

More information

Computational Illumination

Computational Illumination Computational Illumination Course WebPage : http://www.merl.com/people/raskar/photo/ Ramesh Raskar Mitsubishi Electric Research Labs Ramesh Raskar, Computational Illumination Computational Illumination

More information

Chapter 36. Image Formation

Chapter 36. Image Formation Chapter 36 Image Formation Image of Formation Images can result when light rays encounter flat or curved surfaces between two media. Images can be formed either by reflection or refraction due to these

More information

Chapter 36. Image Formation

Chapter 36. Image Formation Chapter 36 Image Formation Notation for Mirrors and Lenses The object distance is the distance from the object to the mirror or lens Denoted by p The image distance is the distance from the image to the

More information

Announcement A total of 5 (five) late days are allowed for projects. Office hours

Announcement A total of 5 (five) late days are allowed for projects. Office hours Announcement A total of 5 (five) late days are allowed for projects. Office hours Me: 3:50-4:50pm Thursday (or by appointment) Jake: 12:30-1:30PM Monday and Wednesday Image Formation Digital Camera Film

More information

Full Resolution Lightfield Rendering

Full Resolution Lightfield Rendering Full Resolution Lightfield Rendering Andrew Lumsdaine Indiana University lums@cs.indiana.edu Todor Georgiev Adobe Systems tgeorgie@adobe.com Figure 1: Example of lightfield, normally rendered image, and

More information

Overview. Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image

Overview. Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image Camera & Color Overview Pinhole camera model Projective geometry Vanishing points and lines Projection matrix Cameras with Lenses Color Digital image Book: Hartley 6.1, Szeliski 2.1.5, 2.2, 2.3 The trip

More information

Modeling the calibration pipeline of the Lytro camera for high quality light-field image reconstruction

Modeling the calibration pipeline of the Lytro camera for high quality light-field image reconstruction 2013 IEEE International Conference on Computer Vision Modeling the calibration pipeline of the Lytro camera for high quality light-field image reconstruction Donghyeon Cho Minhaeng Lee Sunyeong Kim Yu-Wing

More information

EE119 Introduction to Optical Engineering Spring 2003 Final Exam. Name:

EE119 Introduction to Optical Engineering Spring 2003 Final Exam. Name: EE119 Introduction to Optical Engineering Spring 2003 Final Exam Name: SID: CLOSED BOOK. THREE 8 1/2 X 11 SHEETS OF NOTES, AND SCIENTIFIC POCKET CALCULATOR PERMITTED. TIME ALLOTTED: 180 MINUTES Fundamental

More information

Physics 431 Final Exam Examples (3:00-5:00 pm 12/16/2009) TIME ALLOTTED: 120 MINUTES Name: Signature:

Physics 431 Final Exam Examples (3:00-5:00 pm 12/16/2009) TIME ALLOTTED: 120 MINUTES Name: Signature: Physics 431 Final Exam Examples (3:00-5:00 pm 12/16/2009) TIME ALLOTTED: 120 MINUTES Name: PID: Signature: CLOSED BOOK. TWO 8 1/2 X 11 SHEET OF NOTES (double sided is allowed), AND SCIENTIFIC POCKET CALCULATOR

More information

Computational Photography

Computational Photography Computational photography Computational Photography Digital Visual Effects Yung-Yu Chuang wikipedia: Computational photography h refers broadly to computational imaging techniques that enhance or extend

More information

Spatial Augmented Reality: Special Effects in the Real World

Spatial Augmented Reality: Special Effects in the Real World Spatial Augmented Reality: Special Effects in the Real World Ramesh Raskar MIT Media Lab Cambridge, MA Poor Man s Palace Spatial Augmented Reality Raskar 2010 Poor Man s Palace Augment the world, projectors

More information

VC 11/12 T2 Image Formation

VC 11/12 T2 Image Formation VC 11/12 T2 Image Formation Mestrado em Ciência de Computadores Mestrado Integrado em Engenharia de Redes e Sistemas Informáticos Miguel Tavares Coimbra Outline Computer Vision? The Human Visual System

More information

Optical Signal Processing

Optical Signal Processing Optical Signal Processing ANTHONY VANDERLUGT North Carolina State University Raleigh, North Carolina A Wiley-Interscience Publication John Wiley & Sons, Inc. New York / Chichester / Brisbane / Toronto

More information

Image Formation. Dr. Gerhard Roth. COMP 4102A Winter 2015 Version 3

Image Formation. Dr. Gerhard Roth. COMP 4102A Winter 2015 Version 3 Image Formation Dr. Gerhard Roth COMP 4102A Winter 2015 Version 3 1 Image Formation Two type of images Intensity image encodes light intensities (passive sensor) Range (depth) image encodes shape and distance

More information

Time-Lapse Light Field Photography With a 7 DoF Arm

Time-Lapse Light Field Photography With a 7 DoF Arm Time-Lapse Light Field Photography With a 7 DoF Arm John Oberlin and Stefanie Tellex Abstract A photograph taken by a conventional camera captures the average intensity of light at each pixel, discarding

More information

High Resolution Spectral Video Capture & Computational Photography Xun Cao ( 曹汛 )

High Resolution Spectral Video Capture & Computational Photography Xun Cao ( 曹汛 ) High Resolution Spectral Video Capture & Computational Photography Xun Cao ( 曹汛 ) School of Electronic Science & Engineering Nanjing University caoxun@nju.edu.cn Dec 30th, 2015 Computational Photography

More information

6.098 Digital and Computational Photography Advanced Computational Photography. Bill Freeman Frédo Durand MIT - EECS

6.098 Digital and Computational Photography Advanced Computational Photography. Bill Freeman Frédo Durand MIT - EECS 6.098 Digital and Computational Photography 6.882 Advanced Computational Photography Bill Freeman Frédo Durand MIT - EECS Administrivia PSet 1 is out Due Thursday February 23 Digital SLR initiation? During

More information

A reprint from. American Scientist. the magazine of Sigma Xi, The Scientific Research Society

A reprint from. American Scientist. the magazine of Sigma Xi, The Scientific Research Society A reprint from American Scientist the magazine of Sigma Xi, The Scientific Research Society This reprint is provided for personal and noncommercial use. For any other use, please send a request Brian Hayes

More information

Deconvolution , , Computational Photography Fall 2018, Lecture 12

Deconvolution , , Computational Photography Fall 2018, Lecture 12 Deconvolution http://graphics.cs.cmu.edu/courses/15-463 15-463, 15-663, 15-862 Computational Photography Fall 2018, Lecture 12 Course announcements Homework 3 is out. - Due October 12 th. - Any questions?

More information

Project 4 Results http://www.cs.brown.edu/courses/cs129/results/proj4/jcmace/ http://www.cs.brown.edu/courses/cs129/results/proj4/damoreno/ http://www.cs.brown.edu/courses/csci1290/results/proj4/huag/

More information

Intro to Virtual Reality (Cont)

Intro to Virtual Reality (Cont) Lecture 37: Intro to Virtual Reality (Cont) Computer Graphics and Imaging UC Berkeley CS184/284A Overview of VR Topics Areas we will discuss over next few lectures VR Displays VR Rendering VR Imaging CS184/284A

More information

OPTICAL SYSTEMS OBJECTIVES

OPTICAL SYSTEMS OBJECTIVES 101 L7 OPTICAL SYSTEMS OBJECTIVES Aims Your aim here should be to acquire a working knowledge of the basic components of optical systems and understand their purpose, function and limitations in terms

More information

arxiv: v2 [cs.gr] 7 Dec 2015

arxiv: v2 [cs.gr] 7 Dec 2015 Light-Field Microscopy with a Consumer Light-Field Camera Lois Mignard-Debise INRIA, LP2N Bordeaux, France http://manao.inria.fr/perso/ lmignard/ Ivo Ihrke INRIA, LP2N Bordeaux, France arxiv:1508.03590v2

More information

Computational Photography Introduction

Computational Photography Introduction Computational Photography Introduction Jongmin Baek CS 478 Lecture Jan 9, 2012 Background Sales of digital cameras surpassed sales of film cameras in 2004. Digital cameras are cool Free film Instant display

More information

Point Spread Function Engineering for Scene Recovery. Changyin Zhou

Point Spread Function Engineering for Scene Recovery. Changyin Zhou Point Spread Function Engineering for Scene Recovery Changyin Zhou Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate School of Arts and Sciences

More information

Applications of Optics

Applications of Optics Nicholas J. Giordano www.cengage.com/physics/giordano Chapter 26 Applications of Optics Marilyn Akins, PhD Broome Community College Applications of Optics Many devices are based on the principles of optics

More information

EE119 Introduction to Optical Engineering Spring 2002 Final Exam. Name:

EE119 Introduction to Optical Engineering Spring 2002 Final Exam. Name: EE119 Introduction to Optical Engineering Spring 2002 Final Exam Name: SID: CLOSED BOOK. FOUR 8 1/2 X 11 SHEETS OF NOTES, AND SCIENTIFIC POCKET CALCULATOR PERMITTED. TIME ALLOTTED: 180 MINUTES Fundamental

More information

Short-course Compressive Sensing of Videos

Short-course Compressive Sensing of Videos Short-course Compressive Sensing of Videos Venue CVPR 2012, Providence, RI, USA June 16, 2012 Richard G. Baraniuk Mohit Gupta Aswin C. Sankaranarayanan Ashok Veeraraghavan Tutorial Outline Time Presenter

More information

Lecture 9. Lecture 9. t (min)

Lecture 9. Lecture 9. t (min) Sensitivity of the Eye Lecture 9 The eye is capable of dark adaptation. This comes about by opening of the iris, as well as a change in rod cell photochemistry fovea only least perceptible brightness 10

More information

Near-Invariant Blur for Depth and 2D Motion via Time-Varying Light Field Analysis

Near-Invariant Blur for Depth and 2D Motion via Time-Varying Light Field Analysis Near-Invariant Blur for Depth and 2D Motion via Time-Varying Light Field Analysis Yosuke Bando 1,2 Henry Holtzman 2 Ramesh Raskar 2 1 Toshiba Corporation 2 MIT Media Lab Defocus & Motion Blur PSF Depth

More information

Phy Ph s y 102 Lecture Lectur 21 Optical instruments 1

Phy Ph s y 102 Lecture Lectur 21 Optical instruments 1 Phys 102 Lecture 21 Optical instruments 1 Today we will... Learn how combinations of lenses form images Thin lens equation & magnification Learn about the compound microscope Eyepiece & objective Total

More information

Observational Astronomy

Observational Astronomy Observational Astronomy Instruments The telescope- instruments combination forms a tightly coupled system: Telescope = collecting photons and forming an image Instruments = registering and analyzing the

More information

The eye & corrective lenses

The eye & corrective lenses Phys 102 Lecture 20 The eye & corrective lenses 1 Today we will... Apply concepts from ray optics & lenses Simple optical instruments the camera & the eye Learn about the human eye Accommodation Myopia,

More information

Introduction to Computer Vision

Introduction to Computer Vision Introduction to Computer Vision CS / ECE 181B Thursday, April 1, 2004 Course Details HW #0 and HW #1 are available. Course web site http://www.ece.ucsb.edu/~manj/cs181b Syllabus, schedule, lecture notes,

More information

Computational Photography and Video. Prof. Marc Pollefeys

Computational Photography and Video. Prof. Marc Pollefeys Computational Photography and Video Prof. Marc Pollefeys Today s schedule Introduction of Computational Photography Course facts Syllabus Digital Photography What is computational photography Convergence

More information

Generalized Assorted Camera Arrays: Robust Cross-channel Registration and Applications Jason Holloway, Kaushik Mitra, Sanjeev Koppal, Ashok

Generalized Assorted Camera Arrays: Robust Cross-channel Registration and Applications Jason Holloway, Kaushik Mitra, Sanjeev Koppal, Ashok Generalized Assorted Camera Arrays: Robust Cross-channel Registration and Applications Jason Holloway, Kaushik Mitra, Sanjeev Koppal, Ashok Veeraraghavan Cross-modal Imaging Hyperspectral Cross-modal Imaging

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY. 2.71/2.710 Optics Spring 14 Practice Problems Posted May 11, 2014

MASSACHUSETTS INSTITUTE OF TECHNOLOGY. 2.71/2.710 Optics Spring 14 Practice Problems Posted May 11, 2014 MASSACHUSETTS INSTITUTE OF TECHNOLOGY 2.71/2.710 Optics Spring 14 Practice Problems Posted May 11, 2014 1. (Pedrotti 13-21) A glass plate is sprayed with uniform opaque particles. When a distant point

More information

PHYSICS. Chapter 35 Lecture FOR SCIENTISTS AND ENGINEERS A STRATEGIC APPROACH 4/E RANDALL D. KNIGHT

PHYSICS. Chapter 35 Lecture FOR SCIENTISTS AND ENGINEERS A STRATEGIC APPROACH 4/E RANDALL D. KNIGHT PHYSICS FOR SCIENTISTS AND ENGINEERS A STRATEGIC APPROACH 4/E Chapter 35 Lecture RANDALL D. KNIGHT Chapter 35 Optical Instruments IN THIS CHAPTER, you will learn about some common optical instruments and

More information

VC 16/17 TP2 Image Formation

VC 16/17 TP2 Image Formation VC 16/17 TP2 Image Formation Mestrado em Ciência de Computadores Mestrado Integrado em Engenharia de Redes e Sistemas Informáticos Hélder Filipe Pinto de Oliveira Outline Computer Vision? The Human Visual

More information

Phys 102 Lecture 21 Optical instruments

Phys 102 Lecture 21 Optical instruments Phys 102 Lecture 21 Optical instruments 1 Today we will... Learn how combinations of lenses form images Thin lens equation & magnification Learn about the compound microscope Eyepiece & objective Total

More information

CS6670: Computer Vision

CS6670: Computer Vision CS6670: Computer Vision Noah Snavely Lecture 4a: Cameras Source: S. Lazebnik Reading Szeliski chapter 2.2.3, 2.3 Image formation Let s design a camera Idea 1: put a piece of film in front of an object

More information

TSBB09 Image Sensors 2018-HT2. Image Formation Part 1

TSBB09 Image Sensors 2018-HT2. Image Formation Part 1 TSBB09 Image Sensors 2018-HT2 Image Formation Part 1 Basic physics Electromagnetic radiation consists of electromagnetic waves With energy That propagate through space The waves consist of transversal

More information

There is a range of distances over which objects will be in focus; this is called the depth of field of the lens. Objects closer or farther are

There is a range of distances over which objects will be in focus; this is called the depth of field of the lens. Objects closer or farther are Chapter 25 Optical Instruments Some Topics in Chapter 25 Cameras The Human Eye; Corrective Lenses Magnifying Glass Telescopes Compound Microscope Aberrations of Lenses and Mirrors Limits of Resolution

More information

Color Computer Vision Spring 2018, Lecture 15

Color Computer Vision Spring 2018, Lecture 15 Color http://www.cs.cmu.edu/~16385/ 16-385 Computer Vision Spring 2018, Lecture 15 Course announcements Homework 4 has been posted. - Due Friday March 23 rd (one-week homework!) - Any questions about the

More information

Physics 3340 Spring Fourier Optics

Physics 3340 Spring Fourier Optics Physics 3340 Spring 011 Purpose Fourier Optics In this experiment we will show how the Fraunhofer diffraction pattern or spatial Fourier transform of an object can be observed within an optical system.

More information

Color , , Computational Photography Fall 2017, Lecture 11

Color , , Computational Photography Fall 2017, Lecture 11 Color http://graphics.cs.cmu.edu/courses/15-463 15-463, 15-663, 15-862 Computational Photography Fall 2017, Lecture 11 Course announcements Homework 2 grades have been posted on Canvas. - Mean: 81.6% (HW1:

More information

Hexagonal Liquid Crystal Micro-Lens Array with Fast-Response Time for Enhancing Depth of Light Field Microscopy

Hexagonal Liquid Crystal Micro-Lens Array with Fast-Response Time for Enhancing Depth of Light Field Microscopy Hexagonal Liquid Crystal Micro-Lens Array with Fast-Response Time for Enhancing Depth of Light Field Microscopy Chih-Kai Deng 1, Hsiu-An Lin 1, Po-Yuan Hsieh 2, Yi-Pai Huang 2, Cheng-Huang Kuo 1 1 2 Institute

More information

Implementation of Adaptive Coded Aperture Imaging using a Digital Micro-Mirror Device for Defocus Deblurring

Implementation of Adaptive Coded Aperture Imaging using a Digital Micro-Mirror Device for Defocus Deblurring Implementation of Adaptive Coded Aperture Imaging using a Digital Micro-Mirror Device for Defocus Deblurring Ashill Chiranjan and Bernardt Duvenhage Defence, Peace, Safety and Security Council for Scientific

More information

Sampling and pixels. CS 178, Spring Marc Levoy Computer Science Department Stanford University. Begun 4/23, finished 4/25.

Sampling and pixels. CS 178, Spring Marc Levoy Computer Science Department Stanford University. Begun 4/23, finished 4/25. Sampling and pixels CS 178, Spring 2013 Begun 4/23, finished 4/25. Marc Levoy Computer Science Department Stanford University Why study sampling theory? Why do I sometimes get moiré artifacts in my images?

More information

Cameras. Shrinking the aperture. Camera trial #1. Pinhole camera. Digital Visual Effects Yung-Yu Chuang. Put a piece of film in front of an object.

Cameras. Shrinking the aperture. Camera trial #1. Pinhole camera. Digital Visual Effects Yung-Yu Chuang. Put a piece of film in front of an object. Camera trial #1 Cameras Digital Visual Effects Yung-Yu Chuang scene film with slides by Fredo Durand, Brian Curless, Steve Seitz and Alexei Efros Put a piece of film in front of an object. Pinhole camera

More information

VC 14/15 TP2 Image Formation

VC 14/15 TP2 Image Formation VC 14/15 TP2 Image Formation Mestrado em Ciência de Computadores Mestrado Integrado em Engenharia de Redes e Sistemas Informáticos Miguel Tavares Coimbra Outline Computer Vision? The Human Visual System

More information