Computational Sensors


Computational Sensors Suren Jayasuriya Postdoctoral Fellow, The Robotics Institute, Carnegie Mellon University

Class Announcements 1) Vote on this poll about the project checkpoint date on Piazza: https://piazza.com/class/j6dobp76al46ao?cid=126 2) Reminder: Start HW5 this week; if you haven't started by Wednesday, you should hurry.

Computational Photography optics to focus light on an image plane; a digital sensor to capture the focused light (an electrical process); arbitrary computation between sensor and image. Examples include: coded apertures, light fields, panorama stitching, and HDR imaging. This lecture will cover recent developments in computational sensors. Slide courtesy of Ioannis Gkioulekas

Review: Traditional CMOS Image Sensors Pixel stack (top to bottom): microlens, color filter, photodiode with potential well, and silicon readout circuitry; the potential well stores the photogenerated electrons. Slide courtesy of Ioannis Gkioulekas

Pixel Diagram and Operation 3T (3-Transistor) Pixel: - Each pixel has a reset, a source follower (amplifier), and a row-select transistor - The relative timing of switching these transistors on and off determines the exposure and readout of the pixel

Pixel Diagram and Operation RST = ON - The photodiode is charged to a high voltage (typically 1 to 3.3 V in modern technologies)

Pixel Diagram and Operation RST = OFF - The photodiode now integrates photocurrent onto its own (internal) capacitance - The voltage across the capacitor decreases as (negative) charge accumulates; photocurrent flows from cathode to anode - Eventually the pixel saturates, i.e. the voltage reaches 0

Pixel Diagram and Operation ROW = ON - Transistors Msf and Msel are turned on - The voltage is read out onto the column line, where it is sent to a column amplifier and then to an ADC to be digitized
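The integration phase described above can be sketched numerically. All component values here (reset voltage, capacitance, photocurrent) are illustrative assumptions, not figures from the lecture:

```python
# Toy model of the 3T pixel's integration phase.
# All component values are illustrative assumptions.
V_RESET = 3.3    # reset voltage (V)
C_PD = 10e-15    # photodiode capacitance (F), assumed
I_PH = 50e-15    # photocurrent (A), assumed

def pixel_voltage(t):
    """Photodiode voltage t seconds after RST turns off.

    Photocurrent discharges the photodiode capacitance roughly
    linearly until the pixel saturates at 0 V.
    """
    v = V_RESET - (I_PH / C_PD) * t
    return max(v, 0.0)  # saturation: the voltage cannot drop below 0

print(pixel_voltage(33e-3))  # voltage after a 33 ms exposure
print(pixel_voltage(10.0))   # a very long exposure saturates the pixel
```

Doubling the photocurrent or the exposure time doubles the voltage drop, which is why brighter scenes and longer exposures both push the pixel toward saturation.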

Rolling Shutter CMOS Image Sensor Slide courtesy of Jinwei Gu

[Timing diagram: image rows vs. time. Each row's exposure window starts slightly after the previous row's and is followed by that row's readout, giving the staggered rolling-shutter pattern. Slide courtesy of Jinwei Gu]

Advantages and Disadvantages of Rolling Shutter Advantages: - Easy to read out the image sensor; space-efficient in column-parallel readout - No need for extra memory to store pixel voltages (unlike global shutter) Disadvantages: - Rolling-shutter effect - Can affect the performance of computer vision algorithms such as structure-from-motion, SLAM, and stereo if not handled carefully
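The staggered row timing behind the rolling-shutter effect can be sketched as follows (the line time and exposure time are assumed for illustration):

```python
# Row-wise exposure windows under a rolling shutter.
# T_LINE and T_EXP are assumed illustrative values.
N_ROWS = 480
T_LINE = 30e-6   # offset between consecutive row starts (s)
T_EXP = 10e-3    # exposure time, identical for every row (s)

def exposure_window(row):
    """(start, end) of the exposure interval for a given row.

    Every row integrates for the same duration, but row r starts
    r * T_LINE later than row 0; this staggering is what distorts
    moving objects (the rolling-shutter effect).
    """
    start = row * T_LINE
    return (start, start + T_EXP)

# The last row starts (N_ROWS - 1) * T_LINE after the first one.
print(exposure_window(N_ROWS - 1)[0] - exposure_window(0)[0])
```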

Computational Sensors But what is different about computational sensors vs. regular sensors? Some options: (1) Change the readout/timing (2) Change the pixel design itself (3) Change the on-chip processing optics to focus light on an image plane digital sensor to capture focused light (electrical process) arbitrary computation between sensor and image

Computational Sensors But what is different about computational sensors vs. regular sensors? Some options: (1) Change the readout/timing (2) Change the pixel design itself (3) Change the on-chip processing [Photo: CMOS image sensor]

Change the Readout and Timing Gu et al, Coded Rolling Shutter Photography: Flexible Space-Time Sampling ICCP 2010


Change the Readout and Timing
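As a toy illustration of the flexible space-time sampling idea (not the paper's actual coding schemes), one can assign different exposure times to interleaved rows within a single rolling-shutter readout, e.g. to combine them later for HDR:

```python
# Toy interleaved-exposure pattern: odd rows get a short exposure,
# even rows a long one, within a single rolling-shutter readout.
# The specific pattern and exposure times are assumptions for illustration.
def row_exposures(n_rows, t_short, t_long):
    """Per-row exposure times for a two-way interleaved pattern."""
    return [t_short if r % 2 else t_long for r in range(n_rows)]

exposures = row_exposures(6, t_short=1e-3, t_long=8e-3)
print(exposures)
```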

Computational Sensors But what is different about computational sensors vs. regular sensors? Some options: (1) Change the readout/timing (2) Change the pixel design itself (3) Change the on-chip processing [Pixel stack diagram: microlens, color filter, photodiode with potential well, silicon readout circuitry]

ASPs: A New Type of Pixel A. Wang and A. Molnar, A Light Field Image Sensor in 180nm CMOS, JSSC 2012

Capturing Light Fields using ASPs A. Wang and A. Molnar, A Light Field Image Sensor in 180nm CMOS, JSSC 2012

Capturing Light Fields using ASPs n-well Photodiode A. Wang and A. Molnar, A Light Field Image Sensor in 180nm CMOS, JSSC 2012

Capturing Light Fields using ASPs n-well Photodiode A. Wang and A. Molnar, A Light Field Image Sensor in 180nm CMOS, JSSC 2012

ASPs A. Wang and A. Molnar, A Light Field Image Sensor in 180nm CMOS, JSSC 2012

Operating Principle: Talbot Effect A plane wave incident on a grating generates a periodic diffraction pattern [figure scale markers: ~1 µm, 0.5 mm]

[Simulated diffraction intensity vs. depth (µm) and lateral position x (µm): as the incident angle changes, the pattern shifts laterally]

Add an Analyzer Grating [Figure: detector output for 0-degree vs. 10-degree incidence]

CMOS Implementation [Cross-section, top to bottom: passivation; metal layers M6 down to M1 (interconnect metallization) embedded in inter-metal dielectric (SiO2); n-well photodiode in the p-substrate]

Angle Response


Angle Response $V_{out} = I_0 A(\theta)\left[1 + m\cos(\beta\theta + \alpha)\right]$

Quadrature Inversion
$V_0 = I_0 A(\theta)\left[1 + m\cos(\beta\theta)\right]$
$V_{\pi/2} = I_0 A(\theta)\left[1 - m\sin(\beta\theta)\right]$
$V_{\pi} = I_0 A(\theta)\left[1 - m\cos(\beta\theta)\right]$
$V_{3\pi/2} = I_0 A(\theta)\left[1 + m\sin(\beta\theta)\right]$
Intensity: $I_0 A(\theta) = \frac{V_0 + V_{\pi}}{2} = \frac{V_{\pi/2} + V_{3\pi/2}}{2}$
Incident angle: $\theta = \frac{1}{\beta}\tan^{-1}\!\left(\frac{V_{3\pi/2} - V_{\pi/2}}{V_0 - V_{\pi}}\right)$
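These formulas can be checked numerically: synthesize the four quadrature outputs for a known incident angle, then invert them. The values of beta, m, and the intensity are assumed for illustration:

```python
import math

# Synthesize the four quadrature ASP outputs for a known incident
# angle, then recover intensity and angle with the inversion formulas.
beta = 12.0    # angular sensitivity (assumed)
m = 0.5        # modulation depth (assumed)
I0A = 2.0      # intensity times aperture response, I_0 * A(theta) (assumed)
theta = 0.03   # true incident angle (rad)

V0  = I0A * (1 + m * math.cos(beta * theta))   # alpha = 0
Vq  = I0A * (1 - m * math.sin(beta * theta))   # alpha = pi/2
Vpi = I0A * (1 - m * math.cos(beta * theta))   # alpha = pi
V3q = I0A * (1 + m * math.sin(beta * theta))   # alpha = 3*pi/2

intensity = (V0 + Vpi) / 2                     # modulation terms cancel
angle = math.atan2(V3q - Vq, V0 - Vpi) / beta  # sin difference over cos difference

print(intensity, angle)
```

Both the intensity and the incident angle are recovered exactly in this noiseless setting.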

2D ASP Tile (10 µm) [Physical layout and 2D impulse responses for the tile's ASP variants, which sweep angular frequency (low/medium/high $\beta$), grating orientation $\gamma$ (0, 45, 90 degrees), and phase $\alpha$] $\rho_{\alpha,\beta,\gamma}(\theta) = \frac{1}{2} + \frac{m}{2}\cos\left(\beta\cos(\gamma)\,\theta_x + \beta\sin(\gamma)\,\theta_y + \alpha\right)$
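A minimal evaluation of the tile response, with assumed beta, gamma, and alpha values, shows how the grating orientation gamma selects which angular component a pixel senses:

```python
import math

# Evaluate the 2D ASP tile response rho for assumed parameter values.
def rho(theta_x, theta_y, beta, gamma, alpha, m=1.0):
    """rho = 1/2 + (m/2) cos(beta*(cos(gamma)*tx + sin(gamma)*ty) + alpha)."""
    phase = beta * (math.cos(gamma) * theta_x + math.sin(gamma) * theta_y) + alpha
    return 0.5 + 0.5 * m * math.cos(phase)

# A gamma = 0 pixel responds to theta_x; rotating the grating to
# gamma = 90 degrees makes an otherwise identical pixel respond to theta_y.
print(rho(0.1, 0.0, beta=10.0, gamma=0.0, alpha=0.0))
print(rho(0.0, 0.1, beta=10.0, gamma=math.pi / 2, alpha=0.0))
```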

ASP Camera We tile the entire image sensor with this repeated pattern of different ASP pixels The sensor is fabricated in an unmodified CMOS process

Experimental Setup Chip Package Prototype Setup ASP Sensor Main Lens

ASP Light Field Capture $i(x, y) = \iint \rho_{\alpha,\beta,\gamma}(\theta, \phi)\, l(x, y, \theta, \phi)\, d\theta\, d\phi$ Each pixel modulates the light field with a different angular response function $\rho_{\alpha,\beta,\gamma}(\theta) = \frac{1}{2} + \frac{m}{2}\cos\left(\beta\cos(\gamma)\,\theta_x + \beta\sin(\gamma)\,\theta_y + \alpha\right)$

ASP Light Field Capture Model the image capture process as a linear system: $i = F\,l$, where each row of $F$ is one ASP pixel's angular response

ASP Light Field Capture Linear reconstruction: $l_{\text{downsampled}} = F^{-1} i$ We can invert this equation using linear methods by reducing the resolution of the 4D light field; the resulting reconstruction is low resolution due to the spatio-angular tradeoff
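A toy version of i = F l and its linear inversion, with a random full-rank matrix standing in for the stacked ASP responses (an assumption for illustration; the real F is built from the measured angular responses):

```python
import numpy as np

# Toy forward model i = F l and its linear inversion.
rng = np.random.default_rng(0)

n = 16                         # number of (downsampled) light-field samples
l_true = rng.random(n)         # flattened ground-truth light field
F = rng.random((n, n))         # stand-in for stacked ASP responses (assumed)

i = F @ l_true                 # image formation: i = F l
l_rec = np.linalg.solve(F, i)  # linear inversion (square F here; in
                               # general a pseudo-inverse is used)

print(np.allclose(l_rec, l_true))
```

In the noiseless, square case the recovery is exact; the real system trades spatial resolution for the angular samples, which is the spatio-angular tradeoff mentioned above.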

Compressive Light Field Photography [Captured 2D image = ASP projection of the 4D light field]

Compressive Light Field Photography [Captured 2D image = ASP projection × overcomplete dictionary × sparse coefficients]

Decomposing light fields into sparse representations Original light field = dictionary × coefficient vector, such that the coefficient vector is sparse; using an overcomplete dictionary can lead to fewer non-zero coefficients

Dictionary Learning Learn a dictionary in which training light fields are sparse: sample 1,000,000 random 4D patches from the training light fields and solve with the K-SVD algorithm [Marwah et al. 2013]

Compressive Light Field Reconstruction [Captured 2D image = ASP projection × overcomplete dictionary × coefficients] Basis pursuit denoise: minimize the $\ell_1$ norm of the coefficients subject to reproducing the measurements up to a noise tolerance
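A minimal sparse-recovery sketch in the spirit of basis pursuit denoising, using ISTA (iterative soft thresholding) on a synthetic system; the matrix, sizes, and parameters are all assumptions, and real pipelines use dedicated BPDN solvers:

```python
import numpy as np

# ISTA for l1-regularized least squares, standing in for basis
# pursuit denoise. A plays the role of the combined matrix F @ D;
# every size and value below is a synthetic assumption.
rng = np.random.default_rng(1)

n_meas, n_atoms = 40, 100
A = rng.standard_normal((n_meas, n_atoms))
A /= np.linalg.norm(A, axis=0)              # unit-norm dictionary columns

a_true = np.zeros(n_atoms)
a_true[rng.choice(n_atoms, 3, replace=False)] = [1.5, -2.0, 1.0]
i = A @ a_true                              # noiseless measurements

a = np.zeros(n_atoms)
step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1/L, L = Lipschitz constant of the gradient
lam = 0.01                                  # sparsity weight
for _ in range(3000):
    a = a + step * (A.T @ (i - A @ a))                        # gradient step on data term
    a = np.sign(a) * np.maximum(np.abs(a) - step * lam, 0.0)  # soft threshold

print(np.abs(a - a_true).max())             # residual error of the sparse code
```

With only 3 non-zeros among 100 atoms and 40 measurements, the sparse code is recovered to within the small bias introduced by the l1 penalty.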

Experimental results

Comparison of reconstruction methods

Digital refocusing after the picture has been taken Focused on Swan Focused on Knight

Computational Sensors But what is different about computational sensors vs. regular sensors? Some options: (1) Change the readout/timing (2) Change the pixel design itself (3) Change the on-chip processing [Pixel stack diagram: microlens, color filter, photodiode with potential well, silicon readout circuitry]

Event-Based Cameras (also called Dynamic Vision Sensors) Dynamic Vision Sensor

Event-Based Cameras (also called Dynamic Vision Sensors) Concept Figure for Event-Based Camera https://www.youtube.com/watch?v=kpczesvfhoq High-Speed Output on a Quadcopter https://www.youtube.com/watch?v=lauq6lWTkxM Dynamic Vision Sensor

Applications of Event Based Cameras Kim et al, Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera ECCV 2016 (Best Paper Award!)

Time-of-Flight (TOF) Imaging and Transient Imaging Microsoft Kinect V2 Single Photon Avalanche Diodes (SPAD) Streak Cameras

Time-of-Flight (TOF) Imaging and Transient Imaging More on TOF and transient imaging in future lectures Single Photon Avalanche Diodes (SPAD) Streak Cameras

Computational Sensors But what is different about computational sensors vs. regular sensors? Some options: (1) Change the readout/timing (2) Change the pixel design itself (3) Change the on-chip processing

On-Chip Image Compression Chen et al, A CMOS Image Sensor with On-Chip Image Compression Based on Predictive Boundary Adaptation and Memoryless QTD Algorithm, VLSI 2011

On-Chip CNNs Design of an on-chip mixed-signal ADC for implementing a CNN on chip. Goal: energy-efficient computer vision. LiKamWa et al., RedEye, ISCA 2016

Future of Computational Image Sensors Tighter integration of hardware and software, spanning programming languages to computer architecture to circuits to optics Image sensors customized for specific applications (machine vision, scientific imaging, etc.) New pixel/sensing technologies: MEMS, photonics, 3D stacking, etc. What do you predict?

References
Basic reading: A. El Gamal and H. Eltoukhy, "CMOS Image Sensors," IEEE Circuits and Devices Magazine, 2005.
Additional readings:
J. Gu et al., "Coded Rolling Shutter Photography: Flexible Space-Time Sampling," ICCP 2010.
M. Hirsch et al., "A Switchable Light Field Camera Architecture using Angle Sensitive Pixels and Dictionary-based Sparse Coding," ICCP 2014.
H. Kim et al., "Real-Time 3D Reconstruction and 6-DoF Tracking with an Event Camera," ECCV 2016.
R. LiKamWa et al., "RedEye: Analog ConvNet Image Sensor for Continuous Mobile Vision," ISCA 2016.