New Directions in Imaging Sensors Ravi Athale, MITRE Corporation OIDA Annual Forum 19 November 2008


We live in the ____ age: information? biotech? nano? neurotech? quantum? Regardless of the answer, we live in an age of IMAGES!
Photo removed due to copyright restrictions: a person using a cell phone to take a photo of a fire or explosion.
Images (clockwise from upper left) from US Govt Agencies: NSA/ESA; 9-11 Commission; NIMH; NIH.

Exponential Growth in Camera Technology
Stand-alone digital cameras:
1991: Kodak DCS-100, 1280x1024 pixels, $30,000
2008: Kodak Easyshare V1003, 10 megapixel, $170
Total digital camera volume > 150 million
Cellphone cameras:
1997: First baby birth recorded on a cell phone camera (VGA resolution)
2008: Samsung SCH-B600, 10 megapixel; 30% of cell phones contain cameras
Total cell phone volume to reach 1 billion
Courtesy of Barry Hendry (Wikipedia)

Mammoth Camera: 1900
In 1900, George R. Lawrence built this mammoth 900 lb. camera, then the world's largest, for $5,000 (enough to purchase a large house at that time!). It took 15 men to move and operate the gigantic camera. The photographer was commissioned by the Chicago & Alton Railway to make the largest photograph (the plate was 8 x 4.5 ft in size!) of its train for the company's pamphlet "The Largest Photograph in the World of the Handsomest Train in the World."
World's Smallest Cameras: 2006
http://www.letsgodigital.org/en/8687/omnivision_camerachip_ov6920/
http://www.medigus.com/camera_1_8_mm/camera.aspx
OmniVision OV6920 sensor, 2.1 x 2.3 mm; PillCam; Medigus Introspicio camera, 1.8 mm, 326x382 pixels (Medigus Corp., an Israeli medical imaging company; 1.8 mm endoscope)
But... the basic camera architecture remained unchanged for over 100 years.

Other Observations:
Detector arrays at visible wavelengths are scaling up very rapidly: 100 Mpixel is available, and a gigapixel is possible (1.2 micron pixels over a 35 mm square array).
Conventional imaging optics (wide FOV, high resolution) scales very poorly: heavy, bulky, expensive.
Governing principles: maximum sample rate for all parameters everywhere; fixed resource allocation; measure everything, then process.
Information is unevenly distributed => most of the megapixels contain very little to no information.
Large data volume (multi-GB/frame) overwhelms processing and communications.
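As a quick sanity check on the "gigapixel possible" claim above, here is a minimal sketch of the arithmetic (my own illustration; the slide only quotes the 1.2 micron pixel pitch and the 35 mm square array):

```python
# Sanity-check arithmetic (not from the slides): pixel count for a
# 1.2 micron pixel pitch tiled over a 35 mm x 35 mm detector array.
pixel_pitch_um = 1.2      # assumed pixel pitch, microns
array_side_mm = 35.0      # assumed square array side, millimetres

pixels_per_side = (array_side_mm * 1000.0) / pixel_pitch_um
total_pixels = pixels_per_side ** 2
print(f"{pixels_per_side:.0f} pixels per side, {total_pixels / 1e9:.2f} gigapixels")
# -> about 29167 pixels per side, ~0.85 gigapixels, i.e. gigapixel-class
```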

What is the nature of the problem? The coming data tsunami: storing, moving, and processing data. An IDC report finds data storage technology falling behind data generation (primarily driven by still images and video). A worsening pixel-to-pupil ratio: <20% of images ever get looked at (and that is an optimistic number). We are in an era that is pixel rich but information poor. One solution: invoke Moore's Law to make the problems go away. The other approach: change our basic notions about imaging.

Imaging Sensors: Back to Basics
Questions we ask: Who / What, Where, When (sensing); How, Why (analysis, exploitation).
Two primary sensing modes: proximate and stand-off. (Photo courtesy of D Sharon Pruitt on Flickr. Photo courtesy of anjamation on Flickr.)
Stand-off sensing involves wave propagation, which carries energy and information over distance without material transport but scrambles the spatial organization of the signals.
Two aspects to processing: source coding (how object information is encoded in the wavefront) and channel distortion.

Taking pictures => scene interrogation
WORLD -> Sensor / Acquisition (front end) -> Useable Information -> User / Exploitation (back end) -> Decision / Action
Useable information is the key concept, and it depends on the user.
This is a break from the past paradigm of a generic front-end sensor generating a 2D pixel map, with application-specific tasks performed in back-end computation. Useable information for a navigation task is different from that for a target-recognition task.
The primary goal is to acquire the 3D spatial, spectral, polarization, and temporal information that is relevant to the task at hand in the most resource-efficient manner.
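As a purely illustrative sketch of this task-dependent notion of useable information (the task names and modality lists below are hypothetical, not from the talk), the back end could declare what it needs and the front end would acquire only that, instead of always returning a full 2D pixel map:

```python
# Hypothetical illustration of task-driven scene interrogation.
def acquire(scene_measurements: dict, task: str) -> dict:
    """Return only the measurements the given task treats as useable information."""
    needs = {
        "navigation": ("depth", "temporal"),
        "target_recognition": ("spatial", "spectral", "polarization"),
    }
    # A conventional generic front end would always return the full pixel map.
    return {m: scene_measurements[m] for m in needs[task]}

scene = {"spatial": 0, "spectral": 0, "polarization": 0, "depth": 0, "temporal": 0}
print(acquire(scene, "navigation"))  # same scene, task-specific useable information
```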

Future Directions for Imaging Sensors
"Cameras will also change form. Today, they are basically film cameras without the film, which makes about as much sense as automobiles circa 1910, which were horse-drawn carriages without the horse. A car owner of that time would be pretty shocked by what's in a showroom now. Camera stores of the future will surprise us just as much."
Nathan Myhrvold, former chief technology officer of Microsoft and a co-founder of Intellectual Ventures, NY Times, 5 June 2006

Where Are Imaging Sensors Headed: Extending the Automotive Analogy
Horse-drawn carriage -> horse-less carriage; film cameras -> film-less cameras. Specialization? Autonomy?
Courtesy of M Skaffari on Flickr. Courtesy of digitpedia on Flickr. Images (clockwise from upper left): DARPA, US Army, USDA, NASA.

Reworking Biological Inspiration: The Human Eye and the Camera
Replacing film with a CCD made sense when cameras were used exclusively by humans. Does it make sense for autonomous and semi-autonomous systems?
The animal world shows a far greater diversity of imaging sensor designs: co-evolution of eye, brain, and locomotion; task-specific sensor design; efficient use of resources.

SOME EXAMPLES OF NEW CAMERA DESIGNS AND OPERATION

Prototype Camera (Stanford U)
Courtesy of Ren Ng. Used with permission.
Contax medium format camera, Kodak 16-megapixel sensor, Adaptive Optics microlens array: 125 µm square-sided microlenses
4000 x 4000 pixels / 292 x 292 lenses = ~14 x 14 pixels per lens
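The spatial/angular split implied by these numbers can be checked directly (simple arithmetic on the figures quoted above):

```python
# 4000 x 4000 sensor pixels behind a 292 x 292 microlens array.
sensor_pixels_per_axis = 4000
microlenses_per_axis = 292

pixels_per_lens = sensor_pixels_per_axis / microlenses_per_axis
print(f"{pixels_per_lens:.1f} pixels per microlens per axis")  # ~13.7, i.e. ~14 x 14
# The camera trades a 4000 x 4000 spatial image for a 292 x 292 spatial image
# with roughly 14 x 14 angular (directional) samples under each microlens.
```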

Extending the Depth of Field (Stanford U)
Courtesy of Ren Ng. Used with permission.
Conventional photograph, main lens at f/4
Conventional photograph, main lens at f/22
Light field, main lens at f/4, after the all-focus algorithm [Agarwala 2004]
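For intuition, synthetic refocusing over a captured light field can be sketched as shift-and-add over sub-aperture images. The following is a minimal illustration of that idea only; it is not the actual algorithm used in Ng's camera or in [Agarwala 2004]:

```python
import numpy as np

def refocus(subaperture_images, shift_per_view):
    """Shift each (u, v) sub-aperture view in proportion to its angular
    coordinate, then average. Varying shift_per_view moves the synthetic
    focal plane. np.roll wraps at the borders; a real implementation
    would pad or crop instead."""
    acc = None
    for (u, v), img in subaperture_images.items():
        shifted = np.roll(img,
                          (int(round(v * shift_per_view)),
                           int(round(u * shift_per_view))),
                          axis=(0, 1))
        acc = shifted if acc is None else acc + shifted
    return acc / len(subaperture_images)
```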

Our Modification of the Light Field Camera: Flexible Modality Imaging
A light field architecture facilitates placing multidimensional diversity in the camera's pupil plane:
Color information (for example) is available at each spatial location (s,t) from each filter-array image
Spatial resolution comes from the pinholes; filter resolution comes from the number of filters
Ref: Horstmeyer, R., G. W. Euliss, R. A. Athale, and M. Levoy. "Flexible Multimodal Camera Using a Light Field Architecture." Proceedings of IEEE ICCP, 2009.
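To make the idea concrete, here is a minimal sketch of how one low-resolution image per pupil-plane filter might be pulled out of a pinhole-array capture. It assumes an idealized, perfectly regular geometry with no calibration; the parameter names are my own, not from the paper:

```python
import numpy as np

def demultiplex(sensor, pinhole_pitch, filters_per_side):
    """Under every pinhole the sensor sees a small image of the pupil-plane
    filter array, so filter (i, j) corresponds to one fixed offset inside
    each pinhole_pitch x pinhole_pitch macropixel."""
    h, w = sensor.shape
    rows, cols = h // pinhole_pitch, w // pinhole_pitch
    step = pinhole_pitch // filters_per_side
    images = {}
    for i in range(filters_per_side):
        for j in range(filters_per_side):
            y0 = i * step + step // 2
            x0 = j * step + step // 2
            images[(i, j)] = sensor[y0::pinhole_pitch, x0::pinhole_pitch][:rows, :cols]
    return images  # one low-resolution image per filter position
```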

Experimental Results
Conventional Nikon 50 mm f/1.8 lens; 10 Mpixel CCD with 9 µm pixels
Pinhole arrays printed on transparencies, with varying size and pitch
Filters cut and arranged on laser-cut plastic holders, placed inside the lens over the aperture stop
Left and lower center images © 2009 IEEE. Courtesy of IEEE. Used with permission. Source: Horstmeyer, R., G. W. Euliss, R. A. Athale, and M. Levoy. "Flexible Multimodal Camera Using a Light Field Architecture." Proceedings of IEEE ICCP, 2009.

Experimental Results
Nine filters: color = R, G, B, Y, C; neutral density = 0.4, 0.6, 1
Pinhole r = 25 µm, pitch = 250 µm
The three ND filters are used to extend dynamic range (CMYK with a density filter; HDR)
Image panels: RGB, CMYK, HDR. Images courtesy of SPIE. Used with permission. Source: Horstmeyer, R., R. A. Athale, and G. Euliss. "Light Field Architecture for Reconfigurable Multimode Imaging." Proc. of SPIE 7468, August 2009. doi:10.1117/12.828653
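The dynamic-range extension can be illustrated with a minimal merge over the three ND captures. This is a simplified sketch of the idea only, not the processing pipeline from the paper:

```python
import numpy as np

def merge_nd_exposures(images, nd_densities, saturation=0.98):
    """Combine simultaneous captures taken through neutral-density filters of
    optical density D by rescaling each by its attenuation factor 10**D and
    averaging the unsaturated samples (images assumed normalized to [0, 1])."""
    acc = np.zeros_like(np.asarray(images[0], dtype=float))
    weight = np.zeros_like(acc)
    for im, d in zip(images, nd_densities):
        im = np.asarray(im, dtype=float)
        valid = im < saturation              # ignore clipped pixels
        acc += np.where(valid, im * (10.0 ** d), 0.0)
        weight += valid
    return acc / np.maximum(weight, 1)       # HDR estimate of scene radiance
```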

Experimental Results
Sixteen filters: the layout combines color, IR, polarization, and ND filters
Image © 2009 IEEE. Courtesy of IEEE. Used with permission. Source: Horstmeyer, R., G. W. Euliss, R. A. Athale, and M. Levoy. "Flexible Multimodal Camera Using a Light Field Architecture." Proceedings of IEEE ICCP, 2009.

Thin Observation Module by Bound Optics (TOMBO)
A compound image is collected via a microlens array
A high-resolution image is reconstructed from the sub-images
The architecture enables a reduction in size and weight
See Tanida et al., Applied Optics 40, 1806-1813 (2001)
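A heavily simplified sketch of the reconstruction idea follows. It assumes exact, known sub-pixel shifts between the sub-images; the actual TOMBO reconstruction in Tanida et al. is considerably more involved (registration, pixel rearrangement, and post-processing):

```python
import numpy as np

def interlace(subimages):
    """Interleave an n x n grid of low-resolution sub-images, each assumed to
    be shifted by 1/n pixel steps, into one image n times larger per axis."""
    n = int(round(len(subimages) ** 0.5))
    h, w = subimages[0].shape
    hires = np.zeros((h * n, w * n), dtype=float)
    for k, im in enumerate(subimages):
        dy, dx = divmod(k, n)
        hires[dy::n, dx::n] = im
    return hires
```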

Examples of Scene Interrogation Systems: The Same Scaling Analysis Doesn't Apply
Light-field cameras: Adobe light field camera array (2008). See http://www.notcot.com/archives/2008/02/adobe-lightfiel.php
Time-of-flight imaging: Mesa Imaging SR3100 3D camera. See http://www.flickr.com/photos/81381691@N00/3720851779/
Active pixel sensors: Pixim D2500 Orca chipset for wide dynamic range video (e.g. surveillance). See http://www.pixim.com/products-and-technology/pixim-orca-chipsets
Foveation: Nova Sensors (image of demonstration removed due to copyright restrictions)
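For the time-of-flight entry, the underlying relation is simply range from the round-trip delay of the modulated illumination; a minimal sketch (standard physics, not specific to the SR3100):

```python
C = 299_792_458.0  # speed of light, m/s

def range_from_delay(round_trip_delay_s):
    """Target distance in metres from a measured round-trip delay in seconds."""
    return C * round_trip_delay_s / 2.0

print(range_from_delay(20e-9))  # a 20 ns round trip corresponds to ~3 m
```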

Final Thought: A Personal Imaging Assistant (PIA) for:
Health care: checking for sunburns, the status of superficial wounds, ear infections
Appearance: wardrobe matching (colors and styles) while getting dressed or shopping; makeup assistance (skin color analysis)
Hygiene: cleanliness of surroundings (presence of bacteria); water and food safety and quality
Relationships: remembering people, names, likes/dislikes, family details; discerning moods (boredom, deceit, amorous intents...)
And, of course, taking pictures and videos without manual intervention, based on user preferences learned over time
How? Multi-spectral, polarimetric, day/night, active/passive illumination; powerful processing; an unobtrusive (almost covert) form factor; part of getting dressed

MIT OpenCourseWare http://ocw.mit.edu
MAS.531 / MAS.131 Computational Camera and Photography, Fall 2009
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.