FCam: An architecture for computational cameras
Dr. Kari Pulli, Research Fellow, Palo Alto
What is computational photography?
- All cameras have optics + sensors, but the images have limitations: they cannot give the same visual experience as the human eye
- Typical CP: take several images, combine them, compute to tease out more information and create better images
- Other approaches experiment with changing the camera itself; we mostly take the camera hardware as given, but want to get the most out of it
High-Dynamic-Range Photography
[figure: several differently exposed images combined into one HDR image]
Panoramic Photography
Flash-No-Flash Photography
Traditional Computational Photography
- With high-end cameras: big optics and sensors -> high image quality
- Mostly in the lab: researchers, professionals, hard-core hobbyists; camera on a tripod, static or controlled situation
- Offline computational photography: processing done later, offline on a PC; no user interaction during capture
Mobile Computational Photography
- From labs to everybody: camera phones are consumer products
- Camera phone challenges: small optics and sensors -> high noise; handheld
- Online computational photography: interactive loop between user, computation, and imaging
- On-device processing: instant gratification; share immediately; get immediate feedback (do I need to recapture?)
But (mobile) camera APIs are not flexible
- Only the basic, simple use cases are supported
- No access to most settings (e.g., absolute exposure) or raw data
- Changing even those simple settings flushes the pipeline, incurring a half-second delay
- No control over metering or focus algorithms beyond initiating the autofocus routine
- No support for applications that require quick changes of sensor or lens settings, that need to know which settings apply to which image, or that need raw frames
The FCam Architecture
- A software architecture for programmable cameras that attempts to expose the maximum device capabilities while remaining easy to program
The Sensor
- The sensor has no visible state: it is a pipeline that converts requests for images (Shots) into images
- The Shot specifies all parameters to be used in that image's capture
Other Devices
- Other devices (like the lens and flash) can:
  - schedule Actions to be triggered at a given time during an exposure
  - tag returned images with extra metadata
Control Algorithms
- No hidden daemon running autofocus/metering: nobody changes the settings under you
- Programmer has full control over sensor settings, plus access to the supplemental statistics the ISP computes for each frame
Implementations
HDR viewfinder
Final HDR Result
- Created completely on-camera
Automatic Panorama Capture
- Automatically captures high-resolution images as the user pans
- Alternates exposures to extend dynamic range
HDR Panorama
Lucky Imaging: hand-held long exposures
- Attach an inertial measurement unit with a 3-axis gyro to the N900
- Estimate whether a captured image suffers from handshake; keep capturing if it does
- Allows hand-held 1/4-second exposures with little extra capture time
Low-noise Viewfinder and Capture
- Viewfinder combines multiple aligned frames; averaging reduces noise
- Capture a high-gain/short-exposure image and a low-gain/long-exposure image; combine into a sharp image with little noise
Long Exposure - Blurry
Short Exposure - Noisy
Result
Double-flash example
- Uses the F2 Frankencamera and two Canon flash units
- Controls the flash units during the exposure: low-intensity strobing followed by a second-curtain flash
Public release plans
- We plan to publicly release FCam for the N900 in July, shortly before SIGGRAPH, on garage.maemo.org