A Kalman-Filtering Approach to High Dynamic Range Imaging for Measurement Applications

A Kalman-Filtering Approach to High Dynamic Range Imaging for Measurement Applications
IEEE Transactions on Image Processing, Vol. 21, No. 2, 2012
Eric Dedrick and Daniel Lau
Presented by Ran Shu
School of Electrical Engineering and Computer Science, Kyungpook National Univ.

Abstract
- High dynamic range imaging (HDRI)
  » Constructing radiance maps for measurement applications has received little attention
  » The dynamic range of a scene often exceeds what an image sensor can capture in a single exposure
- Proposed: a novel HDRI method based on pixel-by-pixel Kalman filtering
- Performance evaluated using a proposed objective metric
- Presented experiments show a 9.4-dB improvement in signal-to-noise ratio and a 29% improvement in radiometric accuracy over the classic method
2/37

Introduction
- The dynamic range of a scene often exceeds what a camera can capture in a single exposure
  » Resulting in over- and underexposed areas
- Multiexposure techniques
  » Fuse several exposures into a single composite image with higher dynamic range
3/37

Previous multiexposure methods
- Using the exposure-time ratio and pixel-value mappings between exposures
  » Obtaining a parametric camera response function
  » Exposure fusion performed using a weighted average
- Using the reciprocity relation, exposure times, and a smoothness constraint
  » Constructing a nonparametric camera response function
  » Exposure fusion performed using a weighted average
- Recovering the camera response function in the presence of camera white balancing
- Estimating radiance uncertainties
  » Based on the statistics of pooled pixels
  » Performing exposure fusion by iteratively updating radiance estimates, using a weighting term based on the estimated noise variance
4/37

HDRI for infrared cameras and temperature measurement
- Using blackbodies at controlled temperatures for calibration
  » Finding the response function of a camera with an InSb image sensor, using emissivities
- Performing exposure fusion
  » Computing the radiance of each pixel from the exposure with the longest exposure time at which the pixel is not saturated

HDRI for thermographic applications
- Recovering the inverse camera response function, mapping pixel values to radiance
5/37

Proposed method
- Postulating an HDRI method targeting measurement applications, rooted in solid-state image sensor models
  » Weights used in exposure fusion are based on the noise present in the acquired exposures
  » Estimating the uncertainty in the radiance estimates provides useful information to the application
- Presenting a new method of HDRI
  » Camera response functions vary independently across the image sensor, making the term "pixel response function" more appropriate
- Calibration procedure
  » Improves performance across the sensor array
  » Corrects pixelwise nonuniformity (measurement noise power, scene illumination, optical vignetting) caused by the sensor array
6/37

Calibration limitations
- Must remain unchanged
  » Illumination conditions
  » Camera parameters (focus and aperture size)
- Changes that limit accuracy
  » Depth-dependent changes in illumination when the object distance differs too much from the calibration setup
- The performance of the approach depends on controlled camera settings and environmental conditions
7/37

Achieving HDR
- Applying Kalman filtering independently at each pixel location across multiple exposures
- Demonstrating the usefulness of the proposed method
  » Introducing objective metrics
  » Evaluating accuracy and performance
  » Comparing to classic HDRI techniques
8/37

Camera calibration
- Calibration is performed using a spectrally flat white-balance card
  » Correcting any fixed-pattern spatial nonuniformities due to illumination, optical vignetting, and sensor noise
  » Not sensitive to the type of illuminant
- The reflected radiant intensities of the white-balance card and of the target reflector both scale identically with variations in illumination intensity
  » Reflectance measurements are therefore not sensitive to illuminant intensity
9/37

Selected camera
- Linear camera response function
- Without noise, the response of each pixel is

  z = A T r + B    (1)

  where z is the output of a particular pixel, T is the exposure time, r is the scene radiance at this pixel location, and A and B are parameters to be determined for each pixel through calibration.
10/37

Spatial nonuniformities are due to
- Lighting
- Vignetting fall-off
- Sensor fixed-pattern noise

Fig. 1. Exposure of a uniform target taken using a 10-bit camera. Nonuniformities are evident.
11/37

Acquiring exposures of the calibration card
- Varying the exposure times allows A and B to be determined
  » A: pixel gain coefficients
  » B: pixel offset coefficients

Fig. 2. (a) Gain and (b) offset coefficients of the pixel response functions. (c) Variance of a sequence of 49 exposures taken with identical camera parameters.
12/37
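
The gain and offset calibration on this slide amounts to a per-pixel linear least-squares fit across the stack of white-card exposures. Below is a minimal sketch, assuming the frames are stacked in a NumPy array and the white-balance card defines r = 1; the function name and array layout are illustrative, not taken from the paper.

```python
import numpy as np

def fit_pixel_response(exposures, times):
    """Per-pixel least-squares fit of z = A*T*r + B with r = 1 (white card).

    exposures: (N, H, W) array of raw frames of the calibration card
    times:     (N,) array of exposure times
    Returns the per-pixel gain A and offset B, each of shape (H, W).
    """
    N, H, W = exposures.shape
    # Design matrix of the linear model z = A*T + B (the card fixes r = 1)
    X = np.column_stack([times, np.ones(N)])           # (N, 2)
    Z = exposures.reshape(N, H * W).astype(float)      # one column per pixel
    coeffs, *_ = np.linalg.lstsq(X, Z, rcond=None)     # (2, H*W)
    A = coeffs[0].reshape(H, W)
    B = coeffs[1].reshape(H, W)
    return A, B
```

Solving every pixel in a single lstsq call keeps the fit vectorized instead of looping over the sensor array.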

Two noise sources
- Dark current and photo-response nonuniformity
  » Corrected by the pixel gain and offset terms of the pixel response functions
- Zero-mean noise (shot noise and read noise)
  » Not uniform across the sensor
  » A few isolated pixels with large variances are suppressed
- Modeled for a particular pixel as

  R = C T r + D    (2)

  where R is the measurement noise power, C and D are parameters to be determined for each pixel through linear regression, and the calibration card defines r = 1.
13/37
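
The measurement-noise model of Eq. (2) can be fitted the same way, by regressing the per-pixel sample variance of repeated frames against exposure time (again with r = 1 on the card). A minimal sketch, assuming one stack of repeated frames is available per exposure time; the data layout and names are illustrative, not from the paper.

```python
import numpy as np

def fit_noise_power(stacks, times):
    """Per-pixel fit of the measurement-noise model R = C*T*r + D with r = 1.

    stacks: list of (M, H, W) arrays, one stack of M repeated frames per
            exposure time (assumed layout)
    times:  (N,) exposure times, one per stack
    Returns the per-pixel coefficients C and D, each of shape (H, W).
    """
    # Per-pixel sample variance at each exposure time
    variances = np.stack([s.var(axis=0, ddof=1) for s in stacks])  # (N, H, W)
    N, H, W = variances.shape
    X = np.column_stack([times, np.ones(N)])        # linear in T, as in Eq. (2)
    Y = variances.reshape(N, H * W)
    coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return coeffs[0].reshape(H, W), coeffs[1].reshape(H, W)
```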

Coefficients of the measurement noise power model

Fig. 3. (a) Gain and (b) offset coefficients of the measurement noise power model. (c) Estimated process noise power Q.
14/37

Pixel response model
- Including process noise and measurement noise
- Estimating the process noise power from the residual error

  z = A T r + B + n        (3)
  σ_z^2 = A^2 T^2 Q + R    (4)

  where Q is the process noise power at a particular pixel location.
- Any outlier in the calibration data affects the estimate of the process noise power
  » Occasionally occurs in some isolated pixels
15/37
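
Given the fitted A, B, C, and D, Eq. (4) lets the process noise power be backed out of the residual error of the response fit. The sketch below averages a single-frame residual estimate over exposure times and clips negative values to suppress the isolated outlier pixels mentioned above; both choices are assumptions of this illustration, not details from the paper.

```python
import numpy as np

def estimate_process_noise(exposures, times, A, B, C, D):
    """Estimate the per-pixel process noise power Q from sigma_z^2 = A^2 T^2 Q + R
    (Eq. 4), with the white-balance card defining r = 1 so that R = C*T + D."""
    Q_samples = []
    for frame, T in zip(exposures, times):
        residual_sq = (frame - (A * T + B)) ** 2      # squared residual of the fit
        R = C * T + D                                 # measurement noise power, Eq. (2)
        Q_samples.append((residual_sq - R) / (A ** 2 * T ** 2))
    # Average over exposure times and clip negative estimates caused by outliers
    return np.clip(np.mean(Q_samples, axis=0), 0.0, None)
```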

Shot noise process
- Described by a Poisson distribution
- Approaches a Gaussian distribution as the expected number of occurrences increases

Fig. 4. (Open circles) Probability density function of a Poisson distribution overlaid with the probability density function of a Gaussian distribution. Both distributions have a mean and variance of 10.
16/37
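
As a quick numerical check of the approximation shown in Fig. 4, the Poisson probabilities for a mean of 10 can be compared with a Gaussian of equal mean and variance (values chosen only for illustration):

```python
import numpy as np
from scipy.stats import norm, poisson

mean = 10
k = np.arange(0, 25)
pmf = poisson.pmf(k, mu=mean)                       # shot-noise (Poisson) probabilities
pdf = norm.pdf(k, loc=mean, scale=np.sqrt(mean))    # Gaussian with the same mean/variance
print(f"max deviation: {np.abs(pmf - pdf).max():.4f}")
```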

Camera selection: Prosilica GC640
- Micron MT9V203 CMOS image sensor
- Operated in fully manual mode
  » Eliminating the need to compensate for in-camera features
  » Reducing quantization noise and not introducing compression artifacts
17/37

Radiance estimation: HDRI based on Kalman filtering
- The selected camera has a linear response function in the chosen operating region
- Applying a Gaussian noise model
- The state x of a linear system, expressed in state-space form:

  x_k = Φ_{k-1} x_{k-1} + Γ_{k-1} u_{k-1} + w_{k-1}    (5)

  where Φ_{k-1} governs the time evolution of the system, Γ_{k-1} is a weight applied to the control u_{k-1}, and w_{k-1} is additive white Gaussian process noise.
- Measurements of the state are given by the measurement model

  z_k = H_k x_k + v_k    (6)

  where z_k is an observation vector, H_k relates the state to the observation, and v_k is additive white Gaussian measurement noise.
18/37

Expected value operator

  E[w_k w_k^T] = Q_k,   E[v_k v_k^T] = R_k,   E[v_k w_k^T] = 0

Optimal estimates of the system state are generated recursively with the Kalman filter:

  x̂_k^- = Φ_{k-1} x̂_{k-1} + Γ_{k-1} u_{k-1}                      (7)
  P_k^- = Φ_{k-1} P_{k-1} Φ_{k-1}^T + Q_{k-1}                     (8)
  K_k = P_k^- H_k^T (H_k P_k^- H_k^T + R_k)^{-1}                  (9)
  x̂_k = x̂_k^- + K_k (z_k - H_k x̂_k^-)                            (10)
  P_k = (I - K_k H_k) P_k^- (I - K_k H_k)^T + K_k R_k K_k^T       (11)

  where P_k = E[(x_k - x̂_k)(x_k - x̂_k)^T], the "^-" and plain symbols denote the a priori and a posteriori estimates, P is an estimate of the covariance, I is the identity matrix, and K_k is the Kalman gain.
19/37
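
Equations (7)-(11) are the standard Kalman predict/update cycle; in matrix form, one step might look like the following textbook sketch (not code from the paper):

```python
import numpy as np

def kalman_step(x_prev, P_prev, Phi, Gamma, u, Q, z, H, R):
    """One predict/update cycle implementing Eqs. (7)-(11)."""
    # Predict (a priori estimates)
    x_pred = Phi @ x_prev + Gamma @ u                        # (7)
    P_pred = Phi @ P_prev @ Phi.T + Q                        # (8)
    # Update (a posteriori estimates)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)   # (9)
    x_new = x_pred + K @ (z - H @ x_pred)                    # (10)
    I = np.eye(P_pred.shape[0])
    P_new = (I - K @ H) @ P_pred @ (I - K @ H).T + K @ R @ K.T  # (11), Joseph form
    return x_new, P_new
```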

Reciprocity relation
- States that the response of a pixel is a function of the product of the scene radiance at that location and the exposure time
- The general process and measurement models reduce to simpler scalar forms
  » Under the assumption of a static scene

  r_k = r_{k-1}                  (12)
  z_k = A T_k r_k + B + n_k      (13)

  where A and B are the gain and offset parameters of a particular pixel, determined from calibration.
20/37

Simplified Kalman filter
- Used to estimate the radiance at each pixel location
  » The procedure is performed independently for each pixel
  » Each pixel has its own filter

  r̂_k^- = r̂_{k-1}                                   (14)
  P_k^- = P_{k-1} + Q_{k-1}                          (15)
  K_k = A T_k P_k^- / (A^2 T_k^2 P_k^- + R_k)        (16)
  r̂_k = r̂_k^- + K_k (z_k - A T_k r̂_k^- - B)         (17)
  P_k = (1 - K_k A T_k)^2 P_k^- + K_k^2 R_k          (18)
21/37
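
Since the state is the scalar radiance of a single pixel, Eqs. (14)-(18) reduce to a few lines run once per pixel location. The sketch below is a plausible implementation under stated assumptions: the 10-bit saturation threshold, the diffuse initialization, and evaluating the noise-power model of Eq. (2) at the current radiance estimate are choices made for this illustration, not details taken from the paper.

```python
def estimate_radiance(pixel_values, times, A, B, C, D, Q, saturation=1023):
    """Scalar Kalman filter of Eqs. (14)-(18) for one pixel location.

    pixel_values: raw values z_k of this pixel across the exposure sequence
    times:        the corresponding exposure times T_k
    Returns the radiance estimate and its uncertainty (error variance P)."""
    r_hat, P = 0.0, 1e12                 # diffuse prior on the radiance
    for z, T in zip(pixel_values, times):
        if z >= saturation:              # skip saturated samples
            continue
        # Predict: static scene (Eq. 12), so only the uncertainty grows
        P = P + Q                                         # (14)-(15)
        # Measurement noise power from the calibrated model (Eq. 2),
        # evaluated at the current estimate (an assumption of this sketch)
        R = C * T * max(r_hat, 0.0) + D
        # Update
        K = A * T * P / (A**2 * T**2 * P + R)             # (16)
        r_hat = r_hat + K * (z - A * T * r_hat - B)       # (17)
        P = (1.0 - K * A * T)**2 * P + K**2 * R           # (18)
    return r_hat, P
```

Running this filter independently at every pixel yields both the radiance map and the per-pixel uncertainty map of Fig. 9.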

Performance analysis
- Comparing the performance of the proposed method against previous methods
- Based on exposure sequences of a Gretag-Macbeth color chart illuminated by a 60-W incandescent source

Fig. 5. Exposures used as inputs to the HDRI algorithms. Exposure times were 0.5, 1.0, 2.5, 5.0, 6.5, 8.0, 15.5, 23.0, 35.5, and 65.5 ms.
22/37

Proposed characterization of algorithm performance
- Objective metric using SNR as a measure of precision
- Radiance ratio test used as a measure of accuracy
- Camera response functions of previous methods

Fig. 6. Camera response functions computed by the Debevec, Mitsunaga, and Robertson methods.
23/37

Number of usable samples and HDR results

Fig. 7. Number of usable samples in the sequence at each pixel location.
Fig. 8. Radiance estimates generated using the (a) Debevec, (b) Akyuz, (c) Robertson, (d) Mitsunaga, (e) Richards, and (f) Kalman-filtering methods.
24/37

Uncertainty estimates generated by the Kalman approach

Fig. 9. Estimates of the uncertainty in relative radiance generated by the Kalman-filtering approach.
25/37

Subtle differences between the HDR images
- Considered for measurement techniques by applying objective metrics
  » Beginning with their associated SNRs as measures of uniformity

  SNR = 20 log_10(μ_r / σ_r)    (19)

  where μ_r, the mean radiance over the six fully visible patches in the first row of the input images, is taken as the signal amplitude, and σ_r, the standard deviation of the radiance, is taken as the noise amplitude.
26/37
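
For one uniform chart patch, the metric of Eq. (19) is just the mean-to-standard-deviation ratio expressed in decibels; a minimal sketch (extraction of the six gray patches is left to the caller):

```python
import numpy as np

def patch_snr(radiance_patch):
    """SNR of Eq. (19): mean radiance of a uniform patch as the signal
    amplitude, its standard deviation as the noise amplitude (in dB)."""
    mu = float(np.mean(radiance_patch))
    sigma = float(np.std(radiance_patch, ddof=1))
    return 20.0 * np.log10(mu / sigma)
```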

SNRs of the original exposure sequence

Table 1. SNR of the exposures shown in Fig. 5.
27/37

SNRs of the various HDRI techniques

Table 2. SNR of the HDR images generated from the sequence in Fig. 5.
28/37

Accuracy of radiance estimates
- Relative radiance is easier to obtain than absolute radiance
- A radiance ratio test was devised
  » Comparing measured luminance values
  » To the CIELAB coordinates of the Gretag-Macbeth color chart
- Converting luminance values to relative luminance
  » Averaged over each patch
  » Divided by the lightest reference checker
  » Yielding a ratio independent of the white-point normalization factor
29/37
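
One way to implement the radiance ratio test described above: average the recovered luminance over each chart patch and normalize by the lightest patch, which cancels the white-point normalization factor before comparison with the CIELAB-derived reference values. The mask-based patch layout and function name are assumptions of this sketch.

```python
import numpy as np

def relative_reflectances(luminance_map, patch_masks, white_index=0):
    """Average luminance over each chart patch, normalized by the lightest
    reference patch so the ratios are independent of the white point.

    patch_masks: boolean masks, one per chart patch (assumed layout)
    white_index: index of the lightest reference checker within patch_masks
    """
    means = np.array([luminance_map[mask].mean() for mask in patch_masks])
    return means / means[white_index]
```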

Relative reflectances of the HDR images generated from the sequence in Fig. 5

Table 3. Relative reflectances of the HDR images generated from the sequence in Fig. 5.
30/37

Examining Figs. 8 and 9
- The uncertainty in relative radiance is closely related to the relative radiance itself
  » Substituting (16) into (18) and iterating k times gives

    P_k = P_0 / (1 + P_0 Σ_{i=1}^{k} A^2 T_i^2 / R_i)    (20)

  » Dividing through by P_0 and letting P_0 → ∞

    P_k = 1 / (Σ_{i=1}^{k} A^2 T_i^2 / R_i)              (21)

  » which further simplifies to

    P_k = 1 / (A^2 Σ_{i=1}^{k} T_i^2 / R_i)              (22)
31/37
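
A small numerical check of Eq. (22): with no process noise (Q = 0), iterating the scalar updates of Eqs. (16) and (18) from a near-infinite prior should reproduce the closed-form uncertainty. The values below are arbitrary and purely illustrative:

```python
import numpy as np

# Arbitrary per-pixel gain, exposure times, and noise powers for illustration
A = 2.0
T = np.array([0.5, 1.0, 2.0, 4.0])
R = np.array([3.0, 5.0, 9.0, 17.0])

P = 1e9                                        # near-infinite prior uncertainty, Q = 0
for Ti, Ri in zip(T, R):
    K = A * Ti * P / (A**2 * Ti**2 * P + Ri)   # Eq. (16)
    P = (1 - K * A * Ti)**2 * P + K**2 * Ri    # Eq. (18)

closed_form = 1.0 / (A**2 * np.sum(T**2 / R))  # Eq. (22)
print(P, closed_form)                          # agree to numerical precision
```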

» Experimentally verified that D is typically less than 1% of C T r, so R_i ≈ C T_i r and

  P_k ≈ C r / (A^2 Σ_{i=1}^{k} T_i)

» Sampling an HDR scene, with the exposure time doubled between each exposure (T_i = 2^{i-1} T_1):

  P_k ≈ C r / (A^2 (2^k - 1) T_1)    (23)

A second input set: a sequence of exposures with much lower dynamic range, with exposure times repeated.

Fig. 10. Second set of exposures used as inputs to the different HDRI techniques. Exposure times are 2, 2, 2, 2.5, 2.5, and 2.5 ms.
32/37

Table 4. SNR of the exposures shown in Fig. 10.
Table 5. SNR of the HDR images generated from the input sequence in Fig. 10.
33/37

Table 6. Relative reflectances of the HDR images generated from the sequence in Fig. 10.
34/37

Relative radiance estimates of the Kalman-filtering approach

Fig. 11. Relative radiance estimates of the Kalman-filtering approach.
Fig. 12. Relative radiance estimates of the Kalman-filtering approach with no process noise.
35/37

Frames from an HDR video sequence

Fig. 13. (a) A moving matchbox and (b) a candle, with the exposure times chosen for the (c) matchbox and (d) candle video sequences.
36/37

Conclusions
- HDRI based on Kalman filtering
- Proposed objective quality metrics to assess precision and accuracy
- Useful for measurement applications
37/37