Issues in Color Correcting Digital Images of Unknown Origin


Vlad C. Cardei, Brian Funt and Michael Brockington
vcardei@cs.sfu.ca, funt@cs.sfu.ca, brocking@sfu.ca
School of Computing Science, Simon Fraser University, Burnaby, B.C., V5A 1S6, Canada

Abstract

Color correcting images of unknown origin (e.g. downloaded from the Internet) adds additional challenges to the already difficult problem of color correction, because neither the pre-processing the image was subjected to, nor the camera sensors, nor the camera balance are known. In this paper we propose a framework for dealing with some aspects of this type of image. In particular, we discuss the issue of color correction of images where an unknown non-linearity may be present. We show that the diagonal model used for color correcting linear images also works in the case of gamma-corrected images. We also discuss the influence that unknown sensors and unknown camera balance have on color constancy algorithms.

Keywords: computer vision, color constancy, image processing

Introduction

Color constancy is an under-determined problem and is thus impossible to solve in the most general case. Among the many constraints that have been implicitly introduced by various color constancy algorithms [1-5], sensor calibration and image linearity are the most common.

The color of a surface appearing in an image is determined in part by its surface reflectance and in part by the spectral power distribution of the light illuminating it. Thus, as is well known, a variation in the scene illumination changes the color of the surface as it appears in an image. This creates problems for computer vision systems, such as color-based object recognition [6], and for digital cameras [7]. For a human observer, however, the perceived color shifts due to changes in illumination are relatively small. In other words, humans exhibit a relatively high degree of color constancy [8].
From a computational perspective, we define the goal of color constancy as the computation of an image with the same colors as would have been obtained by the same camera for the same scene under a standard, canonical illuminant. We see this as a two-stage process: estimate the chromaticity of the illumination, and then correct the image colors based on this estimate. One way to estimate the illumination is to have a white patch in the image, the chromaticity of which will then be the chromaticity of the illuminant. Alternatively, a more sophisticated color constancy method can be employed [1-5]. After estimating the illuminant's chromaticity, the scene can then be color corrected [9] based on a diagonal, or coefficient-rule, transformation. In this paper we will use the term color correction to denote the diagonal transformation of an image based on the coefficients computed from the estimate of the color of the illuminant given by a color constancy algorithm.

In general, existing color constancy algorithms [1-5], which estimate the incident scene illumination, rely in one way or another on knowing something about the camera being used, as well as on assumptions about the statistical properties of the expected illuminants and surface reflectances. Estimating the chromaticity of the illumination in an image of unknown origin poses a new set of challenges. First of all, not knowing the sensor sensitivity curves of the camera means that even for a known surface under a known illuminant we will not be able to predict its RGB value. Figure 1 shows how much the chromaticities in the rg-chromaticity space (defined as r = R/[R+G+B] and g = G/[R+G+B]) can vary between cameras. It shows the rg chromaticities of the Macbeth Colorchecker patches that would be obtained by a Sony DXC-930 and a Kodak DCS460 camera, both color balanced for the same illuminant. The data for Figure 1 were synthesized from the known camera response curves to avoid the values being disrupted by noise or other artifacts [10]. Although the white values coincide, as they must given that the cameras were balanced identically, there is a substantial difference between the chromaticities from the two cameras for many of the other patches.

Figure 1. Variation in chromaticity response of two digital cameras (Kodak DCS460 and Sony DXC-930).

A further problem for color constancy on images of unknown origin is that we do not know the illuminant for which the camera was balanced. Even if two images are taken with the same camera, the output will be different for different color balance settings.

Yet another unknown is the camera's response as a function of intensity. Cameras often have a non-linear response, the main parameter of which is often known as the camera's gamma (γ). For a variety of reasons [11], different cameras may have different gamma values or, alternatively, may produce linear output (γ = 1). In this paper we will use the following definition of camera gamma:

(1)  I = S D^γ

where I is the resulting luminance, S is the camera gain, and D is a pixel value in the 0..1 range. A typical value of γ is 0.45; however, the results below apply for any reasonable value of γ.

Although the chromaticity of white or gray (R = G = B) is preserved, a change in γ will distort most other chromaticities, with the general effect being to desaturate colors:

(2)  r' = R^γ / (R^γ + G^γ + B^γ)    g' = G^γ / (R^γ + G^γ + B^γ)

Usually r' ≠ r and g' ≠ g.

In the following sections we present a framework for dealing with each of the above issues related to illumination estimation and color correction, created by lack of knowledge about a camera's sensitivity functions and its gamma.
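The desaturating effect of gamma on chromaticities can be checked numerically. The sketch below (plain Python; the pixel values and the scale factor are illustrative, not taken from the paper) applies a gamma of 0.45 and compares rg-chromaticities before and after; it also checks that a uniform brightness scaling cancels out of the post-gamma chromaticity, a point discussed further below.

```python
# Effect of gamma on rg-chromaticities: gray is preserved, other colors
# are pulled toward the neutral point (desaturated), and a uniform
# brightness scaling cancels out even after gamma is applied.

def rg_chromaticity(R, G, B):
    """Return (r, g) with r = R/(R+G+B) and g = G/(R+G+B)."""
    s = R + G + B
    return (R / s, G / s)

def apply_gamma(R, G, B, gamma=0.45):
    """Per-channel power law, as in equation (1)."""
    return (R ** gamma, G ** gamma, B ** gamma)

# A saturated reddish pixel (channel values in the 0..1 range).
r, g = rg_chromaticity(0.8, 0.2, 0.1)
r2, g2 = rg_chromaticity(*apply_gamma(0.8, 0.2, 0.1))

# After gamma the chromaticity moves toward the neutral (1/3, 1/3).
assert abs(r2 - 1 / 3) < abs(r - 1 / 3)

# A gray pixel (R = G = B) is unaffected.
gray = rg_chromaticity(*apply_gamma(0.5, 0.5, 0.5))
assert abs(gray[0] - 1 / 3) < 1e-9 and abs(gray[1] - 1 / 3) < 1e-9

# Scaling the linear RGB by k leaves the post-gamma chromaticity alone.
k = 3.7
r3, g3 = rg_chromaticity(*apply_gamma(k * 0.8, k * 0.2, k * 0.1))
assert abs(r3 - r2) < 1e-9 and abs(g3 - g2) < 1e-9
```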
The effect of gamma on color correction

In terms of the effect of gamma on color correction, a crucial question is whether or not the diagonal model, which has been shown to work well on linear image data [9], still holds once the non-linearity of gamma is introduced. We address this question both empirically and theoretically.

Consider an n-by-3 matrix Q1 of RGB values of pixels from an image seen under illuminant E1, and a similar matrix Q2 containing RGB values from the same image but seen under illuminant E2. According to the diagonal model of illumination change, there exists a diagonal matrix M such that

(3)  Q1 = Q2 M

It must be noted that M depends only on the illuminants E1 and E2, and does not depend on the pixel values in the images. In particular, if (R1, G1, B1) are the RGB values of white under illuminant E1 and (R2, G2, B2) are the RGB values of white under illuminant E2, then M is given by

(4)  M = diag(R1/R2, G1/G2, B1/B2)
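The diagonal transform of equations (3) and (4) can be sketched in a few lines of plain Python; the white RGB values below are hypothetical, chosen only to illustrate the mechanics.

```python
# Diagonal (coefficient-rule) color correction: scale each channel by
# the ratio of the white RGBs under the two illuminants (equation 4).

def diagonal_from_whites(white1, white2):
    """M = diag(R1/R2, G1/G2, B1/B2): maps illuminant-2 RGBs to illuminant 1."""
    return tuple(w1 / w2 for w1, w2 in zip(white1, white2))

def correct(pixel, M):
    """Apply the diagonal transform to one RGB pixel (equation 3)."""
    return tuple(p * m for p, m in zip(pixel, M))

# Hypothetical whites under a canonical and a reddish scene illuminant.
white_canonical = (200.0, 200.0, 200.0)
white_scene = (250.0, 200.0, 120.0)

M = diagonal_from_whites(white_canonical, white_scene)

# The scene white is mapped onto the canonical white.
for a, b in zip(correct(white_scene, M), white_canonical):
    assert abs(a - b) < 1e-9
```

By construction the transform is exact for the white point; how well it does on the remaining scene colors is precisely the question the paper examines next.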

For the purposes of this paper, let M^γ denote element-by-element exponentiation of the elements of matrix M. In the case where the diagonal model M holds exactly for linear images, then for images to which a non-linear gamma has been applied, the diagonal transformation matrix becomes M^γ:

(5)  Q1^γ = Q2^γ M^γ

In general, the diagonal model does not hold exactly, due to broad or overlapping camera sensors, so the transformation matrix will also contain small off-diagonal terms [12]. These off-diagonal terms are amplified by the introduction of gamma. To explore the effects of gamma on the off-diagonal terms, we evaluate the diagonal transformation between two synthesized images generated using the spectral reflectances of the 24 patches of the Macbeth Colorchecker. One image is synthesized relative to CIE illuminant A and the other relative to D65. We used the spectral sensitivities of the Sony DXC-930 camera and scaled the resulting RGBs to [0..1]. If A is the matrix of synthesized RGBs under illuminant A and D is the matrix of RGBs under illuminant D65, the transformation from matrix D to A is given by:

(6)  D M = A

For linear image data, the best (non-diagonal) transformation matrix M and the best diagonal matrix M_D (in the least-squares sense) are found to be

(7)      | 4.225  0.372  0.45  |               | 3.886   0     0    |
     M = | 0.66   2.27   0.48  |   and   M_D = |  0     2.36   0    |
         | 0.76   0.32   0.792 |               |  0      0    0.82  |

These transformation matrices are computed to minimize the mean square error, using the pseudo-inverse:

(8)  M = D* A

where D* denotes the pseudo-inverse of the matrix D. The error of the transformation is computed between the estimated effect of the illuminant change, E = D M, and the actual RGB values under A. For the non-diagonal case, the RMS error E_linear = 0.106, the average error µ_linear = 0.088, and the standard deviation σ_linear = 0.06. In the perceptually uniform CIE Lab space, the average error µ_Lab = 2.04 and the standard deviation σ_Lab = 1.56. The diagonal elements of M_D are close to those of M, but not equal to them. The difference compensates for the effect of constraining the non-diagonal terms to 0.
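The fit of equation (8) can be reproduced with any least-squares solver. As a minimal sketch in plain Python (with made-up patch values rather than the synthesized Sony DXC-930 data), the diagonal-constrained version of the pseudo-inverse fit decouples into one independent least-squares ratio per channel:

```python
# Best diagonal matrix M_D mapping per-channel values d -> a in the
# least-squares sense. Constraining M to be diagonal decouples the fit:
# for each channel, m = sum(d*a) / sum(d*d).
import math

def best_diagonal(D, A):
    """D, A: lists of (R, G, B) rows under the two illuminants."""
    M = []
    for c in range(3):
        num = sum(d[c] * a[c] for d, a in zip(D, A))
        den = sum(d[c] * d[c] for d in D)
        M.append(num / den)
    return M

def rms_error(D, A, M):
    """RMS residual of the diagonal model over all patches and channels."""
    sq = [(d[c] * M[c] - a[c]) ** 2 for d, a in zip(D, A) for c in range(3)]
    return math.sqrt(sum(sq) / len(sq))

# Toy "patches": the A rows are roughly 2x, 1x, 0.5x the D rows per
# channel, plus a small perturbation, so the fit should land near that.
D = [(0.2, 0.5, 0.8), (0.6, 0.3, 0.4), (0.9, 0.7, 0.1)]
A = [(0.41, 0.50, 0.39), (1.19, 0.31, 0.21), (1.80, 0.69, 0.05)]

M = best_diagonal(D, A)
assert abs(M[0] - 2.0) < 0.05 and abs(M[1] - 1.0) < 0.05 and abs(M[2] - 0.5) < 0.05
assert rms_error(D, A, M) < 0.02
```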
We can expect the errors for the diagonal transformation to be somewhat higher. Using the diagonal transformation M_D, the RMS error in RGB space is E_linear = 0.229, the average error µ_linear = 0.192, and the standard deviation σ_linear = 0.128. In CIE Lab space, the average error µ_Lab = 3.36 and the standard deviation σ_Lab = 2.3. Although these errors are almost twice as large as for the full non-diagonal linear transformation, they are still quite small and show that a diagonal transformation provides a good model of illumination change.

To determine the effect of gamma on the effectiveness of the diagonal model, we took the previously synthesized data and applied a gamma of 1/2.2. In this case, the best transformation M and the best diagonal transformation M_D are

(9)      | 2.02  0.86  0.24 |               | 1.855   0      0    |
     M = | 0.38  1.38  0.52 |   and   M_D = |  0     1.43    0    |
         | 0.38  0.94  0.95 |               |  0      0    0.877  |

The RMS error using M is E = 0.076, with average error µ = 0.067 and standard deviation σ = 0.037. In CIE Lab space, the average error is µ_Lab = 1.6, with standard deviation σ_Lab = 0.69. For M_D, the RMS error in RGB space is E = 0.226, the average error µ = 0.18, and the standard deviation σ = 0.13. In CIE Lab space, the average error µ_Lab = 2.4, with standard deviation σ_Lab = 1.39. These errors are comparable to those for the linear case above.
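A quick numeric check of equation (5): when the diagonal model holds exactly on the linear data, raising each diagonal coefficient to the power γ gives the exact diagonal transform for the gamma-corrected data. The coefficients and pixel values below are illustrative.

```python
# If a2 = m * a1 holds exactly per channel (diagonal model on linear
# data), then a2**g == (m**g) * (a1**g), so the gamma'd data obey a
# diagonal model with coefficients m**g -- equation (5).

g = 1 / 2.2
M = (2.0, 1.1, 0.5)   # illustrative diagonal coefficients
pixels = [(0.2, 0.5, 0.8), (0.6, 0.3, 0.4), (0.9, 0.7, 0.1)]

for p in pixels:
    linear_out = [p[c] * M[c] for c in range(3)]          # corrected, linear
    gamma_in = [p[c] ** g for c in range(3)]              # gamma'd input
    gamma_out = [v ** g for v in linear_out]              # gamma'd output
    predicted = [gamma_in[c] * M[c] ** g for c in range(3)]  # M**g model
    for a, b in zip(gamma_out, predicted):
        assert abs(a - b) < 1e-12
```

When the linear diagonal model is only approximate, the identity no longer holds exactly; the residual is what the error figures above quantify.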

These results indicate that the diagonal model still holds in the case of images to which a non-linear gamma has been applied, even when the diagonal model provides only an approximate model of illumination change for the linear data.

Another issue in the color correction of images of unknown gamma has to do with the effects of a brightness scaling of the form (R, G, B) → (kR, kG, kB). A brightness scaling may result either from a change in incident illumination or camera exposure settings, or it may be applied as a normalization step during color correction. In either case, it turns out that a brightness change does not affect a pixel's chromaticity, even when gamma has been applied. Consider a pixel (R, G, B) from a linear image, with red chromaticity r = R/(R+G+B). After gamma, its red chromaticity will be

(10)  r' = R^γ / (R^γ + G^γ + B^γ)

In the linear case, any brightness scaling leaves the chromaticity unchanged. In the non-linear case, the red chromaticity of a brightness-scaled pixel will be

(11)  r'_N = (kR)^γ / ((kR)^γ + (kG)^γ + (kB)^γ) = k^γ R^γ / (k^γ (R^γ + G^γ + B^γ)) = r'

Similar results hold for the other chromaticity channels, so brightness changes do not affect the chromaticities in gamma-corrected images. Note, however, that this does not mean that the chromaticity of a pixel is the same before and after the application of gamma.

Color correction on non-linear images

We have shown thus far that, whether or not gamma has been applied, the diagonal model works, and the brightness of the original image does not affect the resulting chromaticities. In what follows we discuss the commutativity of gamma and color correction. Given an image I, represented as an n-by-3 matrix of RGBs, we define two operators on this image. Γ(I) denotes the application of gamma:

(12)  Γ(I) = I^γ

where γ is considered constant, and C(I, M) denotes the color correction operator:

(13)  C(I, M) = I M

We wish to find out if the two operators commute, i.e. if:

(14)  C(Γ(I), M_Γ) = Γ(C(I, M))

where M_Γ is the diagonal correction matrix appropriate for the gamma-corrected image. The diagonal transformation matrix M depends on the image I and the illuminant under which it was taken. This transformation maps pixels belonging to a white surface in the image into achromatic pixels (N, N, N).
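Before working through the derivation, the commutativity question of equation (14) can be checked numerically. The sketch assumes a hypothetical illuminant white and uses the diag(1/W^γ) form for the gamma-image correction matrix, which is the form the derivation below arrives at.

```python
# Verify C(Γ(I), M_Γ) == Γ(C(I, M)) for the diagonal correction that
# maps the illuminant white (W_R, W_G, W_B) to (1, 1, 1).

g = 1 / 2.2
W = (0.9, 0.7, 0.4)                      # illustrative illuminant white
M = tuple(1 / w for w in W)              # correction for the linear image
M_gamma = tuple(1 / w ** g for w in W)   # correction for the gamma image

pixels = [(0.8, 0.2, 0.1), (0.3, 0.6, 0.2), (0.9, 0.7, 0.4)]
for p in pixels:
    lhs = [(p[c] ** g) * M_gamma[c] for c in range(3)]  # correct after gamma
    rhs = [(p[c] * M[c]) ** g for c in range(3)]        # gamma after correction
    for a, b in zip(lhs, rhs):
        assert abs(a - b) < 1e-12

# The illuminant white itself maps to (1, 1, 1) either way.
assert all(abs((w * m) ** g - 1.0) < 1e-12 for w, m in zip(W, M))
```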
The problem is that applying gamma affects the image chromaticities, so a color constancy algorithm will receive a different set of input chromaticities depending on whether or not the image has had gamma applied. Moreover, the diagonal color correction transformation needs to be different. If (W_R, W_G, W_B) is the color of the illuminant (i.e. the camera's response to an ideal white surface under that illuminant) for image I, and (R, G, B) is an arbitrary pixel in I, then

(15)  C(Γ([R G B]), M_Γ) = [R^γ G^γ B^γ] M_Γ = [R^γ m'_R, G^γ m'_G, B^γ m'_B]

where M_Γ is the transformation to be used on the image with gamma applied:

(16)  M_Γ = diag(m'_R, m'_G, m'_B)

If we know the color of the illuminant, the diagonal elements of M_Γ can be computed from the following equation:

(17)  C(Γ([W_R W_G W_B]), M_Γ) = [W_R^γ m'_R, W_G^γ m'_G, W_B^γ m'_B] = [1 1 1]

Thus the transformation matrix becomes:

(18)  M_Γ = diag(1/W_R^γ, 1/W_G^γ, 1/W_B^γ)

We can rewrite equation 15 as a function of (R, G, B) and (W_R, W_G, W_B):

(19)  C(Γ([R G B]), M_Γ) = [R^γ/W_R^γ, G^γ/W_G^γ, B^γ/W_B^γ]

The right-hand side of equation 14 can be written as:

(20)  Γ(C(I, M)) = Γ([R m_R, G m_G, B m_B]) = [R^γ m_R^γ, G^γ m_G^γ, B^γ m_B^γ]

where m_x are the diagonal elements of matrix M. Since M maps a white surface into white, we can write M as:

(21)  M = diag(1/W_R, 1/W_G, 1/W_B)

Thus equation 20 can be rewritten as:

(22)  Γ(C([R G B], M)) = [R^γ/W_R^γ, G^γ/W_G^γ, B^γ/W_B^γ]

From equations 19 and 22, it follows that equation 14 is true for any pixel in I, i.e. that color correction and the application of gamma are commutative. Thus we can perform color correction on gamma-affected images in the same way as on linear images.

In the equations above, we assumed that there is a perfect white surface in the image I or, equivalently, that the color of the illuminant is known. However, because gamma affects the chromaticities of the pixels in the image, it will also affect their statistical distribution. This is because gamma has a general tendency to desaturate colors. This change in the distribution of chromaticities can adversely affect color constancy algorithms that rely on a priori knowledge about the statistics of the world.

Color Correcting Images from Unknown Sensors

There are two aspects related to unknown sensors: the color balance of the camera and the sensor sensitivity curves. In most cases, the color balance is determined by scaling the three color channels according to some predetermined settings. The goal of color balancing is to obtain equal RGB values for a white patch under a canonical light. In this case, we say that the camera is calibrated for that particular illuminant. Color correcting images taken with an unknown balance does not pose a problem, since the calibrating coefficients can be absorbed into the diagonal transformation that performs the color correction. However, finding the diagonal transformation might prove difficult for stochastic algorithms [3, 4, 5], which can have difficulty generalizing their estimates if the estimates fall outside the illumination gamut for which they were trained.
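The claim that an unknown camera balance is absorbed by the correction can be illustrated directly: a per-channel balance is itself a diagonal transform, and the composition of two diagonal transforms is again diagonal, so a single diagonal correction can account for both. All coefficients below are made up.

```python
# An unknown per-channel balance B followed by a diagonal correction M
# composes into one diagonal transform with coefficients b*m per
# channel, so the balance coefficients are absorbed into the correction.

def apply_diag(pixel, D):
    """Apply a diagonal transform, given as its three diagonal entries."""
    return tuple(p * d for p, d in zip(pixel, D))

B = (1.3, 1.0, 0.7)    # hypothetical unknown camera balance
M = (0.9, 1.1, 1.8)    # hypothetical diagonal color correction
combined = tuple(b * m for b, m in zip(B, M))

pixel = (0.6, 0.4, 0.2)
two_step = apply_diag(apply_diag(pixel, B), M)
one_step = apply_diag(pixel, combined)
for a, b in zip(two_step, one_step):
    assert abs(a - b) < 1e-12
```

The practical difficulty is therefore not the algebra but the estimation: an unusual balance can push the observed chromaticities outside the gamut a trained algorithm expects.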
In the most general case, where the sensors of the camera that took an image are unknown, it is difficult to estimate the scene illumination, due to the variation in sensor responses to even the same surfaces under identical lighting (see Figure 1). In this case, using a color constancy algorithm that has been trained in a self-supervised manner on such uncalibrated images can provide a simple and effective solution. This type of algorithm, such as the one described in [13], uses a neural network that is trained to estimate the chromaticity of the incident scene illumination without exact knowledge of the illumination chromaticity in the training set. The network learns to make a better estimate than the simple grayworld algorithm used to initially train it.
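As a reference point, the grayworld estimate that serves as the starting point for the bootstrapped network in [13] can be sketched in a few lines; the "image" here is a made-up list of RGB triples, and the normalization (keeping the strongest channel fixed) is one arbitrary choice among several.

```python
# Grayworld illuminant estimate: assume the average scene reflectance
# is achromatic, so the mean RGB of the image estimates the illuminant
# color, which then supplies the diagonal correction coefficients.

def grayworld_illuminant(pixels):
    """Mean RGB of the image, taken as the illuminant estimate."""
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

# Made-up image with a reddish cast.
pixels = [(0.9, 0.5, 0.3), (0.7, 0.4, 0.2), (0.8, 0.6, 0.4)]

W = grayworld_illuminant(pixels)
M = tuple(max(W) / w for w in W)   # normalize: strongest channel kept as-is

corrected = [tuple(p[c] * M[c] for c in range(3)) for p in pixels]

# After correction the three channel means are equal: the cast is gone.
means = grayworld_illuminant(corrected)
assert max(means) - min(means) < 1e-9
```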

Conclusion

We have presented a framework for dealing with a quite general case of color correction, namely that of images taken with a digital camera for which both the spectral sensitivity of its sensors and its gamma setting are unknown. One conclusion is that, for images to which gamma has been applied, it is possible to perform color correction by a diagonal transformation without first linearizing the image data. The off-diagonal elements of the general image transformation are larger when gamma has been applied, and thus the average error of a diagonal transformation (which ignores the off-diagonal terms) will increase. However, the perceptual error is still very small, and the diagonal transformation thus remains a good model of illumination change.

In the case of unknown sensors, there are large differences in sensor response, even for cameras calibrated for the same illuminant. This variation in the distribution of sensor responses can adversely affect color constancy algorithms that rely on assumed distributions of sensor responses. Future work will focus on refining a self-supervised neural network approach to estimating the illumination in images of unknown origin.

Acknowledgements

The authors would like to acknowledge the support of the Natural Sciences and Engineering Research Council of Canada and Hewlett-Packard Incorporated.

References

[1] E. H. Land, "The Retinex Theory of Color Vision," Scientific American, pp. 108-129, 1977.
[2] G. Buchsbaum, "A Spatial Processor Model for Object Colour Perception," J. Franklin Institute, 310(1), pp. 1-26, 1980.
[3] D. A. Forsyth, "A Novel Algorithm for Color Constancy," International Journal of Computer Vision, 5(1), pp. 5-36, 1990.
[4] G. Finlayson, "Color in Perspective," IEEE Trans. PAMI, 18(10), pp. 1034-1038, 1996.
[5] B. Funt, V. Cardei and K. Barnard, "Learning Color Constancy," Proc. IS&T/SID Fourth Color Imaging Conf., pp. 58-60, Scottsdale, Arizona, November 1996.
[6] M. Swain and D. Ballard, "Color Indexing," Int. J. of Computer Vision, 7(1), pp. 11-32, 1991.
[7] B. Funt, K. Barnard and L. Martin, "Is colour constancy good enough?" Proc. 5th European Conf. on Computer Vision, pp. 445-459, 1998.
[8] D. H. Brainard, W. A. Brunt and J. M. Speigle, "Color constancy in the nearly natural image. 1. Asymmetric matches," J. Opt. Soc. Am. A, 14(9), 1997.
[9] G. Finlayson, M. Drew and B. Funt, "Color Constancy: Generalized Diagonal Transforms Suffice," J. Opt. Soc. Am. A, 11(11), pp. 3011-3020, 1994.
[10] B. Funt, V. C. Cardei and K. Barnard, "Neural Network Color Constancy and Specularly Reflecting Surfaces," AIC Color 97, Kyoto, Japan, pp. 523-526, 1997.
[11] C. Poynton, "The Rehabilitation of Gamma," in B. E. Rogowitz and T. N. Pappas (eds.), Proc. of SPIE 3299, pp. 232-249, Bellingham, WA: SPIE, 1998.
[12] J. A. Worthey and M. H. Brill, "Heuristic Analysis of von Kries Color Constancy," J. Opt. Soc. Am. A, 3(10), pp. 1708-1712, 1986.
[13] B. Funt and V. C. Cardei, "Bootstrapping Color Constancy," SPIE Electronic Imaging '99, San Jose, January 1999 (in press).