A new edge-adaptive demosaicing algorithm for color filter arrays

Similar documents
AN EFFECTIVE APPROACH FOR IMAGE RECONSTRUCTION AND REFINING USING DEMOSAICING

DIGITAL color images from single-chip digital still cameras

Image Demosaicing. Chapter Introduction. Ruiwen Zhen and Robert L. Stevenson

A Novel Method for Enhancing Satellite & Land Survey Images Using Color Filter Array Interpolation Technique (CFA)

Color Filter Array Interpolation Using Adaptive Filter

Demosaicing Algorithms

Artifacts Reduced Interpolation Method for Single-Sensor Imaging System

An Improved Color Image Demosaicking Algorithm

Edge Potency Filter Based Color Filter Array Interruption

Demosaicing Algorithm for Color Filter Arrays Based on SVMs

ABSTRACT I. INTRODUCTION. Kr. Nain Yadav M.Tech Scholar, Department of Computer Science, NVPEMI, Kanpur, Uttar Pradesh, India

Two-Pass Color Interpolation for Color Filter Array

Analysis on Color Filter Array Image Compression Methods

Comparative Study of Demosaicing Algorithms for Bayer and Pseudo-Random Bayer Color Filter Arrays

DEMOSAICING, also called color filter array (CFA)

Research Article Discrete Wavelet Transform on Color Picture Interpolation of Digital Still Camera

Simultaneous Capturing of RGB and Additional Band Images Using Hybrid Color Filter Array

Color Demosaicing Using Variance of Color Differences

COLOR demosaicking of charge-coupled device (CCD)

IN A TYPICAL digital camera, the optical image formed

Method of color interpolation in a single sensor color camera using green channel separation

NOVEL COLOR FILTER ARRAY DEMOSAICING IN FREQUENCY DOMAIN WITH SPATIAL REFINEMENT

Lecture Notes 11 Introduction to Color Imaging

THE commercial proliferation of single-sensor digital cameras

Design of Practical Color Filter Array Interpolation Algorithms for Cameras, Part 2

New Efficient Methods of Image Compression in Digital Cameras with Color Filter Array

A robust, cost-effective post-processor for enhancing demosaicked camera images

MOST digital cameras capture a color image with a single

Practical Implementation of LMMSE Demosaicing Using Luminance and Chrominance Spaces.

Design of practical color filter array interpolation algorithms for digital cameras

Denoising and Demosaicking of Color Images

An Effective Directional Demosaicing Algorithm Based On Multiscale Gradients

A new CFA interpolation framework

Interpolation of CFA Color Images with Hybrid Image Denoising

Demosaicing using Optimal Recovery

RGB RESOLUTION CONSIDERATIONS IN A NEW CMOS SENSOR FOR CINE MOTION IMAGING

COLOR DEMOSAICING USING MULTI-FRAME SUPER-RESOLUTION

Color filter arrays revisited - Evaluation of Bayer pattern interpolation for industrial applications

Improvements of Demosaicking and Compression for Single Sensor Digital Cameras

Enhanced DCT Interpolation for better 2D Image Up-sampling

IMPROVEMENTS ON SOURCE CAMERA-MODEL IDENTIFICATION BASED ON CFA INTERPOLATION

Narrow-Band Interference Rejection in DS/CDMA Systems Using Adaptive (QRD-LSL)-Based Nonlinear ACM Interpolators

Detail preserving impulsive noise removal

No-Reference Perceived Image Quality Algorithm for Demosaiced Images

Image and Vision Computing

A New Image Sharpening Approach for Single-Sensor Digital Cameras

Image Processing for feature extraction

TRUESENSE SPARSE COLOR FILTER PATTERN OVERVIEW SEPTEMBER 30, 2013 APPLICATION NOTE REVISION 1.0

IEEE Signal Processing Letters: SPL Distance-Reciprocal Distortion Measure for Binary Document Images

1982 IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, VOL. 24, NO. 11, NOVEMBER 2014

Dr. J. J.Magdum College. ABSTRACT- Keywords- 1. INTRODUCTION-

PCA Based CFA Denoising and Demosaicking For Digital Image

An evaluation of debayering algorithms on GPU for real-time panoramic video recording

Demosaicking methods for Bayer color arrays

Effective Pixel Interpolation for Image Super Resolution

Normalized Color-Ratio Modeling for CFA Interpolation

New Edge-Directed Interpolation

Image Demosaicing: A Systematic Survey

Recent Patents on Color Demosaicing

Simultaneous geometry and color texture acquisition using a single-chip color camera

Comparative Study of Different Wavelet Based Interpolation Techniques

Acquisition Basics. How can we measure material properties? Goal of this Section. Special Purpose Tools. General Purpose Tools

Region Adaptive Unsharp Masking Based Lanczos-3 Interpolation for video Intra Frame Up-sampling

COMPRESSION OF SENSOR DATA IN DIGITAL CAMERAS BY PREDICTION OF PRIMARY COLORS

Removal of High Density Salt and Pepper Noise through Modified Decision based Un Symmetric Trimmed Median Filter

Subband coring for image noise reduction. Edward H. Adelson Internal Report, RCA David Sarnoff Research Center, Nov

Contrast adaptive binarization of low quality document images

Multi-sensor Super-Resolution

Moving Object Detection for Intelligent Visual Surveillance

Smart Interpolation by Anisotropic Diffusion

Improved sensitivity high-definition interline CCD using the KODAK TRUESENSE Color Filter Pattern

Adaptive demosaicking

International Journal of Advancedd Research in Biology, Ecology, Science and Technology (IJARBEST)

IDENTIFYING DIGITAL CAMERAS USING CFA INTERPOLATION

Introduction. Prof. Lina Karam School of Electrical, Computer, & Energy Engineering Arizona State University

Introduction to DSP ECE-S352 Fall Quarter 2000 Matlab Project 1

Image Distortion Maps 1

Image and Video Processing

TO reduce cost, most digital cameras use a single image

Color Demosaicing Using Asymmetric Directional Interpolation and Hue Vector Smoothing

Image Interpolation Based On Multi Scale Gradients

A new quad-tree segmented image compression scheme using histogram analysis and pattern matching

A complexity-efficient and one-pass image compression algorithm for wireless capsule endoscopy

Fig Color spectrum seen by passing white light through a prism.

Joint Demosaicing and Super-Resolution Imaging from a Set of Unregistered Aliased Images

COLOR FILTER PATTERNS

Optimal Color Filter Array Design: Quantitative Conditions and an Efficient Search Procedure

Performance Evaluation of Edge Detection Techniques for Square Pixel and Hexagon Pixel images

Module 6 STILL IMAGE COMPRESSION STANDARDS

Image acquisition. In both cases, the digital sensing element is one of the following: Line array Area array. Single sensor

Vision Review: Image Processing. Course web page:

Lossless Image Watermarking for HDR Images Using Tone Mapping

Spatially Adaptive Color Filter Array Interpolation for Noiseless and Noisy Data

IEEE TRANSACTIONS ON IMAGE PROCESSING, VOL. 19, NO. 9, SEPTEMBER /$ IEEE

A new directional image interpolation based on Laplacian operator

A Robust Nonlinear Filtering Approach to Inverse Halftoning

Neural Network with Median Filter for Image Noise Reduction

Image Processing: An Overview

Spatially Adaptive Algorithm for Impulse Noise Removal from Color Images

AUTOMATIC DETECTION AND CORRECTION OF PURPLE FRINGING USING THE GRADIENT INFORMATION AND DESATURATION

Transcription:

Image and Vision Computing 5 (007) 495 508 www.elsevier.com/locate/imavis A new edge-adaptive demosaicing algorithm for color filter arrays Chi-Yi Tsai, Kai-Tai Song * Department of Electrical and Control Engineering, National Chiao Tung University, 00 Ta Hsueh Road, Hsinchu 300, Taiwan Received 6 December 004 received in revised form 4 September 006 accepted 9 December 006 Abstract A novel edge-adaptive demosaicing algorithm (EADA) is proposed in this paper to effectively reduce color artifacts in demosaiced images from a color filter array (CFA). The proposed algorithm aims to reduce the aliasing error in red and blue channels by exploiting high-frequency information of the green channel. To achieve this, color-difference based edge-adaptive filtering and post-processing schemes are designed to reproduce the color values by exploiting the green channel information. For green channel interpolation, any of the existing image interpolation methods can be used and combined with the proposed algorithm. Moreover, a new adaptive interpolation method is presented for reconstructing the green channel from CFA samples. We have compared this technique with two recently proposed demosaicing techniques: Gunturk s and Lu s methods. The experimental results show that EADA outperforms both of them in both PSNR values and CIELAB DE ab measures. Ó 007 Elsevier B.V. All rights reserved. Keywords: Image representation Color reproduction CFA demosaicing Adaptive filtering Color artifacts. Introduction * Corresponding author. Tel.: +886 3 573865 fax: +886 3 575998. E-mail addresses: chiyi.ece9g@nctu.edu.tw (C.-Y. Tsai), ktsong@ mail.nctu.edu.tw (K.-T. Song). Digital color images from single-chip digital still cameras are obtained by interpolating the output from a color filter array (CFA). The CFA consists of a set of spectrally selective filters that are arranged in an interleaved pattern so that each sensor pixel samples one of three primary color components. These sparsely sampled color values are termed mosaiced images. To render a full-color image from a mosaiced image, an image reconstruction process, commonly known as CFA interpolation or CFA demosaicing, is required to estimate for each pixel its two missing color values. The simplest CFA demosaicing methodologies apply well-known interpolation techniques to each color channel such as nearest-neighbor replication, bilinear interpolation, and cubic spline interpolation. However, these single-channel algorithms usually introduce severe color artifacts and blurring around sharp edges []. These drawbacks indicate the need for more specialized algorithms for advanced demosaicing performance. We discuss these demosaicing methods in three groups. The first class of demosaicing methods is based on high cross-correlation between color channels [ 3]. In [], smooth hue transition algorithms are presented based on inter-channel correlation, which assumes that hue does not change abruptly between neighboring pixel locations. This class of algorithms exploits color ratios between red and green, blue and green to interpolate the missing color values. In addition to interpolating the color ratios, they also use inter-channel color differences (red minus green and blue minus green) [3]. Although these algorithms normally give better performance than single-channel algorithms, they cannot produce satisfactory demosaicing results around sharp edges, where the assumption of inter-channel correlation does not hold. To reduce the undesirable demosaicing artifacts on sharp edges, the second class of demosaicing algorithms employs an edge-directed interpolation approach to perform the interpolation along image edges and prevent the interpolation across edges [4 7]. These methods first analyze the 06-8856/$ - see front matter Ó 007 Elsevier B.V. All rights reserved. doi:0.06/j.imavis.006..08

496 C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 spatial correlation of a local image neighborhood and then select a suitable interpolation direction together with its neighboring pixels to estimate the missing color values. The third class of demosaicing methods uses both the above demosaicing approaches [8 5]. Kimmel proposes a hybrid demosaicing method [9], which estimates each missing color value by combining the edge-directed interpolation with smooth hue transition algorithm in an iterative process. Li and Orchard [0] proposed an edge-directed interpolation scheme to interpolate inter-channel color differences by exploiting the geometric duality between low-resolution covariance of CFA samples and high-resolution covariance of demosaiced images. In [], a nonlinear demosaicing scheme based on optimal recovery interpolation of gray scale images was proposed. The optimal recovery interpolation scheme was presented to interpolate the green plane and the inter-channel color differences. Another effective method that exploits inter-channel correlation was proposed by Gunturk et al. [3]. They utilized an edge-directed interpolation scheme to estimate the missing color values in the green channel and an alternating projection scheme to estimate the missing color values in red and blue channels based on high inter-channel correlation. In a recent effort, Lu and Tan [4] presented an improved hybrid CFA demosaicing method that consists of two successive steps: an interpolation step to render full-color images and a postprocessing step to suppress visible demosaicing artifacts. The second and third class algorithms generally produce high quality visual results, especially in reconstructing sharp edges of the demosaiced image. However, in regions of fine details, where edges tend to be short and in different directions, these algorithms introduce undesirable errors and generate color artifacts. Color artifacts are caused primarily by aliasing error in high-frequency regions such as edges or fine textures. In demosaicing, color artifacts existing around edges and fine textures of the demosaiced image are a factor limiting performance. Existing algorithms are unable to resolve color artifacts in these regions effectively to obtain demosaiced results with high visual quality. To effectively reduce color artifacts in demosaiced images, we here propose a novel edge-adaptive CFA demosaicing algorithm. The proposed demosaicing algorithm consists of color-difference based edge-adaptive low-pass filtering and post-processing schemes to reproduce color values by exploiting the green channel information for making high-frequency components of red and blue channels similar to the green channel. The red and blue channels are first reconstructed using bilinear interpolation and then edge-adaptive filtered in color-difference space. Subsequently, the visible color artifacts in high-frequency regions of the full-color demosaiced image can be reduced effectively in the post-processing step. Another advantage of the proposed algorithm is that any existing image interpolation methods can be combined with the proposed algorithm to reconstruct the green channel. Moreover, we also present a new adaptive interpolation method for reconstructing the green channel from CFA samples. To evaluate the performance of the proposed demosaicing method, we adopted the PSNR and CIELAB DE ab [6] to measure the fidelity of demosaiced images. Experimental results reveal that the proposed method performs satisfactorily on well-defined edges and effectively reduces visible color artifacts in fine details of the demosaiced images. We have compared this algorithm with two well-known demosaicing techniques [3,4]. The experimental results show that our method outperforms both of them, both in PSNR values and CIELAB DE ab measures, and also gives superior demosaiced fidelity in visual comparison. The rest of this paper is organized as follows: In Section, we will describe the motivation to study the color-difference based demosaicing algorithm. Section 3 presents the proposed color-difference based demosaicing algorithm. We then present a new adaptive interpolation method for reconstructing a green channel from CFA samples in Section 4. Section 5 presents the experimental results and compares the demosaicing results of the proposed method with other existent methods. Section 6 summarizes the contributions of this work.. Color-difference approach to demosaicing Fig. illustrates the most used CFA pattern, the Bayer pattern [7], where R, G and B denote, respectively, pixels having only red, green and blue color values. We limit our discussion in this paper to the Bayer pattern because it is so popular. In a Bayer pattern, green samples are obtained on a quincunx, while red and blue samples are obtained in rectangular lattices. The density of red and blue samples is one-half that of the green ones, and the aliasing error of high-frequency components in green channel is likely to be less than that in red and blue channels. Thus, a common problem in demosaicing is that the visible color artifacts in high-frequency regions are caused primarily by aliasing in the red and blue channels. Fortunately, there is usually high inter-channel correlation in high-frequency regions among red, green, and blue channels for natural color images [3]. This implies that the red, green, and blue Fig.. Bayer color filter array pattern.

C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 497 channels are quite similar at fine texture and edge locations with all three colors. Therefore, a valid assumption can be made that object boundaries are the same in all three color channels. In other words, the high-frequency regions are similar in all three channels and close to the high-frequency regions of the green channel. In order to validate this assumption, we utilize twenty-four natural images from the Kodak PhotoCD (see Fig. ), which have been used as test images for several demosaicing studies [3 5]. Fig. 3 shows the flowchart for demonstrating the assumption of color-difference model mentioned above. The key concept is to replace the high-frequency components of red and blue planes by using those of green plane, and compare then the mean squared error (MSE) between the original and reconstructed color planes. A low-pass filter is utilized for red and blue planes and a high-pass filter for the green plane. We utilize -D ideal low-pass and highpass filters in this procedure. Their transfer functions are given by [8]: H lowpass ðu vþ ¼ if Dðu vþ 6 D 0 and 0 if Dðu vþ > D 0 H highpass ðu vþ ¼ 0 if Dðu vþ 6 D 0 if Dðu vþ > D 0 where D 0 is a specified nonnegative quantity and D(u,v) is the distance from point (u,v) to the origin of the frequency plane. We set D 0 equal to 8 in this test. After filtering in each color plane, the new red and blue planes, R and B, are reconstructed by adding the high-frequency components of the green plane to their low-frequency components respectively. Table records the MSE comparison results of each step. The first and second columns show the MSE between original and low-pass filtered red (blue) planes R low (B low ). The third and fourth columns show the MSE between original and reconstructed red (blue) planes RðBÞ. From the Fig. 3. Flowchart for demonstrating the assumption of color-difference model. test results, it is clear that the MSE is reduced effectively by adding the high-frequency regions of the green plane G high to the low-pass filtered red (blue) planes. This implies that the high-frequency regions of red and blue planes are similar and close to the high-frequency regions of the green plane. Thus, the assumption is validated. Based on this assumption, our motivation in this study is to reduce the color artifacts in high-frequency regions by adding the high-frequency information of green channel to other color channels. As described below, this can be achieved by utilizing the color-difference model. Let ½ R d G d B d Š denote three color planes of a demosaiced image. The Fourier spectrum of each color plane can be described as follows: F ½R d Š¼F½R d Š l þ F ½R d Š h F ½G d Š¼F½G d Š l þ F ½G d Š h F ½B d Š¼F½B d Š l þ F ½B d Š h ðþ Fig.. Test images used in the experiment.

498 C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 Table Comparison of mean squared error at each step in Fig. 3 Image no. MSE(R,R low ) MSE(B,B low ) MSEðR RÞ MSEðB BÞ 60.865 63.530 4.306.9049 73.0098 66.0994 9.8449.6476 3 59.4 55.4568 3.630 4.0863 4 75.0895 7.003 8.0543.637 5 307.866 93.3846 6.334 9.040 6 78.8668 74.340.57 4.843 7 76.3988 76.040.884 3.366 8 539.6 56.6953 8.70 7.836 9 8.787 77.9.6365 3.997 0 86.740 89.376.855 4.737 49.5650 39.446 4.58.00 70.4837 74.3636 3.48.9343 3 45.0555 45.0555 3.3856 0.374 4 68.8 4.964 9.458 8.06 5.4 03.0.8 3.653 6 8.9777 80.868.368.8066 7 8.5764 84.854.567 3.9789 8 87.993 8.844 6.65 0.065 9 79.4598 63.5454.788 3.845 0 9.7853 09.3889.47 6.9 55.7798 60.695.635 4.974 4.385 9.3350 7.7838 7.3834 3 57.6438 55.0500 4.97 4.7799 4 3.675 30.3846 0.3759 9.568 where F[Æ] denotes the -D discrete Fourier transform and the subscripts l and h stand for low-frequency and high-frequency components, respectively. The color-difference models of the demosaiced image are defined such that R g ¼ R d G d B g ¼ B d G d : ðþ Let L{Æ} denote a linear low-pass filtering process, and ~R g and ~B g denote the low-frequency regions of the color differences corresponding to R g and B g. Suppose that the highfrequency components of the color differences R g and B g can be removed by the low-pass filtering process, the Fourier spectrum of ~R g and ~B g can be described such that F ½~R g Š¼LfF ½R g Šg ¼ F ½R d Š l F ½G d Š l F ½~B g Š¼LfF ½B g Šg ¼ F ½B d Š l F ½G d Š l : Subsequently, the new red and blue planes of the demosaiced image, R d and B d, can be obtained by adding ~R g and ~B g with G d respectively. Their Fourier spectra are given by F ½R d Š¼F½~R g þ G d Š¼F½R d Š l þ F ½G d Š h F ½B d Š¼F½~B g þ G d Š¼F½B d Š l þ F ½G d Š h : It is clear from (4) that the high-frequency components of the new red and blue planes of the demosaiced image are replaced by the high-frequency components of the green plane. Because the aliasing error in the green plane is usually much smaller than those in red and blue planes, based on the assumption described above, the aliased errors in red and blue channels can be efficiently reduced by linear low-pass filtering in the color-difference spaces and adding the results with green channel to obtain the new ones. This ð3þ ð4þ observation leads to the development of an efficient demosaicing algorithm based on color-difference that can reduce the color artifacts in high-frequency regions such as edges or fine textures. The color-ratio model has been another useful model for the development demosaicing algorithms [,9]. The main difference between the color-difference model and the colorratio model is that the latter assumes the ratios between the red and green values are constant within a given object, as are the ratios between the blue and green values. However, the color-ratio model usually fails to work around edge regions and results in some color artifacts because the constant-ratio assumption is not valid in these regions. But, again, the assumption used by color-difference model is that the high-frequency regions are similar in all three channels and close to the high-frequency regions of the green channel. If the high-frequency components of the green channel (such as edges and fine textures) can be recovered within small aliasing errors, then the results can be used to effectively reduce the aliasing errors in red and blue channels. In the following section, we describe the proposed edge-adaptive demosaicing algorithm (EADA) based on the color-difference model. 3. Proposed edge-adaptive demosaicing algorithm Fig. 4 illustrates a simplified CFA demosaicing procedure for digital cameras. Given the CFA samples, an interpolation step is first performed to obtain the full color demosaiced image. Due to restrictions on computing power, the demosaiced images obtained from the interpolation step usually lack sharpness and contain false colors or color artifacts. Hence, a post-processing step is required to provide more visually pleasing demosaiced results. In the proposed EADA, the interpolation and post-processing steps are both developed based on the color-difference model. To begin with derivation of the proposed demosaicing algorithm, we first assume that the green channel has been fully recovered by using an existing image interpolation method. The initial estimation of red and blue channels can be obtained, for instance, by utilizing some well-known method such as bilinear interpolation. Note that at this stage we do not process the original red (blue) values, but rather keep them the same as the original CFA-sampled color values. When an initial demosaiced image is obtained, we then utilize an adaptive low-pass filter to filter the color-difference value at the missing pixel locations in red (blue) channels. 3.. Edge-adaptive low-pass filtering Let ½ R d i G d i B d i Š denote three color planes of the initial demosaiced image. The color-difference planes are given by R g ¼ R d i G d i and B g ¼ B d i G d i ð5þ

C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 499 Fig. 4. Simplified CFA demosaicing procedure in digital cameras. where R d i and B d i are, respectively, the initial estimated red and blue channels obtained from bilinear interpolation and G d i is the initial estimated green channel obtained by any image interpolation method. The filtering procedure involves two sub-steps: first, edge-adaptive low-pass filtering of the red (blue) values over the original blue (red) pixels, as shown in Fig. 5(a) second, edge-adaptive low-pass filtering of the red (blue) values over the original green pixels, as shown in Fig. 5(b). Because the same procedure is used for both R g and B g color-difference planes, only the procedure of R g will be described in the following presentation. Fig. 5. (a) The red value on blue pixel and (b) the red value on green pixel of a central pixel to be estimated. Referring to Figs. 5(a) and (b), the color-difference value R g of red pixel at the center position is to be filtered adaptively by R g ¼ e a^r g þ e a^r g þ e a3^r g3 þ e a4^r g4 ð6þ e a þ e a þ e a3 þ e a4 where ^R g ^R g4 are the color-difference adjusted values and e a e a4 are the edge indicators corresponding to each color-difference adjusted value. These edge indicators are defined as a decreasing function of the directional derivative of the center point and its neighboring points. In the edge-adaptive filtering stage, we propose to introduce the second-order directional derivatives of neighboring colordifference values for detecting edges more accurately. In the case of Fig. 5(a), the edge indicators are given by e a ¼ þ R g R p g3 þ R g5 R g þr p g3 e a ¼ þ R g4 R p g þ R g4 R g þr p g6 ffiffi ffiffi e a3 ¼ þ R g R p ffiffi g3 þ R g R g3 ffiffi þr p g7 e a4 ¼ þ R g4 R ffiffi g p þ R g8 R g4 ffiffi þr g : p ð7:aþ ffiffi In the case of Fig. 5(b), they are given by e a ¼ þ R g3 R g þ R g3 R g þr g5 e a ¼ þ R g R g4 þ R g6 R g þr g4 e a3 ¼ þ R g3 R g þ R g7 R g3 þr g e a4 ¼ þ R g R g4 þ R g R g4 þr g8 : ð7:bþ The color-difference adjusted values ^R g ^R g4 are derived based on the assumption that the difference of neighboring color-difference values along an interpolation direction is constant. For example, to find the color-difference adjusted value at R g location, this assumption gives the following relationships for the neighboring color-difference values along the right-up direction R g R g3 ¼ðR g ^R g Þþð^R g R g3 Þ and ð8þ R g ^R g ¼ ^R g R g3 ð9þ where ^R g denotes the missing color-difference value at B g location. Combining (8) and (9), we have R g R g3 ¼ ðr g ^R g Þ: This implies that ^R g ¼ R g þðr g3 R g Þ=. This value is denoted by ^R g, which is used to estimate R g in the rightup interpolation direction. In a similar manner, the ffiffi

500 C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 color-difference adjusted values along other interpolation directions are given by ^R g ¼ R g þ R g4 R g ^R g4 ¼ R g4 þ R g R g4 : ^R g3 ¼ R g3 þ R g R g3 Finally, the full-red plane is obtained by recovering the spatial plane from color-difference plane such that R d i ¼ R g þ G d i : ð0þ As the same procedure is utilized for recovering the blue plane, a full-color demosaiced image can be obtained. 3.. Post-processing The post-processing step aims to suppress visible color artifacts residing in the demosaiced image obtained from the edge-adaptive low-pass filtering step. To achieve this, a similar procedure is applied iteratively to smooth the colordifference values and make them become locally constant within an object. Let ½ R d i G d i B d i Š denote three color planes of the full-color demosaiced image obtained from the interpolation stage explained above. We first correct the green values to fit the color-difference values as smoothly as possible by using edge-adaptive filtering. The color-difference plane between green and red (blue) planes are given by G r ¼ G d i R d i and G b ¼ G d i B d i : ðþ Let us first illustrate how the color-difference value G r in Fig. 6 is filtered by using the edge-adaptive filtering method as an example. Referring to Fig. 6, the color-difference value G r of the central pixel is to be filtered. Similar to the edge-adaptive interpolation, the color-difference value G r is estimated by its neighboring color-difference values such that Fig. 6. Adaptive filtering of the red-green color difference plane. G r ¼ X8 j¼ w j G rj w j ¼ e bj P 8 k¼ e ðþ bk where the edge indicators e b e b8 associated with the neighboring color-difference values G r G r8 are also defined as a decreasing function of the directional derivative of the center point and its neighboring points. However, at this stage, only the first-order directional derivatives of neighboring color-difference values are introduced. Experimentally, this can work better than the case of introducing the second-order directional derivatives. Thus, the edge indicators corresponding to each neighboring color difference value at this stage are given by e b ¼ þ G r5 G r þ G r G r9 e b3 ¼ þ G r3 G r7 þ G r G r e b5 ¼ þ G r5 G r þ G r3 G r e b7 ¼ þ G r3 G r7 þ G r G r5 e b ¼ þ G r G p r6 þ G r0 G pffiffi r ffiffi e b4 ¼ þ G r8 G p r4 þ Gr G pffiffi r ffiffi e b6 ¼ þ G r G p r6 þ Gr G pffiffi r4 ffiffi e b8 ¼ þ G r8 G p ffiffi r4 þ G r6 G pffiffi r : ð3þ Subsequently, the same procedure is applied to adaptively filter the color-difference plane between green and blue planes G b. Then the post-processed green plane can be obtained such that G d p ¼ ðg r þ R d i ÞþðG b þ B d i Þ : ð4þ Since the same procedure is used to filter the other colordifference planes, R gp ¼ R d i G d p and B gp ¼ B d i G d p, the post-processed red and blue planes can be obtained respectively as R d p ¼ R gp þ G d p and Bd p ¼ B gp þ G d p. This procedure is set to repeat three times, and the post-processed demosaiced images ½ R d p G d p B d pš will be completed after this stage. Fig. 7 summarizes the proposed edge-adaptive demosaicing algorithm. For green plane interpolation, any of the existing image interpolation methods can be adopted, for instance edge-directed interpolation [0], adaptively quadratic image interpolation [9], etc. The initial full-color demosaiced image is first reconstructed in the interpolation step, then the color artifacts in the initial demosaiced result are reduced in the post-processing step. Note that the main difference between the proposed algorithm and Kimmel s method [9] is that Kimmel s method is based on the color-ratio model, whereas the proposed algorithm is based on the color-difference model. As a result, Kimmel s method usually induces color artifacts like overshot points around edge regions, but the proposed algorithm will not.

C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 50 Fig. 7. Complete steps of the proposed edge-adaptive demosaicing algorithm. 4. Green channel adaptive interpolation In this section, we present a novel adaptive interpolation method for green channel reconstruction from CFA samples. The green plane has the most spatial information of the image to be demosaiced and has great influence on the perceptual quality of the image. In order to reconstruct the demosaiced images with satisfactory quality, we propose a nonlinear procedure for choosing the direction of interpolation to reconstruct the green channel. Fig. 8 shows two cases of green samples in Bayer pattern, where the green value of central pixels are to be estimated from its four surrounding green pixels, G G 4. The central missing green value G can be estimated by the following expression: 8 w s ^G þ w s ^G þ w s3 ^G 3 þ w s4 ^G 4 if E < T >< w e ^G þw e3 ^G 3 w e þw e3 if E P T and w e þ w e3 > w e þ w e4 G ¼ w e ^G þw e4 ^G 4 w e þw e4 if E P T and w e þ w e4 > w e þ w e3 >: w e ^G þ w e ^G þ w e3 ^G3 þ w e4 ^G4 otherwise ð5þ where E = max{( G G + G 3 G 4 )/, ( G G 4 + G G 3 )/} ^G ^G 4 are the color-adjusted green values w s w s4 and w e w e4 are, respectively, the associated weights when E < T and E P T andt is a threshold value. In other words, if E < T, we regard the central pixel as being in a smooth region. Otherwise, we regard the central pixel as being in an edge region. In the smooth region, the weights associated with four surrounding color-adjusted Fig. 8. Two cases of missing green value on the central pixel.

Table PSNR (db) and DE ab measures for edge and smooth regions of demosaiced images Step: Interpolation step Post-processing step Method: Gunturk Lu s interpolation. EADA interpolation Gunturk with Lu s postprocessing Lu s post-processing EADA post-processing Region: Smooth Edge Smooth Edge Smooth Edge Smooth Edge Smooth Edge Smooth Edge 34.807 3.34 33.6657 30.769 33.8758 9.6090 35.747 3.59 35.650 3.5756 36.386 33.9.3746.8079.406.0304.3657.4.3358.74.359.6944.649.5896 36.43 3.34 37.3906 3.683 37.666 3.5044 36.0054 3.09 37.349 3.737 37.7334 3.76.49.4644.3306.3008.803.5.5067.46.336.30.608.0880 3 38.4495 3.964 39.79 3.997 39.80 3.7793 38.5750 3.373 39.9506 3.507 40.346 3.66 0.865.765 0.794.753 0.756.6497 0.8493.764 0.7489.630 0.7333.5469 4 36.390 3.90 36.93 3.007 36.9084 3.3759 36.963 3.899 37.5835 3.8044 37.557 3.889.89.9795.4.9963.003.0467.444.947.0376.8063.0647.8785 5 34.087 3.459 34.9989 30.7794 35.065 30.06 34.033 3.406 35.7578 3.0784 35.4706 3.967.9338.503.6754.576.634.663.954.495.5695.340.657.673 6 36.5753 3.7874 35.654 9.7878 36.635 30.979 36.8764 3.40 36.9993 3.59 37.857 33.34 0.90.6638 0.998.888 0.8893.789 0.9035.633 0.8484.646 0.8077.433 7 38.694 33.7006 40.308 34.764 40.486 34.0593 38.4476 33.469 40.3646 34.6075 40.4996 34.769 0.93.830 0.7834.669 0.7645.596 0.9349.880 0.7673.6307 0.7589.55 8 33.308 9.536 33.4 8.043 33.484 8.594 33.5960 9.86 34.46 9.70 34.8608 3.76.3547.05.335.654.84.0965.39.0440.988.994.366.78 9 38.4874 3.835 38.803 3.4653 39.684 33.66 38.950 3.967 39.949 33.3885 39.909 34.3989 0.7596.4954 0.738.550 0.6897.335 0.73.4655 0.6565.468 0.6665.580 0 38.568 33.49 38.87 3.8390 39.464 3.605 38.9668 33.00 39.757 33.378 39.606 33.773 0.7898.375 0.7636.390 0.749.743 0.7439.90 0.6857.0 0.708.959 36.9033 3.847 36.9056 30.6894 37.85 30.307 36.986 3.03 37.996 3.3035 38.98 3.7767.458.945.950.0457.60.0448.977.9036.0793.803.09.6977 39.86 34.3383 39.3036 33.0570 39.733 33.5898 39.37 34.450 40.38 34.477 40.4659 35.3664 0.5996.0088 0.5785.8 0.5530.094 0.593.006 0.5448.067 0.5309 0.8983 3 3.8056 8.7975 30.78 6.733 30.3566 5.865 3.870 9.340 3.368 9.384 3.7580 9.5080.099.8467.708 3.780.76 3.4638.0477.739.953.6386.977.634 4 3.990 8.7035 34.8480 30.73 34.9636 9.837 3.806 8.5379 35.33 30.789 34.685 9.9897.6537.406.465.368.478.65.647.390.3783.08.3790.0683 5 35.9837 30.594 36.57 9.785 36.707 8.9349 35.8843 30.5443 36.790 30.73 36.9546 30.0.84.397.846.3966.48.493.70.403.478.70.7.354 6 39.3335 34.5606 38.3 3.89 38.976 33.449 39.5908 34.9486 39.583 34.0333 40.6099 36.3883 0.7976.4893 0.833.7895 0.7844.586 0.7854.447 0.767.55 0.70.344 7 38.8503 3.8483 38.53 3.76 38.8687 30.536 39.346 3.9986 39.75 3.573 39.7808 3.743.3907.758.400.8633.306.859.749.6875.35.653.364.666 8 34.575 30.088 34.465 9.060 34.6087 8.03 34.7779 30.37 35.5586 30.4966 35.3887 30.3397.0733.8434.960.8745.9380 3.060.040.8043.793.6044.93.708 9 37.88 3.094 36.8853 3.4886 37.846 3.6989 37.6850 3.3 38.983 3.649 38.395 33.768.045.90.0660.9979.063.9443 0.9878.833 0.9398.779 0.935.6448 0 37.5354 3.588 38.083 3.097 38.94 30.4797 37.7964 3.6786 38.6400 3.3599 38.704 3.8354 0.8303.06 0.7749.70 0.749.497 0.7990.0676 0.753.9373 0.733.085 36.5895 3.0595 36.703 9.5479 36.5689 9.0086 36.7403 3.4046 37.309 3.506 37.6804 3.0090 0.9485.630 0.945.384 0.903.5049 0.94.0970 0.8757.04 0.8460.9569 35.085 30.0799 35.8485 30.604 35.8768 9.8668 35.034 30.006 36.690 30.7396 35.978 30.809 50 C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508

.433.0937.343.9987.93.09.78.9.5.968.467.964 3 38.5853 3.466 39.693 3.73 40.63 3.305 38.448 3.3307 39.767 3.56 40.36 33.0353 0.8794.74 0.858.668 0.773.54 0.899.749 0.87.5970 0.7687.456 4 33.855 6.8785 33.7499 6.0904 33.636 5.634 33.645 6.9839 34.7 6.837 34.9 6.9459.0799.544 0.9865.993 0.9954.4805.079.477 0.9406.398 0.9844.68 Avg. 36.3906 3.3533 36.597 30.7439 36.8680 30.5050 36.5500 3.4787 37.4593 3.8537 37.6587 3.89.99.9943.457.054.00.0470.745.966.0579.8593.0507.7873 C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 503 green values are denoted by w s w s4 and the central missing green value is then estimated by weighted sum of them. In the edge region, the weights associated with four surrounding color-adjusted green values are denoted by w e w e4 and the central missing green value is then carried out by selecting weighted sum in horizontal and vertical directions. Based on (5), the color-adjusted green values and the corresponding weights need to be determined for estimating the missing green value G. For instance, in the case of interpolating the missing green value on blue pixel positions, the color-adjusted values in each interpolation direction are referred to [4] and given by ^G ¼ G þ B B ^G 3 ¼ G 3 þ B B 3 ^G ¼ G þ B B ^G4 ¼ G 4 þ B B 4 : ð6þ And the corresponding weights in smooth and edge regions are, respectively, given by w sj ¼ e sj P 4 k¼ e and w ej ¼ e ej for j ¼ 4 ð7þ P sk 4 e ek k¼ where e sj and e ej are the edge indicators associated with four surrounding green pixels G j. In smooth regions, a valid assumption is that the directional derivatives of each color channel are small. Thus, the edge indicates in smooth regions can be seen as a decreasing function dependent on the sum of local first-order directional derivative such that e s ¼ þjg G 3 jþjg 5 G jþjb Bjþ G9 G4þG0 G þ R5 RþR6 R e s ¼ þjg G 4 jþjg 6 G jþjb Bjþ G GþG G3 þ R7 RþR8 R3 e s3 ¼ þjg G 3 jþjg 3 G 7 jþjb B 3 jþ G G3þG4 G4 þ R3 R9þR4 R0 e s4 ¼ þjg G 4 jþjg 4 G 8 jþjb B 4 jþ G G6þG3 G5 þ R4 RþR R : ð8þ In edge regions, the assumption is that the directional derivatives of each color channel are consistent along the direction of edges. Thus, the edge indicates in edge regions can be seen as a decreasing function dependent on the consistence of local first-order directional derivative such that ee ¼ þjg G3jþjG5 G þ G3jþjB B G þ G3jþjR5 R G9 þ G4jþjR6 R G0 þ Gj ee ¼ þjg G4jþjG6 G þ G4jþjB B G þ G4jþjR7 R G þ GjþjR8 R3 G þ G3j ee3 ¼ þjg G3jþjG G3 þ G7jþjG G3 B þ B3jþjG G3 R3 þ R9jþjG4 G4 R4 þ R0j ee4 ¼ þjg G4jþjG G4 þ G8jþjG G4 B þ B4jþjG G6 R þ RjþjG3 G5 R4 þ Rj : ð9þ Once the weights of each color-adjusted green value are obtained from (7), the missing green value G can be obtained by using (5). Finally, the full green channel can be obtained by adopting the same procedure as described above to interpolate the missing green value on red pixel posi-

504 C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 tions. This method for interpolating green channel from CFA samples is combined with the proposed demosaicing algorithm. 5. Experimental results Extensive experimental studies have been carried out to verify the effectiveness of the proposed demosaicing algorithm. In the experiments, twenty-four natural images from the Kodak PhotoCD, as shown in Fig., are employed to demonstrate the demosaicing performance. The demosaiced results of the proposed method are compared with the recently published methods of Gunturk [3] and Lu [4]. For Gunturk s method, we make use of one-level (- L) decomposition with eight projection iterations in the experiments. Because Gunturk s method consists of only the interpolation step, for fair comparison in each step, Lu s post-processing method was adopted as the post-processing step for Gunturk s method. As shown in Fig., all test images are down-sampled to obtain the Bayer pattern and then reconstructed using the demosaicing methods under comparison in RGB color space. To evaluate the quality of the demosaiced images, PSNR and CIELAB DE ab are adopted as performance measure. The PSNR (in db) is defined as 8! 9 < PSNRðdBÞ ¼0log 0 55 X X = koðx yþ Dðx yþk : MN 6y6M 6x6N ð0þ where M, N are the total row and column number of the image Oðx yþ is the color vector at the (x, y) th position of the original color image and Dðx yþ is the corresponding color vector in the demosaiced color image. The CIELAB DE ab is given by DE ab ¼ X X ko Lab ðx yþ D Lab ðx yþk ðþ MN 6y6M 6x6N where O Lab ðx yþ and D Lab ðx yþ are the color vector in CIE- LAB color space at the (x, y) th position of the original and Fig. 9. Zoom-in demosaicing test results of test image No. : (a) original picture, (b) demosaiced result of Gunturk s method, (c) Lu s interpolation result, (d) the proposed EADA interpolation result, (e) Gunturk s method with Lu s post-processing result, (f) Lu s post-processing result, and (g) the proposed EADA post-processing result.

C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 505 demosaiced color image, respectively. A DE ab greater than.3 indicates the color difference is visible, while greater than 0 means the demosaiced image is very different from the original one [6]. For a demosaiced image, high fidelity implies high PSNR and small CIELAB DE ab measures. Furthermore, we compute the PSNR and DE ab measures of smooth and edge regions separately for more detailed comparison with other algorithms. The method used to divide the original image into smooth and edge regions is the same as that discussed in Section 4. The threshold value T was set to 0 in the experiments. Table presents the PSNR values and CIELAB DE ab for the smooth and edge regions of the demosaiced results obtained by the proposed interpolation method together with those by other methods for comparison. In each step, the bold-italic font denotes the largest PSNR and smallest CIELAB DE ab measures of smooth and edge regions across each row. It can be seen from the table that EADA provides improved demosaiced fidelity for smooth regions in most images compared with other methods in the interpolation step. However, in the post-processing step, the proposed post-processing scheme not only obtains the best demosaiced results in the smooth regions, but also provides superior improvement in the edge regions, which are crucial to the visual quality. These improvements can add-up the PSNR and reduce DE ab of the edge regions of the interpolation result by about.78 db and 0.6 units, respectively in average. In some cases, the gap can be as large as 3.89 db (PSNR) and 0.83 DE ab units, e.g., for image No. 3. Fig. 9 explains why such a large performance gap exists between the proposed post-processing method and others. Fig. 9(a) shows a zoom-in of the test image No.. Figs. 9(b) (d) are, respectively, the demosaiced results obtained from Gunturk s, Lu s and the proposed EADA interpolation step. The corresponding PSNR and DE ab measures for smooth and edge regions are presented separately in Table 3. It can be seen that these demosaiced images have different types of color artifacts around the edge and smooth regions. In Figs. 9(b) and (c), the color artifacts are mainly distributed in the smooth regions (such as the blinds of the window) and appeared as pale false-colors with some zipper-effect. In contrast, the color artifacts in Fig. 9(d) are concentrated around the edge regions and appeared as deep false-colors with little zipper-effect. Therefore, as can be seen from Table 3, the smooth regions of Fig. 9(d) give better PSNR and DE ab measures than those of Figs. 9(b) and (c) in the interpolation step. Figs. 9(e) (g) are the post-processing results corresponding to Figs. 9(b) (d), respectively. Furthermore, Figs. 9(e f) and (g) present, respectively, the demosaiced results obtained by using Lu s and the proposed post-processing step. It is clear that Figs. 9(e-f) still retain some color artifacts in smooth and edge regions, which is mainly because Lu s post-processing is only applied to artifact-prone regions (mainly the edge regions) [4]. This implies that Lu s post-processing method cannot effectively reduce the color artifacts in smooth regions, which usually contain some fine textures. Thus, color artifacts in the smooth Table 3 PSNR (db) and DE ab measures for a zoom-in of edge and smooth regions of demosaiced images Step: Interpolation step Post-processing step Method: Gunturk Lu s interpolation EADA interpolation Gunturk with Lu s post-processing Lu s post-processing EADA post-processing Region: Smooth Edge Smooth Edge Smooth Edge Smooth Edge Smooth Edge Smooth Edge Fig. 9 3.3975 30.370 30.5873 9.3493 3.686 9.0357 3.894 30.4666 3.697 3.705 34.465 3.966 (No.).98.4378.9463.4438.5840.4576.846.37.796.450.3745.9994 Fig. 0 3.868 6.7096 30.577 5.6 33.6837 6.956 3.648 7.334 3.3797 7.304 35.736 9.509 (No.8).686.880.654.949.4400.5744.5597.7558.4487.5779.445.0 Fig. 3.3044 3.4755 3.30 3.470 34.087 3.85 3.599 3.756 3.448 3.5350 36.69 33.547 (No. 9).64.9396.6734.888.4397.677.5637.876.498.7634.793.5336

506 C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 regions of demosaiced image will still exist when utilizing Gunturk s and Lu s demosaicing methods. On the contrary, color artifacts of demosaiced images obtained from the proposed EADA interpolation step are mainly converged in edge regions. The discussion of color-difference model in Section shows that the color artifacts in highfrequency regions will be reduced effectively by linear low-pass filtering in the color-difference space. Therefore, the proposed post-processing method provides superior improvement for the demosaiced images obtained from the interpolation step. More specifically, EADA provides the fewest color artifacts for smooth regions in interpolation step and reduces the residual color artifacts around edge regions effectively in post-processing step. This can be seen in Table 3 and by visually comparing Fig. 9(d) with Fig. 9(g). Similar results can also be observed in the test results presented Fig. 0. The demosaiced results shown in Fig. evaluate the performance of EADA in edge regions and fine textures. Fig. (a) shows a zoom-in of the test image No. 9. In this scene, the fence region contains many long edges and highfrequency detail of the image. These features usefully challenge the performance of demosaicing methods. Figs. (b) (d) are, respectively the demosaiced results obtained from Gunturk s, Lu s and EADA interpolation methods. From visual comparison, we can observe that the Gunturk s and Lu s interpolation methods induce more color artifacts in the fence region than EADA does. Moreover, as shown in Figs. (e) and (f), the color artifacts in the demosaiced result still remain in the fence region when using Lu s post-processing method. However, by visually comparing Figs. (d) with (g), it is clear that the color artifacts in the demosaiced result can be effectively reduced by using the proposed post-processing method. Note in Table 3 that the proposed algorithm provides the best PSNR and DE ab measures not only in the interpolation step, but also in the post-processing step for this case. The gap between proposed and other methods can be as large as 3.57 db (PSNR) and 0. DE ab units in smooth region, 0.98 db (PSNR) and 0.3 DE ab units in edge region. These experimental results validate that EADA not only effectively reduces the color artifacts in edge regions and fine textures, Fig. 0. Zoom-in demosaicing test results of test image No. 8: (a) original picture, (b) demosaiced result of Gunturk s method, (c) Lu s interpolation result, (d) the proposed EADA interpolation result, (e) Gunturk s method with Lu s post-processing result, (f) Lu s post-processing result, and (g) the proposed EADA post-processing result.

C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 507 Fig.. Zoom-in demosaicing test results of test image No. 9: (a) original picture, (b) demosaiced result of Gunturk s method, (c) Lu s interpolation result, (d) the proposed EADA interpolation result, (e) Gunturk s method with Lu s post-processing result, (f) Lu s post-processing result, and (g) the proposed EADA post-processing result. but also gives superior performance compared with the two well-known methods. 6. Conclusion and future work This paper presents a novel edge-adaptive CFA demosaicing algorithm based on a color-difference approach. The proposed EADA algorithm effectively reduces color artifacts in both smooth and edge regions of demosaiced images. The proposed algorithm can be combined with any existing image interpolation method for reconstructing the green channel. A new adaptive interpolation method is then presented using a nonlinear procedure for choosing the direction of interpolation to reconstruct the green channel from CFA samples. The performance of the proposed method has been compared with two recent published demosaicing methods, Gunturk s and Lu s methods. Experimental results show that the proposed method not only outperforms both of them in PSNR (db) and DE ab measures, but also gives superior demosaiced fidelity in visual comparison with other methods. Future research directions will focus on developing the single-plane reconstruction algorithms to reconstruct the green channel with minimum interpolation error. Acknowledgements The authors would like to thank Prof. B.K. Gunturk of Louisiana State University, USA and Dr. Yap-Peng Tan of Nanyang Technological University, Singapore for providing us their CFA demosaicing programs. This work was partly supported by the National Science Council of Taiwan, ROC under Grant NSC 9-3-E-009-007. References [] D.R. Cok, Reconstruction of CCD images using template matching, in: Proc. IS&T, 47th Annual Conf./ICPS (994) 380 385. [] D.R. Cok, Signal processing method and apparatus for producing interpolated chrominance values in a sampled color image signal, U.S. Patent 4,64,678, 987. [3] W.T. Freeman, Method and apparatus for reconstructing missing color samples, U.S. Patent 4,774,565, 988.

508 C.-Y. Tsai, K.-T. Song / Image and Vision Computing 5 (007) 495 508 [4] J.E. Adams, Interactions between color plane interpolation and other image processing functions in electronic photography, Proc. SPIE Cameras Syst. Electronic Photogr. Sci. Imag. 46 (995) 44 5. [5] J. Hamilton, J. Adams, Adaptive color plane interpolation in single sensor color electronic camera, U.S. Patent 5,69,734, 997. [6] J.E. Adams, Design of practical color filter array interpolation algorithms for digital cameras, Proc. SPIE Real Time Imag. II 308 (997) 7 5. [7] E. Chang, S. Cheung, D.Y. Pan, Color filter array recovery using a threshold-based variable number of gradients, Proc. SPIE Sensors, Cameras, Appl. Digital Photogr. 3650 (999) 36 43. [8] R.H. Hibbard, Apparatus and method for adaptively interpolation a full color image utilizing luminance gradients, U.S. Patent 5,38,976, 996. [9] R. Kimmel, Demosaicing: image reconstruction from color CCD samples, IEEE Trans. Image Process. 8 (9) (999) 8. [0] X. Li, M.T. Orchard, New edge-directed interpolation, IEEE Trans. Image Process. 0 (0) (00) 5 57. [] D.D. Muresan. T.W. Parks, Optimal recovery approach to image interpolation, in: Proc. IEEE. Int. Conf. Image Process., vol. 3, 00, pp. 848 85. [] K. Hirakawa, T.W. Parks, Adaptive homogeneity-directed demosaicing algorithm, in: Proc. IEEE Int. Conf. Image Process., vol., 003, pp. 669 67. [3] B.K. Gunturk, Y. Altunbasak, R.M. Mersereau, Color plane interpolation using alternating projections, IEEE Trans. Image Process. (9) (00) 997 03. [4] W. Lu, Y.P. Tan, Color filter array demosaicing: new method and performance measures, IEEE Trans. Image Process. (0) (003) 94 0. [5] L. Chang, Y.P. Tan, Effective use of spatial and spectral correlations for color filter array demosaicing, IEEE Trans. Consumer Electronics 50 () (004) 355 365. [6] M. Mahy, E. Van Eyckden, O. Oosterlinck, Evaluation of uniform color spaces developed after the adoption of CIELAB and CIELUV, Color Res. Appl. 9 () (994) 05. [7] B. Bayer, Color imaging array, U.S. Patent 3,97,065, 976. [8] R.C. Gonzalez, R.E. Woods, Digital Image Processing, second ed., Prentice-Hall, 00. [9] D.D. Muresan, T.W. Parks, Adaptively quadratic (AQua) image interpolation, IEEE Trans. Image Process. 3 (5) (004) 690 698.