Research Article AN EFFECTIVE APPROACH FOR IMAGE RECONSTRUCTION AND REFINING USING DEMOSAICING 1 M.Jayasudha, 1 S.Alagu Address for Correspondence 1 Lecturer, Department of Information Technology, Sri Venkateswara College of Engineering, Chennai, India. Email: jayasudha@svce.ac.in, alagu@svce.ac.in ABSTRACT The proposed work uses the demosaicing process to restore full-color images from the incomplete color samples acquired by single-sensor digital cameras. In the demosaicing process, the red, green, and blue components of the image are acquired and interpolated to reconstruct the image. To restore a full-color image from its CFA samples, the two missing color values at each pixel are usually estimated from their neighboring CFA samples. This process is commonly referred to as CFA demosaicing (or CFA interpolation), and it has a substantial impact on the quality of the color images produced by a single-sensor digital camera. If demosaicing is not performed properly, the restored images suffer from visible and annoying artifacts, dominated by zipper effects and false colors. In this work, a new effective algorithm is used to reduce these artifacts and improve the performance of the demosaicing reconstruction. A refining step is included to further improve the reconstructed image. The method provides better performance at low computational cost. KEYWORDS - Bayer pattern, color filter array (CFA) interpolation, demosaicing, digital cameras. INTRODUCTION Most digital still cameras acquire imagery using a single electronic sensor (CCD or CMOS) overlaid with a color filter array (CFA) [1]. The CFA is configured so that each sensor pixel samples only one of the three primary colors (e.g., red, green, and blue) or complementary colors (e.g., cyan, magenta, and yellow). The most commonly used CFA configuration today is the Bayer pattern, a schematic of which is shown in Fig. a.
In this pattern, the green (G) values are sampled on a quincunx lattice (an arrangement of five objects with one at each corner of a square and one at the center), while the red (R) and blue (B) values are obtained on two separate rectangular lattices. Consequently, the number of green samples is twice that of the red (or blue) samples. To restore a full-color image from its CFA samples, the two missing color values at each pixel are usually estimated from their neighboring CFA samples. This process is
commonly referred to as CFA demosaicing (or CFA interpolation), and it has a substantial impact on the quality of the color images produced by a single-sensor digital camera. If demosaicing is not performed properly, the restored images suffer from visible artifacts, dominated by zipper effects and false colors. Zipper effects [2] refer to abrupt or unnatural changes of intensity over a number of neighboring pixels, manifesting as an on-off pattern in regions around edges, as shown in Fig. b. They are primarily caused by improper averaging of neighboring color values across edges. False colors are spurious colors that are not present in the original scene, as illustrated in Fig. c. Demosaicing methods can be grouped into two distinct classes [2]: adaptive and non-adaptive. The non-adaptive class applies well-known interpolation techniques to each color channel separately, including nearest-neighbor replication, bilinear interpolation, and cubic spline interpolation. Although these single-channel algorithms can provide satisfactory results in smooth regions of an image, they usually fail in high-frequency regions, especially along edges. As a first step, these algorithms interpolate the luminance (green) channel, usually by bilinear interpolation [3], [4]. The chrominance channels (red and blue) are then estimated from the bilinearly interpolated red-to-green and blue-to-green ratios; that is, the interpolated red and blue ratios are multiplied by the green value to determine the missing red and blue values at a particular pixel location. The adaptive class uses edge-directed interpolation [5]. The main difference from the previous approach is that the bilinear interpolation of the green channel is replaced by adaptive interpolation to avoid interpolating across edges: first-order horizontal and vertical gradients are computed at each missing green location on the Bayer pattern.
If the horizontal gradient is greater than a predetermined threshold and the vertical gradient is less than it, suggesting a possible vertical edge, interpolation is performed along the vertical direction. If the vertical gradient is larger than the threshold and the horizontal gradient is less than it, suggesting a possible horizontal edge, interpolation is performed along the horizontal direction. When the horizontal and vertical gradients are comparable (that is, both are below or both are above the threshold), the green value is obtained by averaging its four neighbors. Interpolation of the red and blue channels can then be done either by interpolating color ratios or by interpolating color differences instead of color ratios.
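The gradient test described above can be sketched as follows. This is an illustrative sketch, not the paper's code; the function name and the threshold parameter T are hypothetical.

```python
# Illustrative sketch of the gradient-based direction selection used by
# adaptive (edge-directed) green interpolation. T is an assumed threshold.

def select_direction(dh, dv, T):
    """Return the interpolation direction for a missing green sample.

    dh, dv -- absolute first-order gradients in the horizontal and
              vertical directions at the missing green location.
    T      -- predetermined gradient threshold.
    """
    if dh > T and dv < T:
        # Large horizontal gradient: likely a vertical edge,
        # so interpolate along the vertical direction.
        return "vertical"
    if dv > T and dh < T:
        # Large vertical gradient: likely a horizontal edge,
        # so interpolate along the horizontal direction.
        return "horizontal"
    # Gradients comparable (both below or both above T):
    # average the four green neighbors.
    return "average"
```

Interpolating along, rather than across, the dominant edge is what prevents the zipper effect in this class of methods.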
II. PROPOSED SYSTEM
Our proposed method consists of two main processes: reconstruction and refining. The procedure of the proposed method is described in detail below.

III. RECONSTRUCTION PROCESS
The reconstruction process comprises an initialization step, which obtains an initial full-color image, and an enhancement step, which updates the color planes of the initial estimate.

III. (A) Initialization
In this step, the green plane is the first to be interpolated; once fully populated, it is used to help interpolate the red and blue planes.

Green plane (using edge-directed interpolation)
Edge-directed interpolation can reconstruct a green plane with fewer zipper effects, especially in image regions with horizontal and vertical edges. Hence, in this step the missing green values are filled first. We define the two edge indicators as

DH = |-C_{i,j-2} + 2C_{i,j} - C_{i,j+2}| + |G_{i,j-1} - G_{i,j+1}| (horizontal direction)
DV = |-C_{i-2,j} + 2C_{i,j} - C_{i+2,j}| + |G_{i-1,j} - G_{i+1,j}| (vertical direction)

where DH and DV are the edge indicators in the horizontal and vertical directions, respectively; the smaller the value of DH (DV), the more likely the edge is along the horizontal (vertical) direction. Each missing green value is then estimated along the chosen direction as

Ĝ_{i,j} = (G_{i,j-1} + G_{i,j+1})/2 + (-C_{i,j-2} + 2C_{i,j} - C_{i,j+2})/4, if DH < DV,
Ĝ_{i,j} = (G_{i-1,j} + G_{i+1,j})/2 + (-C_{i-2,j} + 2C_{i,j} - C_{i+2,j})/4, if DH > DV,
Ĝ_{i,j} = (G_{i-1,j} + G_{i,j-1} + G_{i,j+1} + G_{i+1,j})/4 + (-C_{i-2,j} - C_{i,j-2} + 4C_{i,j} - C_{i,j+2} - C_{i+2,j})/8, otherwise. (1)

Red and blue planes
While edge-directed interpolation can suppress some zipper effects and obtain satisfying results, it can be further improved. In particular, since it mainly relies on the edge information along the horizontal or vertical directions, it is prone to producing jagged edges in the other directions, and the edge information estimated in regions with complex edges may not be accurate.
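The green-plane estimate of Eq. (1) above can be sketched as follows. This is a minimal illustrative sketch, assuming the sampled planes are stored as 2-D lists indexed [i][j]; border handling is omitted and the function name is an assumption.

```python
# Sketch of the edge indicators DH/DV and the green estimate of Eq. (1).
# C holds the red/blue samples and G the green samples of the CFA data.

def estimate_green(C, G, i, j):
    """Estimate the missing green value at a red/blue pixel (i, j)."""
    # Edge indicators: second-order difference of C plus first-order
    # difference of G along each direction.
    DH = abs(-C[i][j-2] + 2*C[i][j] - C[i][j+2]) + abs(G[i][j-1] - G[i][j+1])
    DV = abs(-C[i-2][j] + 2*C[i][j] - C[i+2][j]) + abs(G[i-1][j] - G[i+1][j])
    if DH < DV:
        # Edge more likely horizontal: interpolate horizontally.
        return (G[i][j-1] + G[i][j+1]) / 2 \
            + (-C[i][j-2] + 2*C[i][j] - C[i][j+2]) / 4
    if DH > DV:
        # Edge more likely vertical: interpolate vertically.
        return (G[i-1][j] + G[i+1][j]) / 2 \
            + (-C[i-2][j] + 2*C[i][j] - C[i+2][j]) / 4
    # No dominant direction: combine all four neighbors.
    return (G[i-1][j] + G[i][j-1] + G[i][j+1] + G[i+1][j]) / 4 \
        + (-C[i-2][j] - C[i][j-2] + 4*C[i][j] - C[i][j+2] - C[i+2][j]) / 8
```

The C-plane Laplacian term corrects the green average with the local curvature of the known channel, which is what distinguishes this estimate from plain bilinear interpolation.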
For this reason, the red and blue plane interpolation in our initialization step does not apply edge-adaptive interpolation but a pattern-adaptive interpolation scheme, which estimates the missing color values according to the different neighborhood patterns. As the same procedure is used to interpolate both the red and blue planes, only the red plane interpolation is described below. Because the red plane is sampled more
sparsely than the green plane, its interpolation requires two sub-processes:

1. Interpolating the missing red values at blue pixels: The missing red values at blue pixels (i.e., pixels with blue CFA samples) are interpolated first because, at this point, only the blue pixels have four neighboring color-difference (gr = G - R) values at the cross locations (i-1,j-1), (i-1,j+1), (i+1,j-1), and (i+1,j+1), forming a four-neighbor pattern, while the green pixels have only two neighboring gr values. Furthermore, the pattern formed by these four gr values can be used to obtain the edge information along the diagonal directions. The interpolations for the different types of neighboring patterns are as follows.

(a) Edge pattern: the color-difference value of the blue pixel at (i,j) is obtained as

gr_{i,j} = Median{gr_{i-1,j-1}, gr_{i-1,j+1}, gr_{i+1,j-1}, gr_{i+1,j+1}}

(b) Otherwise, compute the average color-difference value along the edge direction determined by the two edge classifiers D1 = |R_{i-1,j-1} - R_{i+1,j+1}| and D2 = |R_{i-1,j+1} - R_{i+1,j-1}|:

gr_{i,j} = (gr_{i-1,j-1} + gr_{i+1,j+1})/2, if D1 < D2,
gr_{i,j} = (gr_{i-1,j+1} + gr_{i+1,j-1})/2, otherwise. (2)

The missing red values at blue pixels can then be obtained from the estimated color-difference values gr_{i,j} and the green values estimated in the previous step as R_{i,j} = G_{i,j} - gr_{i,j}.

2. Filling in the missing red values at green pixels: This process is similar to the preceding one, except that the neighboring pattern is now formed by the available gr values at locations (i-1,j), (i,j+1), (i+1,j), and (i,j-1). As the blue plane can be estimated similarly, at the end of the initialization step the complete red, green, and blue planes, as well as the full color-difference planes (gr and gb), are obtained.
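The two diagonal-pattern cases above can be sketched as follows. The `edge_pattern` flag, standing in for the paper's pattern classification, and the function name are assumptions of this sketch.

```python
# Sketch of red-at-blue interpolation: the four diagonal color-difference
# samples gr = G - R are combined by a median (edge pattern) or averaged
# along the diagonal chosen by the classifiers D1 and D2.

def interp_red_at_blue(R, G, i, j, edge_pattern):
    """Estimate the missing red value at a blue pixel (i, j)."""
    # Color differences at the four diagonal neighbors.
    gr = {(di, dj): G[i + di][j + dj] - R[i + di][j + dj]
          for di in (-1, 1) for dj in (-1, 1)}
    if edge_pattern:
        # Edge pattern: median of the four diagonal gr values.
        vals = sorted(gr.values())
        gr_ij = (vals[1] + vals[2]) / 2  # median of four samples
    else:
        # Average along the diagonal with the smaller classifier.
        D1 = abs(R[i - 1][j - 1] - R[i + 1][j + 1])
        D2 = abs(R[i - 1][j + 1] - R[i + 1][j - 1])
        if D1 < D2:
            gr_ij = (gr[(-1, -1)] + gr[(1, 1)]) / 2
        else:
            gr_ij = (gr[(-1, 1)] + gr[(1, -1)]) / 2
    # Recover the red value from the estimated color difference.
    return G[i][j] - gr_ij
```

Working in the color-difference domain rather than directly on R exploits the assumption that G - R varies smoothly, so the recovered red value inherits the detail of the already-interpolated green plane.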
(B) Enhancement
This step further reduces the remaining artifacts in the diagonal edges and high-detail regions of the image by first updating the green plane using a weighted edge-directed interpolation and then updating the red and blue planes using the more accurate green plane obtained [5].

Green plane (using edge-directed interpolation)
The estimates of G_{i,j} in the four interpolation directions (top, down, left, and right) are computed as in [5],
respectively, where gc denotes a color-difference value (gr or gb) obtained in the previous step. The weights in the four directions are calculated as

α(1) = 1 / (1 + |C_{i-2,j} - C_{i,j}| + |G_{i-1,j} - G_{i+1,j}| + |C_{i-1,j} - C_{i,j}|) (7)
α(2) = 1 / (1 + |C_{i+2,j} - C_{i,j}| + |G_{i+1,j} - G_{i-1,j}| + |C_{i+1,j} - C_{i,j}|) (8)
α(3) = 1 / (1 + |C_{i,j-2} - C_{i,j}| + |G_{i,j-1} - G_{i,j+1}| + |C_{i,j-1} - C_{i,j}|) (9)
α(4) = 1 / (1 + |C_{i,j+2} - C_{i,j}| + |G_{i,j+1} - G_{i,j-1}| + |C_{i,j+1} - C_{i,j}|) (10)

where the addition of one in each denominator is included to avoid division by zero. The green value G_{i,j} is then updated as

Ĝ_{i,j} = [Σ_{k=1..4} α(k) Ĝ^{(k)}_{i,j}] / [Σ_{k=1..4} α(k)]. (11)

Red and blue planes
In this step [5], the same method used in the initialization step updates the red and blue planes using the more accurate green plane obtained in the previous step. This step works well in regions with sharp edges and fine details. It also rectifies wrong judgments made by the adaptive interpolation performed in the initialization step, so the image fidelity is further improved compared with that of the initialization step.

IV. REFINEMENT PROCESS
The technique reconstructs the full-resolution image quickly while avoiding visible and annoying artifacts. However, even with an accurate selection of the edge directions, the reconstructed image may contain several errors due to interpolation artifacts [7]; these are less noticeable than misguidance artifacts (introduced by a wrong edge estimation) but still annoying. In the proposed algorithm, they can be introduced by the approximations made in the filter design and, furthermore, by the low-pass characteristic of the filters used to interpolate the green component and the color differences R-G and B-G. Note that these artifacts mainly affect regions with high-frequency content. They are corrected by exploiting the high-band inter-channel correlation of the three primary colors.
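The weighted update of Eq. (11) above can be sketched as a simple weighted combination, where `diffs` stands for the directional gradient sums inside the denominators of Eqs. (7)-(10); names are illustrative.

```python
# Sketch of Eq. (11): each directional estimate is weighted by
# alpha_k = 1 / (1 + d_k), where d_k is the sum of absolute differences
# measured along direction k. The "+1" avoids division by zero.

def weighted_green(estimates, diffs):
    """Combine directional green estimates into one updated value.

    estimates -- directional estimates of G(i, j) (e.g., four of them).
    diffs     -- the corresponding gradient sums d_k (>= 0).
    """
    weights = [1.0 / (1.0 + d) for d in diffs]
    return sum(w * g for w, g in zip(weights, estimates)) / sum(weights)
```

With equal gradient sums this reduces to a plain average, while a direction crossing a strong edge gets a large d_k and thus contributes almost nothing, which is the intended edge-adaptive behavior.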
A good solution consists of separating the low- and high-frequency components at each pixel and replacing the high frequencies of the unknown components with the high frequencies of the Bayer-known component. The low-frequency component is preserved unchanged since the low-frequency
components of the color channels are less correlated. For example, for a green pixel at location (i,j), the green value can be decomposed as

G = G_l + G_h

where G_l and G_h denote the low- and high-frequency components, respectively. The red and blue values are then corrected by replacing R_h and B_h with G_h, that is,

R = R_l + G_h (12)
B = B_l + G_h (13)

The correction at the red and blue pixels is carried out in a similar way. The low-frequency components are selected using a low-pass filter, while the high frequencies are calculated by subtracting the low-frequency values (Fig. e). The design of this low-pass filter is very important for the performance of the refining step and has to consider the following points. A first important issue is exploiting the knowledge of the Bayer data, since it is certain that they are not affected by interpolation errors. It is therefore preferable that the red (blue) components at the green locations, having only two neighbors belonging to the Bayer pattern, are corrected using a 1-D low-pass filter selecting only the red (blue) positions.

Figure a: Bayer color filter array pattern. Figure b: Zipper effect
Figure c: False color. Figures d and e: Block diagram of the proposed method.

For the correction of the green channel, and of the red and blue colors at the blue and red pixels, one possible choice is to span the whole neighborhood, for example with a 2-D filter with a 3 x 3 kernel. A similar approach has recently been presented and analyzed in [7], where the color differences are filtered and then used to correct the high frequencies of the image. However, an isotropic filtering may introduce zipper effect near the edges, degrading the quality of the image, since it interpolates the color differences across the edges as well. A more effective approach is to select the low and high frequencies using a 1-D filter, so that the interpolation is carried out only along the edges of the image. To summarize, the refining step is performed as follows.

1) Updating of the green component.
For each red location (i,j), the green and red channels are filtered with a low-pass filter along the direction selected using δh and δv. The four components G_l, G_h, R_l, and R_h are obtained. Then, the green high-frequency values G_h are replaced with R_h and the green samples G are reconstructed. The same update is carried out for the green values at the blue locations.

2) Updating of the red and blue components at the green locations. For each green position, the green and red sub-band values are obtained through horizontal or vertical filtering, depending on where the neighboring red values in the Bayer pattern are placed. Then, the high-frequency component of the red channel is updated with the green one and the red values are reconstructed. The update of the blue component is carried out in the same way.

3) Updating of the red (blue) component at the blue (red) locations. The red and blue channels are decomposed into low- and high-frequency components according to the most appropriate direction given by the comparison of δh and δv. The updated values in the neighboring pixels are used in order to obtain a more reliable estimate. Then, the red high-frequency component R_h is replaced with B_h. The blue values at the red pixels are refined in a similar way. It is noticeable that the updating improves the quality of the images, reducing the interpolation artifacts and the MSE values.

SCREEN SHOTS
Figure 5: Original Image
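The per-direction decomposition used in the refining steps above can be sketched as follows. The 3-tap [1 2 1]/4 kernel is an assumed example of the 1-D low-pass filter; the paper does not specify its exact coefficients.

```python
# Sketch of the refining step along one direction: a 1-D low-pass filter
# splits each channel into low- and high-frequency parts, and the high
# frequencies of the interpolated channel are replaced with those of the
# Bayer-known channel (Eqs. (12)-(13)).

def lowpass_1d(s, k):
    """3-tap low-pass value at index k (assumed [1 2 1]/4 kernel)."""
    return (s[k - 1] + 2 * s[k] + s[k + 1]) / 4

def refine_sample(known, interp, k):
    """Replace the high frequencies of interp[k] with those of known[k].

    known  -- 1-D profile of the Bayer-known channel along the edge.
    interp -- 1-D profile of the interpolated channel to be corrected.
    """
    low_i = lowpass_1d(interp, k)             # low frequencies: preserved
    high_k = known[k] - lowpass_1d(known, k)  # Bayer-known high frequencies
    return low_i + high_k
```

Because the filter is applied only along the selected edge direction, the correction transfers high-frequency detail between channels without smoothing across the edge.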
The reconstructed image is affected by some artifacts. Figure 6 shows the result of using bilinear interpolation.

Figure 6: Bilinear Interpolation

Initialization is one part of the reconstruction process. It removes some artifacts, but still annoying artifacts occur in the diagonals and also at the edges, as shown in Figure 7.
Figure 7: Initialization

The RGB image is converted into the CFA pattern as shown in Figure 8.

Figure 8: RGB to CFA

In the enhancement part, the remaining artifacts are removed and the image is smoothed, as shown in Figure 9.

Figure 9: Enhancement

From the final image in Figure 9, the following can be concluded:
1. Brightness is improved.
2. 85% of the artifacts are removed.
3. Visual clarity is also improved.

The result of applying a low-pass filter is shown in Figure 10, and that of applying a high-pass filter in Figure 11.

Figure 10: Using Low pass Filter
Figure 11: Using High pass Filter

V. CONCLUSION
This work applies the concept of demosaicing to restore full-color images from the incomplete color samples acquired by single-sensor digital cameras. Without care, the reconstructed images are affected by visible and annoying artifacts. The process uses edge-directed interpolation to reconstruct a full-resolution green component, and then the red and blue channels are interpolated using the green information. The refinement process improves the quality of the reconstructed image and removes artifacts using low-pass and high-pass filters: low-pass filters are used to smooth the image, and high-pass filters are then applied to remove the artifacts at sharp edges. The two modules of the system are implemented using MATLAB.

VI. REFERENCES
1. Adams J.E. and Hamilton J.F. (1999) Adaptive color plane interpolation in single color electronic camera, IEEE Trans. Image Processing, vol. 11, pp. 179-186.
2. Bayer B. (1976) Color imaging array, United States Patent No. 3,971,065.
3. Cok D.R. (1986) Signal processing method and apparatus for producing interpolated chrominance values in a sampled color image signal, U.S. Patent 4,642,678.
4. Chang E., Cheung S., and Pan D. (1999) Color filter array recovery using a threshold-based variable number of gradients, in Proc. SPIE, vol. 3650, pp. 36-43.
5. Gunturk B.K., Altunbasak Y., and Mersereau R.M. (2002) Color plane interpolation using alternating projections, IEEE Trans. Image Processing, vol. 11, no. 9, pp. 997-1013.
6. Glotzbach J.W., Schafer R.W., and Illgner K. (2001) A method of color filter array interpolation with alias cancellation properties, in Proc. IEEE Int. Conf. Image Processing, vol. 1, pp. 141-144.
7. Hur B.S. and Kang M.G. (2001) High definition color interpolation scheme for progressive scan CCD image sensor, IEEE Transactions on Consumer Electronics, vol. 47, no. 1, pp. 179-186.
8. Hirakawa K. and Parks T.W. (2003) Adaptive homogeneity-directed demosaicing algorithm, in Proc. IEEE Int. Conf. Image Processing, vol. 3, pp. 669-672.
9. Keren D. and Osadchy M. (1999) Restoring subsampled color images, Machine Vision and Applications, vol. 11, no. 4, pp. 197-202.
10. Muresan D.D. and Parks T.W. (2002) Optimal recovery demosaicing, in Proc. IASTED Signal and Image Processing Conference, Hawaii.
11. Mukherjee J., Parthasarathi R., and Goyal S. (2001) Markov random field processing for color demosaicing, Pattern Recognition Letters, vol. 22, no. 3-4, pp. 339-351.