A New Method to Fuse IKONOS and QuickBird Satellite Imagery

Juliana G. Denipote, Maria Stela V. Paiva
Escola de Engenharia de São Carlos - EESC, Universidade de São Paulo - USP
{judeni, mstela}@sel.eesc.usp.br

Abstract

Modern remote sensing imaging sensors, like those in the IKONOS and QuickBird satellites, are capable of generating panchromatic images with about one meter spatial resolution. The principal objective of fusion in remote sensing is to obtain images that combine the spectral characteristics of the low-resolution multispectral images with the spatial information of the high-resolution panchromatic images. Traditional fusion methods, such as IHS, PCA and Brovey, can reach good spatial resolution, but often cause spectral distortion problems. In the literature, it is possible to find image fusion methods using frequency domain processing, such as the wavelet or Fourier transform. Although they preserve good spectral information, their spatial visual effects are not satisfactory, or they limit to three the number of spectral bands used in the fusion process. In this paper, a method based on the Fourier transform is proposed in order to obtain good spatial and spectral resolution without limiting the number of bands. Quantitative measurements were applied to evaluate the quality of four fusion methods (IHS, IHS enhanced by the Fourier transform, wavelet-based and the proposed one) on IKONOS and QuickBird images. The results have shown that the proposed method keeps almost the same spatial resolution as the panchromatic images, while its spectral content is well preserved.

1. Introduction

Remote sensing images allow analyzing locations of difficult access without the need of being present in the studied area. They also provide a global view of the considered area when compared to field observation, covering a huge amount of information in a single image [1].
In addition, satellite imaging sensors can detect frequencies invisible to the human eye, which represent extra information to be analyzed, for example the near-infrared range of the spectrum. Nowadays, satellite sensors are capable of producing high spatial resolution panchromatic (PAN) images with less than one meter resolution, like those from the IKONOS and QuickBird satellites. Simultaneously, they produce good multispectral (MS) images. Spectral information from the MS bands is useful to differentiate land cover classes like vegetation, bare soil, water, roads, streets and buildings. This is possible because each object that can be identified in an image has a particular spectral reflectance response. On the other hand, the spatial information from PAN is necessary for an accurate description of image details, such as shape, contours and features [2]. Image fusion is used to combine PAN and MS bands to obtain high-resolution multispectral images. The commonly used methods, such as IHS (Intensity, Hue, Saturation), PCA (Principal Component Analysis) and the Brovey transform, can keep almost the same spatial resolution as PAN, but they distort the spectral characteristics of the original MS images [3]. Image fusion methods using frequency domain processing, like those based on the wavelet transform (WT) ([4], [5], [6] and [7]), preserve good spectral information, but their spatial visual effects are not satisfactory. IHS fusion methods enhanced by the Fourier transform have been very suitable for preserving both spectral and spatial information ([2], [8], [9] and [10]), but they are limited to the red (R), green (G) and blue (B) bands, excluding the near-infrared (NIR) band. Davis and Wang [11] investigated the importance of the NIR band for feature extraction and classification of fused images. They achieved better results using the NIR band than without it.
In this paper, the problems and limitations of available fusion methods are analyzed, and a new fusion method based on the Fourier transform (FT) is proposed to minimize the spectral distortion, keep high
spatial resolution and employ the R, G, B and NIR MS bands. In order to evaluate the result of the new image fusion method, we used IKONOS and QuickBird satellite images and some quantitative measurements to compare it with the IHS, IHS enhanced by Fourier transform (IHS+FT) and WT-based methods. A visual analysis was also performed. Qualitative and quantitative evaluations have shown that the proposed method outperforms the existing ones.

2. Image Data

The IKONOS satellite was launched by GeoEye in 1999. It was the first commercial satellite in the world with one meter spatial resolution [12]. Its PAN band has one meter spatial resolution and its MS bands have four meters. The QuickBird satellite was launched by DigitalGlobe in 2001 [13]. Its PAN band has 0.6 meter resolution and its MS bands have 2.4 meters. The images used in the present study are from Guaxupé-MG, Brazil.

3. IHS Fusion Method

The most common fusion method in the literature uses the IHS color model. The main steps of IHS fusion are:
1. Convert the R, G and B bands to I, H and S components.
2. Replace the I component with PAN.
3. Convert the new composition (PAN, H, S) back to RGB.
However, for IKONOS and QuickBird imagery, it is necessary to perform some pre-processing, as the MS pixel size is four times the PAN pixel size. Thus, one pixel in MS must be resampled to four pixels. The major limitation of IHS fusion is that it can be applied only to three bands at a time [3], [7], as it is necessary to convert the R, G and B bands into I, H and S components. The second limitation is the modification of the original spectral information due to the change in saturation during the fusion process [6], producing color distortion.

4. Wavelet-based Fusion Method

Wavelet theory can be used to extract detail information from one image and inject it into another [3]. In a remote sensing image, the details, like object edges, are a result of high contrast between features, for example a light rooftop beside dark ground.
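The IHS steps above can be sketched in code. The following is a minimal NumPy sketch using the fast additive IHS formulation discussed by Tu et al. [6], with intensity taken as the band mean; the function names and the nearest-neighbor resampling helper are illustrative, not taken from the paper:

```python
import numpy as np

def resample_ms(band, factor=4):
    """Nearest-neighbor resampling of an MS band to the PAN grid:
    each MS pixel becomes a factor x factor block of pixels."""
    return np.repeat(np.repeat(band, factor, axis=0), factor, axis=1)

def ihs_fusion(r, g, b, pan):
    """Fast IHS fusion: replacing the intensity I = (R+G+B)/3 with PAN
    is equivalent to adding (PAN - I) to each band."""
    i = (r + g + b) / 3.0      # intensity of the resampled MS bands
    delta = pan - i            # spatial detail carried by PAN
    return r + delta, g + delta, b + delta
```

After fusion the intensity of the result equals PAN exactly, which is why the method inherits PAN's spatial resolution but can shift the saturation of the original colors.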
High contrast in the spatial domain corresponds to high frequencies in the frequency domain, which are richer in PAN than in MS. On the other hand, spectral information (color) appears as low frequencies in the frequency domain, which are richer in MS than in PAN. According to Gonzalez et al. [14] and Amolins et al. [3], in practice the wavelet transform is implemented with high and low pass filters. First, the wavelet transform is applied on the columns and then on the rows of a given image by using a bank of filters. Each pass through the bank of filters decomposes the input image into four coefficients with lower resolution: the approximation coefficient and the vertical, horizontal and diagonal detail coefficients. The wavelet-based fusion method obtains the detail information from PAN and injects it into MS. Substitution and addition are the most common ways to join PAN and MS information. The substitutive wavelet method is similar to the standard IHS scheme: it involves completely replacing the details of the MS with those of the PAN. The additive wavelet method consists in adding the detail coefficients of PAN to those of the MS bands. To recompose the image, the filters are recombined to apply the inverse wavelet transform on the approximation and detail components.

Figure 1 - Wavelet-based fusion scheme.

As in IHS fusion, wavelet-based fusion also requires resampling the MS bands to agree with the PAN pixel size. The main steps of the wavelet-based fusion method for band R are illustrated in figure 1 and are as follows:
1. Apply the WT to both the resampled R band and the PAN band.
2. Substitute or add the detail components of the wavelet-transformed PAN to those of the wavelet-transformed R. If multiple decompositions are applied, substitute or add the detail components at each resolution level.
3. Perform the inverse transform on the R approximation and detail components.
The same computation must be applied to the G, B and NIR bands.
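As an illustration of the substitutive scheme above, here is a compact NumPy sketch using a single-level 2-D Haar transform as a stand-in for the filter banks discussed (a real system would use deeper decompositions and other wavelet families; all names are illustrative):

```python
import numpy as np

def haar_dwt2(x):
    """One-level 2-D Haar transform: approximation plus horizontal,
    vertical and diagonal detail coefficients at half resolution."""
    a = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4
    h = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 4
    v = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 4
    d = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 4
    return a, h, v, d

def haar_idwt2(a, h, v, d):
    """Inverse of haar_dwt2: perfect reconstruction of the image."""
    rows, cols = a.shape
    x = np.empty((2 * rows, 2 * cols))
    x[0::2, 0::2] = a + h + v + d
    x[0::2, 1::2] = a + h - v - d
    x[1::2, 0::2] = a - h + v - d
    x[1::2, 1::2] = a - h - v + d
    return x

def wavelet_substitutive_fusion(band, pan):
    """Keep the MS band's approximation, replace its details with PAN's."""
    a_band, _, _, _ = haar_dwt2(band)   # spectral (low-frequency) content
    _, h, v, d = haar_dwt2(pan)         # spatial detail from PAN
    return haar_idwt2(a_band, h, v, d)
```

The additive variant would instead sum the PAN detail coefficients onto the MS detail coefficients before the inverse transform.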
Some authors ([2], [3], [4], [5] and [7]) compared wavelet-based fusion to traditional fusion methods, like IHS, Brovey and PCA. They concluded that wavelet-based fusion keeps important spatial and spectral characteristics and produces better visual results, particularly in terms of minimizing color distortion. It can also be applied to all bands of the MS image simultaneously. Nevertheless, Amolins et al. [3], Garzelli [15], Li et al. [2] and Zhang [16] discussed the limitations of wavelet-based fusion. The major negative aspect is the introduction of artificial artifacts into the fused image, because high frequencies are not smoothly added to low frequencies. Furthermore, these methods only use the vertical, horizontal and diagonal details, which do not reflect the real high resolution information. They also involve greater computational complexity.

5. IHS Fusion Method Enhanced by FT

Some authors have enhanced IHS fusion by the application of the FT [2], [8]. The FT, when applied to image fusion, adopts the same idea as the WT, i.e., high contrast in the spatial domain appears as high frequencies in the frequency domain, and the spectral information from MS images appears as low frequencies. Ling et al. [8] proposed an FT-based fusion method that is illustrated in figure 2 and has the following main steps:
1. Convert the resampled R, G and B bands to I, H and S components.
2. Perform the FT on both PAN and the I component to obtain PAN.ft and I.ft, respectively.
3. Apply a high pass filter to PAN.ft to obtain HP_PAN.ft.
4. Apply a low pass filter to I.ft to obtain LP_I.ft.
5. Perform the inverse FT on the filtered images to obtain HP_PAN and LP_I.
6. Add HP_PAN to LP_I to obtain a new I component.
7. Convert the new IHS composition back to RGB.

Figure 2 - Schematic diagram for the IHS fusion method enhanced by FT.

Both high and low pass filters in steps 3 and 4 must be complementary to guarantee that no information is lost. Ling et al. [8] tested many filters with different cutoff frequencies in steps 3 and 4: ideal, Gaussian, Butterworth and Hanning filters. The ideal filter presents artificial artifacts due to the abrupt injection of high frequencies into low frequencies. The Gaussian, Butterworth and Hanning filters present better results as they soften the signal. The best result was achieved using the Hanning filter with a circle radius of 32 pixels. They conclude that their method is satisfactory when compared with the IHS, PCA and wavelet-based methods. Ling's methods have the drawback of using only the R, G and B MS bands, excluding the NIR band, and consequently losing important information [11].

6. FT-based Proposed Fusion Method

A fixed cutoff frequency expressed in pixels (like 32 pixels) does not perform well for all images, because each image has a different size. It is more appropriate to use cycles per meter. The ideal cutoff frequency must be measured according to the spatial resolution, which depends on the image sampling interval. We compute the cutoff frequency based on the Nyquist sampling criterion [2]. As known from that criterion, the sampling interval is inversely proportional to the sampling frequency, and the maximum frequency of an image is inversely proportional to its spatial resolution:

Max_freq = 1 / (2x)   (eq. 1)

where x is the pixel size in meters. According to Li et al. [2], the cutoff frequency is equal to the maximum frequency of MS, which is 1/4 of the maximum frequency of PAN for both IKONOS and QuickBird. IKONOS has one meter spatial resolution and, according to equation 1, the maximum frequency
is 0.5 cycles per meter. So the cutoff frequency for an IKONOS image is 0.125 cycles per meter. QuickBird has 0.6 meter resolution and its maximum frequency is 0.8333 cycles per meter, according to equation 1; the cutoff frequency is then 0.2083 cycles per meter. Several filters were tested to smoothly join the spectral and spatial information, and again the Hanning filter presented the best results. For IKONOS imagery, the Hanning low pass filter was applied to the MS bands with the adaptive cutoff frequency of 0.125 cycles per meter (see equation 2). The high pass filter was applied to PAN (equation 3). Note that the high and low pass filters must be complementary so that no information is lost or overlapped.

H_l(u,v) = 0.5 + 0.5 * cos(pi * D(u,v) / 0.125),  if D(u,v) <= 0.125
H_l(u,v) = 0,                                     if 0.125 < D(u,v) <= Mf    (eq. 2)

H_h(u,v) = 0.5 - 0.5 * cos(pi * D(u,v) / 0.125),  if D(u,v) <= 0.125
H_h(u,v) = 1,                                     if 0.125 < D(u,v) <= Mf    (eq. 3)

where u is the analyzed line, v is the analyzed column, D(u,v) is the analyzed frequency and Mf is the maximum frequency as in equation 1. Equations 4 and 5 are, respectively, the Hanning low pass and high pass filters for QuickBird:

H_l(u,v) = 0.5 + 0.5 * cos(pi * D(u,v) / 0.2083), if D(u,v) <= 0.2083
H_l(u,v) = 0,                                     if 0.2083 < D(u,v) <= Mf   (eq. 4)

H_h(u,v) = 0.5 - 0.5 * cos(pi * D(u,v) / 0.2083), if D(u,v) <= 0.2083
H_h(u,v) = 1,                                     if 0.2083 < D(u,v) <= Mf   (eq. 5)

The steps to fuse the R and PAN bands are illustrated in figure 3 and are as follows:
1. Perform the FT on both the resampled R band and PAN to obtain R.ft and PAN.ft, respectively.
2. Apply a high pass filter to PAN.ft to obtain HP_PAN.ft.
3. Apply a low pass filter to R.ft to obtain LP_R.ft.
4. Add HP_PAN.ft to LP_R.ft to obtain RF.ft.
5. Perform the inverse FT to obtain the RF fused band.
As in the wavelet-based fusion scheme, the same steps must be applied to the G, B and NIR bands.

Figure 3 - Schematic diagram for the FT-based image fusion method.

7. Experiments and Results

It is common to perform visual and statistical analyses to evaluate fusion methods. Figure 4 shows the results of all fusion methods described in the previous sections for IKONOS imagery. The proposed method and IHS+FT were executed with the Hanning filter due to its good visual result.
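Steps 1-5 and the Hanning filters of equations 2-5 can be sketched as follows in NumPy, a minimal sketch in which the frequency grid is built with np.fft.fftfreq so that the cutoff is expressed in cycles per meter, as the method proposes (helper names are illustrative, not from the paper):

```python
import numpy as np

def hanning_lowpass(shape, pixel_size, cutoff):
    """Hanning low pass filter in the frequency domain (eqs. 2 and 4):
    0.5 + 0.5*cos(pi*D/cutoff) inside the cutoff, 0 outside."""
    fu = np.fft.fftfreq(shape[0], d=pixel_size)   # cycles per meter
    fv = np.fft.fftfreq(shape[1], d=pixel_size)
    dist = np.hypot(fu[:, None], fv[None, :])     # D(u, v)
    lp = 0.5 + 0.5 * np.cos(np.pi * dist / cutoff)
    lp[dist > cutoff] = 0.0
    return lp

def ft_fusion(band, pan, pixel_size=1.0, cutoff=0.125):
    """Proposed fusion for one band: low-pass the resampled MS band,
    high-pass PAN with the complementary filter, add, inverse FT."""
    lp = hanning_lowpass(band.shape, pixel_size, cutoff)
    hp = 1.0 - lp                  # complementary high pass (eqs. 3 and 5)
    fused_ft = lp * np.fft.fft2(band) + hp * np.fft.fft2(pan)
    return np.real(np.fft.ifft2(fused_ft))
```

For IKONOS one would call this with pixel_size=1.0 and cutoff=0.125; for QuickBird, pixel_size=0.6 and cutoff=0.2083. Because hp is defined as 1 - lp, the two filters are complementary by construction, so no frequency is lost or counted twice.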
It is clearly seen that all methods result in high spatial resolution images (compare them to PAN in figure 4a). However, IHS and WT-based fusion distorted the colors (compare them to the RGB MS composition in figure 4b). The proposed fusion method (figure 4f) presents results similar to the IHS+FT method (figure 4e) in terms of high spatial resolution and true color. Figure 5 shows the results of all fusion methods described before for QuickBird imagery. The IHS and WT methods had results similar to those obtained for IKONOS imagery. But, although the IHS+FT method (figure 5e) and the proposed method (figure 5f) did not distort the colors, the spatial resolution was not as good. This is under investigation, and a modification of the cutoff frequency will probably be needed. In general, a good fusion approach should retain the maximum spatial and spectral information from the original images and should not damage the internal relationship among the original bands [4]. Based on these criteria, the statistical analysis uses the correlation coefficient, which indicates how similar one image is to another [17].
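The correlation-coefficient measure used in the statistical analysis can be computed directly with NumPy; the paper does not specify an implementation, so this is an assumed sketch:

```python
import numpy as np

def correlation_coefficient(a, b):
    """Correlation coefficient between two bands: covariance of the
    flattened images normalized by the product of their deviations."""
    return np.corrcoef(a.ravel(), b.ravel())[0, 1]
```

Values near 1 between a fused band and its original MS band indicate that the spectral content was preserved; values near 1 between a fused band and PAN indicate that spatial information was absorbed from PAN.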
Figure 4 - A representative portion of the fusion results for IKONOS images: (a) original PAN, (b) original MS RGB composition, (c) IHS, (d) wavelet, (e) IHS+FT and (f) proposed method.

Figure 5 - A representative portion of the fusion results for QuickBird images: (a) original PAN, (b) original MS RGB composition, (c) IHS, (d) wavelet, (e) IHS+FT and (f) proposed method.

Table 1 - Correlation coefficient between original PAN and original MS bands.

Sensor       R     G     B     NIR
IKONOS      0.19  0.32  0.21  0.55
QuickBird   0.42  0.53  0.42  0.34

Table 2 - Correlation coefficient between original PAN and MS bands after fusion.

Method    Sensor      R     G     B     NIR
IHS       IKONOS     0.96  0.96  0.96  ---
          QuickBird  0.93  0.94  0.91  ---
WT        IKONOS     0.82  0.89  0.89  0.91
          QuickBird  0.72  0.82  0.79  0.89
IHS+FT    IKONOS     0.54  0.69  0.57  ---
          QuickBird  0.52  0.67  0.53  ---
proposed  IKONOS     0.52  0.68  0.63  0.79
          QuickBird  0.52  0.66  0.57  0.82

Table 3 - Correlation coefficient between the fused bands and their corresponding original MS bands for different fusion methods.

Method         Sensor      R     G     B     NIR
IHS            IKONOS     0.34  0.24  0.27  ---
               QuickBird  0.59  0.42  0.49  ---
WT             IKONOS     0.64  0.61  0.53  0.77
               QuickBird  0.88  0.86  0.82  0.76
IHS+FT         IKONOS     0.82  0.77  0.80  ---
               QuickBird  0.96  0.94  0.95  ---
FT (proposed)  IKONOS     0.84  0.78  0.74  0.86
               QuickBird  0.96  0.94  0.93  0.98

Table 1 shows the correlation coefficients between the original PAN and the original MS bands, while table 2 shows the correlation coefficients between the original PAN and the MS bands after fusion, for each method and each sensor. From these two tables, it can be seen that the correlation between each band and PAN is higher after fusion than before it for all tested methods, which implies that the fused images gain information from the
original PAN. The IHS method obtained more information from PAN than the others. The results of the proposed method and IHS+FT are very similar. On the other hand, after fusion, the MS bands are expected to be as similar to the original bands as possible, so as not to lose spectral information. Table 3 shows the correlation coefficients between the original MS bands and their corresponding fused bands. The proposed method and IHS+FT have the most suitable results, indicating that these methods retain more spectral information than the others. The principal difference between these two methods is that the proposed one can also be used for the NIR band.

8. Conclusions

This paper proposed an image fusion method based on filtering in the frequency domain for the fusion of multispectral satellite images. The proposed method was compared to other reported methods and has the advantage of using any number of bands, exploiting all the information from IKONOS and QuickBird imagery. In the visual analysis of the proposed method, it is easy to observe that high-frequency information from PAN was added to the MS information without distorting the original colors. It can also be observed that the new method does not introduce artifacts in the results, because the Hanning filter smoothly joins the PAN and MS information. From the statistical analysis, the proposed fusion method, followed by IHS+FT, retains the most spectral information. But if NIR information is desired, IHS+FT cannot be used, and the proposed method is preferable.

Acknowledgments: The authors would like to acknowledge CNPq for the financial support given to this research.

9. References

[1] Lillesand, T.M.; Kiefer, R.W., Remote Sensing and Image Interpretation, 2nd edition. New York: John Wiley & Sons, 1987.
[2] Li, J.; Luo, J.; Ming, D.; Shen, Z., A New Method for Merging IKONOS Panchromatic and Multispectral Image Data, Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS '05), 25-29 July 2005, Vol. 6, pp. 3916-3919.
[3] Amolins, K.; Zhang, Y.; Dare, P., Wavelet-based image fusion techniques - an introduction, review and comparison, ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 62, Issue 4, pp. 249-263, September 2007.
[4] Gungor, O.; Shan, J., Evaluation of Satellite Image Fusion using Wavelet Transform, 20th ISPRS (International Society for Photogrammetry and Remote Sensing) Congress, Istanbul, Turkey, July 12-23, 2004.
[5] Ioannidou, S.; Karathanassi, V., Investigation of the Dual-Tree Complex and Shift-Invariant Discrete Wavelet Transforms on Quickbird Image Fusion, IEEE Geoscience and Remote Sensing Letters, Vol. 4, No. 1, January 2007.
[6] Tu, T.M.; Su, S.C.; Shyu, H.C.; Huang, P.S., A new look at IHS-like image fusion methods, Information Fusion, Vol. 2, Issue 3, pp. 177-186, September 2001.
[7] Wang, Z.; Ziou, D.; Armenakis, C.; Li, D.; Li, Q., A Comparative Analysis of Image Fusion Methods, IEEE Transactions on Geoscience and Remote Sensing, Vol. 43, No. 6, June 2005.
[8] Ling, Y.; Ehlers, M.; Usery, E.L.; Madden, M., FFT-enhanced IHS transform method for fusing high-resolution satellite images, ISPRS Journal of Photogrammetry and Remote Sensing, Vol. 61, Issue 6, pp. 381-392, February 2007.
[9] Tsai, V.J.D., Frequency-Based Fusion of Multiresolution Images, Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS '03), July 2003.
[10] Tsai, V.J.D., Evaluation of Multiresolution Image Fusion Algorithms, Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS '04), 20-24 September 2004, Vol. 1, pp. 3665-3667.
[11] Davis, C.H.; Wang, X., Urban Land Cover Classification from High Resolution Multi-Spectral IKONOS Imagery, Proceedings of the IEEE International Geoscience and Remote Sensing Symposium (IGARSS '02), 24-28 June 2002, Vol. 2, pp. 1204-1206.
[12] GeoEye, IKONOS Imagery Products Guide. Available at: <http://www.geoeye.com/whitepapers_pdfs/geoeye_ikoNOS_Product_Guide_v17.pdf>, accessed: May 3, 2008.
[13] DigitalGlobe, QuickBird Specifications. Available at: <http://www.digitalglobe.com/index.php/85/quickbird>, accessed: August 11, 2008.
[14] Gonzalez, R.C.; Woods, R.E.; Eddins, S.L., Digital Image Processing Using MATLAB, 1st edition. New Jersey: Prentice Hall, 2004.
[15] Garzelli, A., Possibilities and limitations of the use of wavelets in image fusion, IEEE International Geoscience and Remote Sensing Symposium (IGARSS '02), 24-28 June 2002, Vol. 1, pp. 66-68.
[16] Zhang, Y., Problems in the Fusion of Commercial High-Resolution Satellite Images as well as Landsat 7 Images and Initial Solutions, International Archives of Photogrammetry and Remote Sensing (IAPRS), GeoSpatial Theory, Processing and Applications, Vol. 34, Part 4, Ottawa, July 2002.
[17] Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P., Image quality assessment: from error visibility to structural similarity, IEEE Transactions on Image Processing, Vol. 13, Issue 4, April 2004, pp. 600-612.