IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING 1


This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication.

A New Adaptive Component-Substitution-Based Satellite Image Fusion by Using Partial Replacement

Jaewan Choi, Kiyun Yu, and Yongil Kim, Member, IEEE

Abstract: Preservation of spectral information and enhancement of spatial resolution are regarded as important issues in remote sensing satellite image fusion. In previous research, various algorithms have been proposed. Although they have been successful, there are still some margins of spatial and spectral quality that can be improved. In addition, a new method that can be used for various types of sensors is required. In this paper, a new adaptive fusion method based on component substitution is proposed to merge a high-spatial-resolution panchromatic (PAN) image with a multispectral image. This method generates high-/low-resolution synthetic component images by partial replacement and uses statistical ratio-based high-frequency injection. Various remote sensing satellite images, such as IKONOS-2, QuickBird, LANDSAT ETM+, and SPOT-5, were employed in the evaluation. Experiments showed that this approach can resolve spectral distortion problems and successfully conserve the spatial information of a PAN image. Thus, the fused image obtained from the proposed method gave higher fusion quality than the images from some other methods. In addition, the proposed method worked efficiently with the different sensors considered in the evaluation.

Index Terms: Adaptive fusion framework, color distortion, component substitution (CS) fusion, high-/low-resolution synthetic component image, partial replacement.

I. INTRODUCTION

AS THE remote sensing satellites IKONOS-2, QuickBird, and SPOT-5 were launched, they provided low-spatial-resolution multispectral (MS) images and high-spatial-resolution panchromatic (PAN) images. With regard to
remote sensing, it has been a difficult task to obtain high-spatial-resolution MS images because of the technical limitations of satellite sensors, the incoming radiation energy to the sensor, and the data volume handled by the sensors [1]. Image fusion is very important in that it aims to combine remotely sensed images that have different spectral and spatial resolutions for various applications, such as visualization to identify the textures and shapes of objects, feature extraction, and map updating [2]. For the past few years, many image fusion algorithms have been proposed. The image fusion method is also called pan sharpening because an MS image is sharpened by the injection of spatial details extracted from a PAN image.

Manuscript received April 26, 2009; revised September 13, 2009 and January 18, 2010. This work was supported by a National Research Foundation of Korea grant funded by the Korea Government (MEST) (No. ). The authors are with the Department of Civil and Environmental Engineering, Seoul National University, Seoul, Korea (e-mail: choijw11@snu.ac.kr; kiyun@snu.ac.kr; yik@snu.ac.kr). Color versions of one or more of the figures in this paper are available online. Digital Object Identifier /TGRS

Among the image fusion or pan-sharpening algorithms, the generally effective techniques are component-substitution (CS)-based methods, multiresolution analyses, and arithmetic model-based combinations. A representative CS technique is the intensity-hue-saturation (IHS) fusion method. However, classical IHS fusion can only fuse an RGB color image, and it significantly distorts color information because of the dissimilarity between the intensity and PAN images [3], [4]. To merge MS images of more than three bands, a fast IHS (FIHS) fusion technique was proposed. This can quickly merge a massive volume of data and reduce color distortion in the fused image [5]. In various research works, the FIHS is modified by the spectral response characteristics of the sensor or the model parameters,
such as in FIHS with spectral adjustment (FIHS-SA), generalized IHS (GIHS) with a tradeoff parameter (TP) in terms of spectral distortion and spatial enhancement, GIHS with a genetic algorithm (GIHS-GA), the fast spectral response function (FRSF) method, and general CS based upon the radiometric properties of sensors (GCOS) [5]-[9]. Another CS-based technique is the Gram-Schmidt (GS) spectral sharpening algorithm, which uses a GS orthogonalization procedure. This method was patented by Eastman Kodak and implemented in the Environment for Visualizing Images (ENVI) software [10]. GS spectral sharpening has various fusion modes in accordance with the selection of the low-resolution PAN image that is used in the forward GS transform. Among the various modes, GS adaptive (GSA), which uses multiple regression analysis for low-resolution synthetic image generation, turns out to be efficient in most cases [11]. CS-based approaches focus on making an ideal intensity image and a suitable high-frequency injection model to preserve spectral information. In CS-based fusion, some algorithms can be applied only to a specific sensor, although a few commercially available image fusion software tools have proven to be suitable for all available optical PAN and MS images. In addition, these tools still have a potential to improve the spectral quality, although they show visually prominent results.

Multiresolution analysis (MRA) based on wavelet decomposition is notable in that it can reduce the distortion of spectral information more than the CS-based fusion method. Among the multiresolution analyses, approaches based on the discrete wavelet transform are employed by merging couples of subbands of corresponding frequency content [12]. Recent wavelet analyses of image fusion have tended to center around the à trous algorithm, which is an undecimated wavelet transform [12]. The additive wavelet LHS (AWL) method, which takes à trous image fusion as its basis, injects the wavelet plane of the PAN image into every MS band [13].

Copyright (c) 2010 IEEE. Personal use is permitted. For any other purposes, permission must be obtained from the IEEE by e-mailing pubs-permissions@ieee.org.

Proportional AWL (AWLP) is the version of AWL that has been generalized

in order to apply it to a variety of remote sensing data. The AWLP method adaptively injects into the wavelet plane in a manner based upon the proportions between each MS band and their summation [14]. Similar to AWLP, the window spectral response (WiSpeR) method is a generalization of other wavelet-based image fusion algorithms [14]. Some research has demonstrated that the additive wavelet principal component (AWPC) method based on AWLP gives an outstanding result compared with other wavelet-based methods [15]. Recently, various Laplacian-based fusion methods with an adaptive spatial injection model, such as the spectral distortion minimization approach, the context-based decision (CBD) method of Ranchin, Wald, and Mangolini, and modulation-transfer-function (MTF)-tailored multiscale fusion, have been proposed [12], [16]-[18]. To overcome the limitations of the wavelet concept from the point of view of the expression of directional information and intrinsic geometrical structures, a combined adaptive PCA algorithm based on the discrete contourlet transform has been developed [19]. Among them, AWLP and the Laplacian-based CBD method were considered the best image fusion methods during the 2006 GRS-S Data-Fusion Contest, performing better on average than the CS-based methods [20]. However, spatial distortions may occur in using MRA-based fusion methods because of aliasing effects and the blurring of textures, and the spatial enhancement attainable with MRA-based methods is generally not satisfactory compared with CS-based methods [18]. Spatial distortions must be reduced, given that the most important virtue of an image fusion algorithm is a balance between spatial sharpness and spectral preservation.

To avoid these problems, arithmetic framework-based combinations have been developed and give more efficient outputs. The University of New Brunswick (UNB)-Pansharp method is implemented in PCI Geomatica software [21]. This method solves the color distortion problem by using the least squares technique and statistical approaches. A local-mean- and variance-matching (LMVM) filter was based on a normalization function at a local scale within the images to match the local mean and variance of the PAN image with those of an MS image [22], [23]. Its output is determined by the filtering window size of the LMVM. Generally, the larger the filtering window, the more structural information from the PAN image may be incorporated, but the more spectral distortion may occur [24], [25]. Various mathematical approaches, such as the steepest descent method to minimize the energy function between the fused and PAN images, the Bayesian data model, pixel neighborhood regularization by spectral consistency, and the restoration-based framework, have been proposed to remove the limitations of the existing fusion methods [26]-[29]. Sometimes, the arithmetic combinations demand the improvement of spatial and spectral quality because of the noise in the fused image arising from the lack of optimal spatial injection as a whole [20].

Although existing fusion algorithms work fine in some aspects, margins still have to be improved to preserve the spatial information of the PAN image and, at the same time, to minimize spectral distortions. Furthermore, some algorithms, such as FIHS-SA and GCOS, are restricted to a specific sensor type, and the spatial enhancement or spectral preservation of the fused image may not be satisfactory; hence, a new method that can be applied to various sensors and improves the quality of the fused image is required. Thus, in this paper, we modified the general CS-based fusion to improve the fused-image spectral quality, and the modified version is called the new adaptive CS-based image fusion. Through this new type of fusion, to be explained in a later section, we removed spectral distortion efficiently while preserving the spatial characteristics of the PAN image. In addition, the proposed method can be applied efficiently to various satellite sensors.

This paper is organized as follows. The generalized CS-based image fusion method is briefly described in Section II. The scheme of our proposed algorithm is reported in Section III. Construction of a high-/low-resolution component image is performed by means of a partial replacement between the MS and PAN images by using linear regression and a corresponding correlation coefficient. Thereafter, an adaptive injection model based on the statistical ratio is described. In Sections IV and V, we compare our algorithms with various representative fusion techniques. The results of the method applied to the IKONOS, QuickBird, LANDSAT ETM+, and SPOT-5 satellite images are reported with respect to the visual, spatial, and spectral quality. A conclusion is presented in Section VI.

II. GENERAL CS FUSION METHOD AND ITS DRAWBACKS

In the following section, we report on analyses of the general CS-based fusion framework, which is widely used to combine MS and PAN images to improve the spatial resolution of the fused image efficiently. In the remainder of this section, we clarify the cause of spectral/spatial distortion of fused images due to the spectral response of the sensor in CS-based fusion.

A. CS-Based Image Fusion Approach

A classical CS fusion approach is accomplished by a forward transformation into a specific feature space and an inverse transformation. The CS fusion method can be formulated with the following steps [5].

1) Resize the MS image to the size of the PAN image.
2) Transform the MS image into the specific feature space by applying (1), where the first component, which is similar to the low-spatial-resolution PAN image, is denoted I:

$$\begin{bmatrix} I \\ C_1 \\ \vdots \\ C_{n-1} \end{bmatrix} = \begin{bmatrix} c_{11} & c_{21} & \cdots & c_{n1} \\ c_{12} & c_{22} & \cdots & c_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ c_{1n} & c_{2n} & \cdots & c_{nm} \end{bmatrix} \begin{bmatrix} MS_1^l \\ MS_2^l \\ \vdots \\ MS_n^l \end{bmatrix} \quad (1)$$

3) Substitute the first component, namely, the image I, with the PAN image, which is histogram matched to I.
4) Carry out the inverse transformation to create the new high-spatial-resolution MS image by using

$$\begin{bmatrix} MS_1^h \\ MS_2^h \\ \vdots \\ MS_n^h \end{bmatrix} = \begin{bmatrix} w_{11} & w_{21} & \cdots & w_{n1} \\ w_{12} & w_{22} & \cdots & w_{n2} \\ \vdots & \vdots & \ddots & \vdots \\ w_{1n} & w_{2n} & \cdots & w_{nm} \end{bmatrix} \begin{bmatrix} PAN \\ C_1 \\ \vdots \\ C_{n-1} \end{bmatrix} \quad (2)$$
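The four steps above can be sketched in code. The following is a minimal NumPy illustration, not the paper's implementation: it uses PCA as the feature-space transform (one common CS choice; IHS is another) and mean/std matching as a stand-in for full histogram matching. The function name and array layout are our own.

```python
import numpy as np

def classical_cs_fusion(ms, pan):
    """Steps 1)-4) of classical CS fusion with PCA as the feature space.

    ms : (n_bands, H, W) MS image, already resized to the PAN grid (step 1).
    pan: (H, W) PAN image.
    """
    n, h, w = ms.shape
    x = ms.reshape(n, -1).astype(float)
    mean = x.mean(axis=1, keepdims=True)
    # Step 2: forward transform (1); rows of c are the PCA eigenvectors, and
    # the first component plays the role of the intensity image I.
    eigval, eigvec = np.linalg.eigh(np.cov(x))
    c = eigvec[:, np.argsort(eigval)[::-1]].T
    comps = c @ (x - mean)                    # [I, C_1, ..., C_{n-1}]^T
    # Step 3: match PAN to I (mean/std matching as a stand-in for histogram
    # matching) and substitute it for the first component.
    i, p = comps[0], pan.reshape(-1).astype(float)
    comps[0] = (p - p.mean()) / p.std() * i.std() + i.mean()
    # Step 4: inverse transform (2); for orthonormal PCA, the inverse is c^T.
    return (c.T @ comps + mean).reshape(n, h, w)
```

Because the substituted component is matched to the zero-mean intensity, the per-band means of the output equal those of the input MS image, which is one way to see why CS fusion preserves low-frequency radiometry.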

In (1) and (2), MS_n^l is the nth band of the resized MS image and MS_n^h is the nth band of the fused MS image; C_n is the nth component of the transformation, and c and w are the elements of the forward- and inverse-transformation matrices, respectively.

Because of the two-stage transformation matrix calculation, the classical CS fusion method incurs a higher computational cost. The general CS fusion technique overcomes this problem by using a simple linear equation. First, classical CS fusion can be reformulated as [9]

$$\begin{bmatrix} MS_1^h \\ \vdots \\ MS_n^h \end{bmatrix} = \begin{bmatrix} w_{11} & \cdots & w_{n1} \\ \vdots & \ddots & \vdots \\ w_{1n} & \cdots & w_{nm} \end{bmatrix} \begin{bmatrix} I + (PAN - I) \\ C_1 \\ \vdots \\ C_{n-1} \end{bmatrix} = \begin{bmatrix} w_{11} & \cdots & w_{n1} \\ \vdots & \ddots & \vdots \\ w_{1n} & \cdots & w_{nm} \end{bmatrix} \begin{bmatrix} I \\ C_1 \\ \vdots \\ C_{n-1} \end{bmatrix} + \begin{bmatrix} w_{11} \\ w_{12} \\ \vdots \\ w_{1n} \end{bmatrix} (PAN - I) = \begin{bmatrix} MS_1^l \\ MS_2^l \\ \vdots \\ MS_n^l \end{bmatrix} + \begin{bmatrix} w_{11} \\ w_{12} \\ \vdots \\ w_{1n} \end{bmatrix} (PAN - I). \quad (3)$$

We can then obtain the linear equation (4), which follows from (3):

$$MS_n^h = MS_n^l + w_n (PAN - I) = MS_n^l + w_n \, \delta \quad (4)$$

where w_n is the nth modulation coefficient and δ is the spatial detail of the PAN image. General CS fusion depends upon the weighted parameters for the formation of δ and the modulation coefficient w_n [9], [30].

B. Spectral/Spatial Distortion Owing to the Relative Spectral Response of the Sensor

In a general fusion framework, a theoretical high-spatial-resolution MS image is formulated by the decomposition of its high- and low-frequency components [31]:

$$MS_n^h = \mathrm{High}\left(MS_n^h\right) + \mathrm{Low}\left(MS_n^h\right) \quad (5)$$

where High(MS_n^h) denotes the high-frequency information of the nth band of the ideal high-spatial-resolution MS image and Low(MS_n^h) is its low-frequency data. As is well known, we cannot obtain a theoretical MS image MS_n^h, and (5) must be reconstructed using an existing low-spatial-resolution MS image and a high-spatial-resolution PAN image [31]. As the low frequency of the high-spatial-resolution MS image approximates a spatially degraded MS image, we can substitute the original MS image for the low-frequency data of the theoretical MS image. Simultaneously, we can hypothesize that the high-frequency data of the high-spatial-resolution MS image is intimately linked with the relationship between the PAN image and the corresponding MS image, provided that the high-spatial-resolution MS image is highly correlated with the PAN image. For these reasons, (5) can be written as the mathematical approximation

$$MS_n^h = \mathrm{High}\left(MS_n^h\right) + \mathrm{Low}\left(MS_n^h\right) \approx \mathrm{High}\left(MS_n^h\right) + MS_n^l \approx w_n \, \mathrm{High}\left(PAN, MS_n^l\right) + MS_n^l \quad (6)$$

where High(PAN, MS_n^l) denotes the high-frequency information extracted from the PAN image and the nth band of the MS image, and w_n represents the relative weighted parameter of the nth band of the MS image. The final form of CS-based fusion, such as (4), bears a resemblance to the mathematical relationship of (6). Therefore, the CS-based fusion method must satisfy the hypotheses of (6) so that the quality of the fusion result is similar to that of an ideal high-spatial-resolution MS image. That is to say, the PAN image must be highly correlated with, or have similar spectral/spatial characteristics to, each MS image because the high frequency of the theoretical MS image is approximately equal to that obtained from the mathematical analysis of the PAN and MS images. Ideally, each band of the satellite images must be covered by the spectral range of the PAN band in order for each MS band to have a highly correlated spectral response with PAN. However, the wavelength range of the PAN band used in commercial satellites may not overlap each MS band, as shown in the normalized spectral response functions in Fig. 1. The blue and green band levels of the IKONOS sensor are greater than that of the PAN band, and the wavelengths of the PAN band exceed the near-infrared (NIR) band zone. In the case of LANDSAT ETM+ and SPOT-5, some bands are not covered by the wavelengths of the PAN image, for example, bands 1, 5, and 7 of LANDSAT ETM+ and band 3 of SPOT-5. Consequently, color distortion of the fused image is an inevitable consequence of the injection of high-frequency information, owing to the global dissimilarity or low correlation between the PAN image and each MS band in their relative spectral response functions.

On the other hand, in the extraction of the high-frequency data, which corresponds to the parameter δ in (4), many algorithms use the difference between a histogram-matched PAN image and a first-component image obtained from the MS bands. Each image band has its own unique bandwidth and statistical characteristics. When the spatial details of the PAN image included in δ are injected separately into the MS bands, some excessive spectral information may be received because of the dissimilarity within the MS bands. If the modulation coefficient is set too low in order to avoid this problem, the spatial quality of the fused image declines [6]. As a general result, the injection model of edge information allows the fused image to contain identical spatial information; however, the spectral property is relatively deprived because the PAN and first-component images have globally low correlation with the corresponding MS image. To determine the optimal coefficient, researchers have suggested various methods and experimental results as bases for selecting and optimizing the corresponding parameters to make the fused image similar to

the original MS image as efficiently as possible [6], [7], [32]. FIHS-SA solves this problem through the experimental coefficient in the IKONOS satellite image [5]. In GIHS-GA, every parameter is obtained by means of an optimization technique, such as through engaging a genetic algorithm [7]. FIHS-TP used the tradeoff experimental coefficient, and the FRSF, GCOS, and GS spectral sharpening algorithms are obtained by mathematical sensor and data analysis. However, most algorithms do not provide an optimal result because they do not reflect the characteristics of the scene, and some parameters are fixed for a specific satellite image. Moreover, as is well known from [33], the PAN and MS images may present some local instability or dissimilarity, such as object occultation or contrast inversion. If these effects are not taken into account along with the correlation of the PAN and MS images, the fused result may suffer from artifact effects and global/local spectral dissimilarity with the original MS image. This is because spectral/spatial information that differs from the characteristics of the MS bands is injected into the fusion processing. Consequently, the global and local dissimilarities between the PAN image and the corresponding MS image must be considered in the CS-based fusion process.

Fig. 1. (a) IKONOS spectral response. (b) SPOT-5 spectral response. (c) LANDSAT ETM+ spectral response.

III. ADAPTIVE CS IMAGE FUSION USING PARTIAL REPLACEMENT

As previously mentioned, we propose the new adaptive CS image fusion method that uses partial replacement to remove spectral distortion and to preserve the original spatial characteristics, regardless of the type of satellite sensor. Thus, spatial details can be extracted from the PAN image without introducing spectral/spatial distortion in every MS band. Our method is organized into two parts. The first step is the construction of a high-/low-resolution component image by using partial replacement between the PAN and MS images, and the next step is to assemble an adaptive CS fusion model that minimizes the global/local spectral dissimilarity between the PAN image and each MS band while preserving the spatial details of the original PAN image. The MS image used in the fusion framework is resampled to the size of the original PAN image by using bicubic interpolation. The concept diagram of the method is set out in Fig. 2.

A. Construction of a High-/Low-Resolution Component Image Using Partial Replacement

The digital number (DN) values of an image depend on the spectral response function of a sensor. The spectral relationship between the PAN and MS images is not fixed because the spectral characteristics change with every object, area, and circumstance. Therefore, the experimental parameters used in establishing δ cause nonstable results, since they may differ in each case, and they are arrived at by taking an average of the MS bands or by considering connections only between blue and green [5]. We used a linear regression algorithm to produce the optimal intensity image. Its model can be defined by the following equation [34], [35]:

$$PAN^l = \alpha_0 + \sum_{n=1}^{N} \alpha_n MS_n^l \quad (7)$$

where PAN^l is the spatially degraded low-spatial-resolution PAN image obtained by bicubic decimation, α is the regression coefficient, N represents the number of spectral bands, and MS_n^l is the nth MS band, bicubically resampled to the size of the PAN data. In this linear regression model, the degraded PAN is applied as the response variable instead of the original PAN in order to use the similarity between the low-spatial-resolution PAN and MS images. The α values are calculated directly using least squares estimation. The initial intensity image is then produced using the following equation:

$$I^l = \alpha_0 + \sum_{n=1}^{N} \alpha_n MS_n^l \quad (8)$$

where I^l is the initial intensity image and α is the regression coefficient acquired by (7).
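The regression of (7)-(8) can be sketched as follows. This is an illustrative NumPy implementation with names of our own choosing; simple block averaging stands in for the paper's bicubic decimation.

```python
import numpy as np

def regression_intensity(ms_l, pan, scale=2):
    """Synthetic intensity image via the multiple regression of (7)-(8).

    ms_l : (N, H, W) MS bands resampled to the PAN grid.
    pan  : (H, W) PAN image.
    scale: resolution ratio; block averaging stands in for bicubic decimation.
    """
    n, h, w = ms_l.shape

    def degrade(img):
        # Spatially degrade to the MS scale by block averaging.
        return img.reshape(h // scale, scale, w // scale, scale).mean(axis=(1, 3))

    # Fit (7) at low resolution: PAN^l ~ alpha_0 + sum_n alpha_n * MS_n^l.
    A = np.column_stack([np.ones((h // scale) * (w // scale))] +
                        [degrade(b).ravel() for b in ms_l])
    alpha, *_ = np.linalg.lstsq(A, degrade(pan).ravel(), rcond=None)
    # Apply the fitted coefficients at full resolution, eq. (8).
    i_l = alpha[0] + sum(a * b for a, b in zip(alpha[1:], ms_l))
    return i_l, alpha
```

If the PAN image happens to be an exact linear combination of the MS bands, the fit recovers the coefficients exactly and the intensity image reproduces the PAN image, which is the ideal case the regression is approximating.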

Fig. 2. Diagram of the proposed fusion method. (a) Construction of a high-/low-resolution component image using partial replacement. (b) Adaptive fusion framework for minimizing the global/local dissimilarity.

General CS fusion techniques use the low-spatial-resolution synthetic component image for extracting the spatial details of the original PAN image. Notwithstanding the strong correlation between the PAN and synthetic component images, each MS band retains a spectral characteristic different from those of the PAN and low-spatial-resolution component images. The PAN image must therefore comprise a separate property for each MS band to avoid overinjection by the low-spatial-resolution component. We developed the new high-spatial-resolution component image by using the low-spatial-resolution synthetic component and MS images and by modifying partial replacement [33]. It is constructed to meet the spectral characteristics of the individual MS bands. The correlation coefficient between the low-spatial-resolution synthetic component image and each MS band histogram matched with I^l is estimated to generate the high-resolution component image. Thereafter, by using each correlation coefficient, the new high-resolution component image is computed using

$$I_n^h = CC_n \cdot PAN + (1 - CC_n) \cdot MS_n^l \quad (9)$$

where CC_n is the correlation coefficient between the low-spatial-resolution synthetic component image and the nth MS band image, I_n^h is the high-spatial-resolution component image corresponding to the nth MS band, and MS_n^l is the nth MS band histogram matched with the PAN image. Zhang and Hong [3] proposed a similar method by using partial replacement between the PAN and intensity images based on the wavelet-IHS integrated method. However, their method cannot be applied optimally to an MS image beyond the RGB bands, nor can it remove the effect of high-frequency injection in the PAN image, because it partially replaces only the LL wavelet coefficients between the PAN and intensity images. Therefore, (9) focuses on the construction of a synthetic component image and the optimization of the correlation between the PAN image and each MS band by achieving a partial replacement, in order to obtain optimal high-frequency information through minimization of the global spectral/spatial difference in specific regions. If the PAN image and a specific MS band have a correlation of one, then the high-spatial-resolution component image of that MS band is equal to the original PAN image. If the correlation between the PAN image and the MS band is below one, then the high-spatial-resolution component image is partially replaced by the low-spatial-resolution MS image. The portion of the replacement is decided by the reorganized correlation coefficient between the PAN image and each MS band. In the correlation reconstruction procedure, the proportion of PAN serves to keep up the spatial details, whereas that of MS is intended to preserve the spectral information in a specific region.

Notably, the initial intensity image I^l may have low correlation with the corresponding MS band. This can give rise to another distortion when high-frequency information is calculated to correspond with the spectral characteristics of I_n^h and I^l. We therefore reorganized the low-resolution synthetic component image based on the newly defined high-resolution synthetic image. First, I_n^h was spatially degraded in replacement of the low-resolution PAN image of (7). The low-resolution synthetic image I_n^l was then recalculated through (7) and (8) using the spatially degraded I_n^h and the MS bands. The high-resolution component image derived through (9) was adaptively used in deriving δ. Although the processing described earlier may be applied iteratively, we did not use the low-/high-spatial-resolution synthetic component generation method more than once. The more that iterative processing is applied, the larger the correlation and local similarity between the PAN image and each MS band become; however, the high-spatial-resolution synthetic component image can converge to the original MS band.
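The partial replacement of (9) can be sketched in a few lines. This is an illustrative NumPy version with our own function names; mean/std matching stands in for histogram matching throughout.

```python
import numpy as np

def partial_replacement(pan, ms_l, i_l):
    """High-resolution synthetic component per band via eq. (9):
    I_n^h = CC_n * PAN + (1 - CC_n) * MS_n^l.

    pan : (H, W) PAN image.
    ms_l: (N, H, W) resampled MS bands.
    i_l : (H, W) low-resolution synthetic intensity image.
    """
    def match(src, ref):
        # Mean/std matching as a stand-in for histogram matching.
        return (src - src.mean()) / src.std() * ref.std() + ref.mean()

    out = []
    for band in ms_l:
        # CC_n: correlation between I^l and the band matched to I^l.
        cc = np.corrcoef(i_l.ravel(), match(band, i_l).ravel())[0, 1]
        # Blend PAN with the band matched to PAN, in proportion to CC_n.
        out.append(cc * pan + (1.0 - cc) * match(band, pan))
    return np.stack(out)
```

As the text notes, a band perfectly correlated with the synthetic intensity yields CC_n = 1 and a component equal to the PAN image, while lower correlation shifts the component toward the (matched) MS band.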

Such a converged image suffers from a decrease of the original high frequency of the PAN image. For this reason, we did not apply the iterative processing again. Local dissimilarity problems remained, yet these were resolved by using the following adaptive fusion framework.

B. Adaptive Fusion Framework for Minimizing the Global/Local Dissimilarity

In order to inject the high-frequency information of the original PAN image, the δ_n of (4) is composed as follows:

$$\delta_n = \left(I_n^h - \overline{I_n^h}\right) - \left(I_n^l - \overline{I_n^l}\right) \quad (10)$$

where \overline{I_n^h} is the mean value of the high-resolution synthetic component image I_n^h histogram matched with I_n^l, and \overline{I_n^l} is the mean value of the low-resolution synthetic image I_n^l. Although the low-/high-resolution synthetic component image is highly correlated with each MS band, which means that it has spectral information similar to each MS band and includes sufficient high-frequency information, it has statistical characteristics that differ from those of the MS bands, such as dynamic range, mean, and standard deviation. If the detailed information of the high-resolution component is not considered when it is injected into the respective MS bands, the difference in the dynamic range of DN values among the bands has a significant effect on the spectral and spatial quality of the merged image. To preserve the essential characteristics of the MS bands, we applied the statistical ratio of the DN values and added a term complementing the CS model in general CS fusion. The w_n of (4) and (6) is given as

$$w_n = \beta \cdot \mathrm{corr}\left(I_n^l, MS_n^l\right) \cdot \frac{\mathrm{std}\left(MS_n^l\right)}{\frac{1}{N}\sum_{n=1}^{N} \mathrm{std}\left(MS_n^l\right)} \quad (11)$$

where std is the standard deviation of the corresponding image, I_n^l is the final low-resolution synthetic component image of the nth MS band, N is the total number of bands, and corr(I_n^l, MS_n^l) is the coefficient of correlation between I_n^l and MS_n^l. β is a constant parameter for stabilization with respect to the dynamic range of the initial input data. The dynamic range of the detailed information for each image decreases with the decrease in the ratio of the standard deviation of the corresponding band to the mean standard deviation of all bands. This enables the detailed information to be injected into the MS image without loss of spectral and spatial information. The standard deviation ratio reflects the spectral distortion due to the differences in standard deviation among the bands. In addition, the input data may have various radiometric resolutions; the parameter β adjusts and normalizes the high frequencies so that they lie in the corresponding dynamic range. The correlation coefficient of (11) adjusts the relative magnitude of the high-frequency information to minimize the global dissimilarity between the high-spatial-resolution image and each MS band.

Although the parameter w_n modulates the difference of dynamic range and the global dissimilarity in the injection of high frequency, the local instability between the high-resolution image and the MS band deteriorates the spatial/spectral quality of the fused image. We therefore added an adaptive factor to remove the local spectral instability error between the synthetic component image and the MS band.

Fig. 3. Example of local spectral dissimilarity. (a) Blue band of the IKONOS image. (b) Histogram-matched low-resolution synthetic component image of the blue band.

Fig. 4. (a) Local instability adjustment parameter of the area of Fig. 3. (b) Parameter histogram of (a).

In Fig. 3, the synthetic component image has spectral characteristics different from those of the corresponding MS band. In the case of the low-resolution synthetic component image histogram matched to the MS band, the spectral ratio can represent the local dissimilarity between the synthetic image and the MS band. If the spectral ratio is not approximately one, the spectral pixel value of the synthetic component image is locally different from the corresponding MS band pixel. By using this property, the local instability adjustment parameter L_{I_n} is generated as

$$L_{I\_n} = 1 - \left|1 - \mathrm{corr}\left(I^l, MS_n^l\right) \cdot \frac{MS_n^l}{I_n^l}\right| \quad (12)$$
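A compact sketch of the injection model in (10)-(12) follows. This is an illustrative NumPy version, not the authors' code: the components are taken per band, mean/std matching stands in for histogram matching, and the final combination MS^h = MS^l + w_n * L_I_n * δ_n is our assumption, since the paper's combining equation (13) lies outside this excerpt.

```python
import numpy as np

def adaptive_injection(ms_l, i_h, i_l, beta=1.0):
    """High-frequency injection using (10)-(12).

    ms_l: (N, H, W) resampled MS bands (assumed strictly positive DN values).
    i_h : (N, H, W) high-resolution synthetic components, one per band.
    i_l : (N, H, W) low-resolution synthetic components, one per band.
    NOTE: the last line combines the terms as MS^l + w * L_I * delta, which is
    an assumption; eq. (13) of the paper is outside this excerpt.
    """
    n = ms_l.shape[0]
    mean_std = np.mean([band.std() for band in ms_l])
    fused = np.empty_like(ms_l, dtype=float)
    for b in range(n):
        band, ih, il = ms_l[b], i_h[b], i_l[b]
        # eq. (10): mean-adjusted difference of the matched high-resolution
        # component and the low-resolution component.
        ih = (ih - ih.mean()) / ih.std() * il.std() + il.mean()
        delta = (ih - ih.mean()) - (il - il.mean())
        corr = np.corrcoef(il.ravel(), band.ravel())[0, 1]
        # eq. (11): global statistical-ratio modulation coefficient.
        w = beta * corr * band.std() / mean_std
        # eq. (12): per-pixel local instability adjustment.
        li = 1.0 - np.abs(1.0 - corr * band / il)
        fused[b] = band + w * li * delta
    return fused
```

When the high- and low-resolution components coincide, δ_n vanishes and the fused bands reduce to the original MS bands, which matches the intuition that only genuine high-frequency differences are injected.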

TABLE I. CHARACTERISTICS OF SATELLITE SENSORS USED IN EXPERIMENTS

Fig. 5. Color compositions of the test regions. (a) First IKONOS set (R, G, and B bands). (b) Second IKONOS set (NIR, R, and G bands). (c) QuickBird set (R, G, and B bands). (d) LANDSAT ETM+ set (mid-IR, R, and G bands). (e) SPOT-5 set (NIR, R, and G bands).

The primary role of (12) is the transformation of the spectral ratio. First, the relative values of regions with different spectral characteristics are normalized by taking an absolute value. A spectral ratio below or above one means that the pixel may differ from the corresponding MS band pixel and may have incurred local spectral/spatial distortion. By locally decreasing the magnitude of the injected spectral/spatial information through the L_I_n parameter, we maintain the spectral/spatial response of the MS band. To account for global instability, we add the correlation coefficient in (12) to adjust the global spectral difference. As seen in Fig. 4(a), the local instability adjustment parameter efficiently extracts the locally unstable areas: the darker the L_I_n value, the greater the dissimilarity between the synthetic component image and the corresponding MS band. Some areas, such as roads and crops, have dark values below 0.7; however, features such as houses and areas containing edge information have L_I_n values close to one. Locally similar areas tend toward one (about 0.9), as seen in Fig. 4(b). This means that L_I_n does not detract from the injected high-frequency component and only attenuates areas whose spectral properties differ from those of the original MS band. Summing up, our new adaptive CS-based fusion technique is organized by (13)
based on (7)–(12), and through (13), we can obtain an optimal fused image that minimizes spectral distortion locally and globally while preserving the spatial quality:

MS_n^h = MS_n^l + w_n · δ_n · L_I_n    (13)

IV. EXPERIMENTS

A. Experimental Data Sets

To estimate the performance of the proposed fusion algorithm, various satellite sensors, such as IKONOS, QuickBird, LANDSAT ETM+, and SPOT-5, were selected. Each MS image was resampled to the spatial size of the corresponding PAN image by bicubic interpolation and coregistered for each data set. All the MS and PAN imagery has specific spectral wavelength, radiometric resolution, and spatial resolution characteristics, which are summarized in Table I. An overview of the data sets is shown in Fig. 5, and the details are as follows.

1) IKONOS: The IKONOS imagery used for the fusion performance estimation was acquired on November 19, 2001. The site is Daejeon, Korea, and includes various land-cover types. Site 1 [Fig. 5(a)] contains forest, golf course, crop, road, and building areas, mainly for estimating fusion quality in vegetated areas. IKONOS test site 2 [Fig. 5(b)] represents a complex urban area, a river, and some crops for evaluating an urban area.

2) QuickBird: The QuickBird image was acquired on January 15, 2005. The site and size are equivalent to those of the IKONOS test sites. A forest and amusement park area was selected for analysis, as seen in Fig. 5(c).

3) LANDSAT ETM+: This is a complex urban and forest area, acquired on September 23, 2001. The original satellite image is composed of seven bands. The sixth band is excluded from the fusion test because it has a different spatial resolution from that of the other

bands. Therefore, we used a total of six bands (1–5 and 7). Only the green, red, and NIR bands lie within the spectral wavelength of the PAN image; this disagreement affects the fusion quality differently than with the IKONOS and QuickBird data sets. For that reason, most fusion methods fuse only the three bands of green, red, and NIR. However, we used all six bands, excluding only band 6 because of its different spatial resolution, even though some of their wavelengths lie outside that of the PAN image. The spatial resolution ratio between the PAN and MS images also differs from that of the other satellite data sets, as seen in Table I.

4) SPOT-5: The sensor provides a 5-m-resolution PAN image and a 10-m-resolution MS image. A high-spatial-resolution image of 2.5 m is obtained by using two 5-m-resolution PAN images acquired simultaneously. The spectral wavelength of the PAN band covers only the green and red bands, and the fourth MS band is excluded from the experiments because of its coarse spatial resolution of 20 m. The site comprises water, forest, varied crops, and urban areas, and the data set was acquired on June 28, 2002.

B. Quality Assessment

For a quantitative assessment of a fusion result, the fused image should be compared with the ideal high-spatial-resolution MS image that would be observed by an MS sensor whose spatial resolution equals that of the PAN image [15]. Because this theoretical MS image cannot be obtained, many researchers derive spatially degraded PAN and MS images from the original data sets to evaluate the result quantitatively. This rests on the assumption that a fusion method that is optimal on spatially degraded data is also suitable for the fine-scale original data [36]. Based on this hypothesis, various statistical
evaluation indicators of spatially degraded data, such as the correlation coefficient, the erreur relative globale adimensionnelle de synthèse (ERGAS), the spectral angle mapper, and the Q4 index, have been proposed [20], [37]–[40]. However, Alparone et al. [36] and Zhang [41] confirm that these indicators are incapable of evaluating image fusion quality: the measured values can be improved by systematic adjustments, such as mean shifting and histogram stretching, without a genuine improvement in the fusion result [41]. In addition, Alparone et al. [36] point out that quantitative evaluation through downgrading of the original data is not suitable for high-resolution imagery. Therefore, existing evaluations using downgraded data cannot be an optimal measurement at the original spatial scale, because an evaluation obtained on downgraded data may not be identical to one obtained on the original data. To overcome this problem, more effective indicators should be employed for quantitative evaluation. A novel approach, known as quality with no reference (QNR), assesses the fusion quality without reference to a high-spatial-resolution MS image [36]. The QNR method is based on the quality index Q, which measures the local correlation, luminance, and contrast between two images. It is assumed that the interband spectral quality of the fused data, seen as the similarity relationship between bands, is unchanged after fusion. The Q index, defined by Wang and Bovik [39], is designed for the comparison of a reference image x and a test image y. It is calculated as

Q(x, y) = 4σ_xy x̄ ȳ / [(σ_x² + σ_y²)(x̄² + ȳ²)]    (14)

where σ_xy is the covariance between x and y, σ_x² and σ_y² are the variances of x and y, and x̄ and ȳ are the means of x and y, respectively. Equation (14) can be rewritten as the
combination of three factors as follows:

Q(x, y) = [σ_xy / (σ_x σ_y)] · [2x̄ȳ / (x̄² + ȳ²)] · [2σ_x σ_y / (σ_x² + σ_y²)]    (15)

The first factor is the correlation coefficient, with dynamic range [−1, 1]. The second term measures the similarity of the mean luminance between x and y, with range [0, 1]. The third factor measures how close the contrasts of the two images are; its range is also [0, 1]. Therefore, the range of the Q index is [−1, 1], and the best value, Q = 1, is achieved if x = y for all pixels [39]. The closer the Q index is to 1.0, the more similar the merged image is to the original one. The Q index is calculated over a sliding window of size N × N that moves across all rows and columns of the image in order to increase the differentiation capability and measure the local distortion of a fused image. Finally, the Q index is averaged over all the local Q values to obtain the global score.

Based on the Q index, two distortion indices of the fused image can be derived without a reference image. The first index is the spectral distortion D_λ. When l is the index of the MS band and N is the total number of bands, the low-spatial-resolution MS images are {G̃_l}, l = 1, …, N, and the fused MS images are {Ĝ_l}, l = 1, …, N. D_λ is defined from the dissimilarity of the interband Q indices of the fused MS image as

D_λ = [ 1/(N(N−1)) · Σ_{l=1}^{N} Σ_{r=1, r≠l}^{N} |Q(Ĝ_l, Ĝ_r) − Q(G̃_l, G̃_r)|^p ]^{1/p}    (16)

where p is a positive integer exponent chosen to emphasize large spectral differences. All differences are equally weighted for p = 1, and large components are given more relevance as p increases [36]. The spectral distortion index measures the relative relationship between the interband Q indices of the fused and original images. When no spectral distortion occurs in the fused image, the Q indices of the fused and original MS images are equivalent, and the spectral distortion index is zero. Considering that negative values of the Q index are clipped below zero, D_λ will be less than one. A spatial distortion index
(D_s) is defined as

D_s = [ (1/N) · Σ_{l=1}^{N} |Q(Ĝ_l, P) − Q(G̃_l, P̃)|^q ]^{1/q}    (17)

where P is the original PAN image, P̃ is the spatially degraded version of the PAN image obtained with a decimation low-pass filter, and q is a parameter that emphasizes higher difference values. If the high-frequency information is ideally injected

into the fused image, the interband relationship between the Q indices of the fused and PAN images and that of the low-resolution MS and spatially degraded PAN images is identical. Therefore, D_s has the same range of [0, 1] as D_λ. Notably, D_s can measure the local spatial instability of the fused image, in contrast with results obtained using the Zhou protocol, which employs a Laplacian filter [42]. The two distortion indices by themselves are inadequate for ranking the performances of fusion algorithms, notwithstanding that each index can measure the distortion of a fused image [36], [43]. To analyze each fusion method quantitatively, the QNR index is proposed; the spectral and spatial distortion characteristics jointly determine a nonlinear response that achieves a better discrimination of the compared fusion results:

QNR = (1 − D_λ)^a · (1 − D_s)^b    (18)

where a and b denote weighting parameters that emphasize the corresponding distortion index. The QNR index therefore equals one if no spectral/spatial distortion exists, and the fused image can be evaluated efficiently by means of a simple quality index at a fine scale equal to that of the PAN image.

V. RESULT COMPARISONS

The original full-resolution PAN and MS images were fused, and the resulting images were examined visually and quantitatively. First, each fusion procedure was preceded by coregistration of the MS and PAN images and by histogram matching. Bicubic interpolation was used for the up- and downsampling needed to resize the MS bands and PAN imagery. The proposed fusion method runs automatically on the basis of the constant parameter β in (11). Empirical analysis of the various sensor data sets determined β as 0.95 for the 11-bit data and 1.95 for the 8-bit
data. For visual estimation, the display of all images was made consistent by employing a 2% linear stretch in ENVI, a commonly used remote sensing software package. For the quantitative comparison, we used the QNR index with N = 16 based on the low-resolution MS and PAN images, and the parameters p, q, a, and b were all set to one. The QNR value was estimated from the original MS, PAN, and fused images without stretching, to determine the absolute differences among images. The fusion results of our algorithm were compared with those of the following widely used fusion algorithms and software packages.

1) GIHS: the GIHS fusion method [44]. The intensity image was obtained as the average of all MS bands.
2) AWLP, AWPC: outstanding MRA-based fusion methods [14], [15]. AWLP was a joint winner of the 2006 IEEE Data Fusion Contest. For AWPC, histogram matching between the PC1 data and the PAN image was applied. The weight factor of AWLP was selected using the ratio between each MS band and the mean value of all MS bands. The PAN image histogram was matched to each MS band.
3) LMVM: a fusion algorithm implemented in image-processing software packages [22]–[25]. The matching window filter was set to the same size in all cases.
4) UNB-Pansharp: the fusion method implemented in PCI Geomatica, using the same sharpness as that of the original PAN imagery [21].
5) GSA: GS spectral sharpening in ENVI 4.1 software [10], [11], [45]. The low-resolution PAN image is obtained by (7) and (8).

A. IKONOS Data Sets

Fig. 6(a) is an MS image with a resolution of 4 m, and Fig. 6(b) is the corresponding PAN image with a resolution of 1 m; these were used in evaluating the color information in the fused image. In the visual comparison, GIHS fusion shows the worst color preservation for asphalt and some building roofs. The MRA-based methods and the LMVM result [Fig. 6(d)–(f)] are less sharp than the other algorithms and show color distortion in some of the asphalt areas. In the MRA-based results of Fig. 6(d)–(f), there are blurring
effects and spatial dissimilarities for buildings and parking areas, as confirmed by the spatial distortion index in Table II. The LMVM fusion result omits spatial information on small objects and the forest area in spite of a good spatial distortion index. These omissions are consistent because D_s not only reflects the spatial sharpness of the fused image but also represents the spectral dissimilarity, spatial occlusion, and overall quality of the fused image [45]; it is therefore plausible that the LMVM result, although blurred in places, has good D_s and QNR values. In the UNB-Pansharp and GSA results, the parking lots and areas of asphalt in Fig. 6(g) and (h) have darker backgrounds than they do in the EXP result and in our fusion result, although UNB-Pansharp visually shows the best spatial sharpness. This suggests that the spectral quality of the UNB-Pansharp image may be decreased because the image is oversharpened to improve spatial quality. Our method displays the most impressive color fidelity and spatial detail. Table II sets out the quantitative results for the fused images using the QNR index for the various areas, namely the vegetated area of site 1 and the complex area of site 2. EXP denotes the result of resampling the low-resolution MS imagery by bicubic interpolation without any fusion processing. EXP gives the best spectral distortion index because the interband characteristics are maintained without the injection of spatial information. In the quantitative estimation, the GIHS method yields the lowest spectral/spatial quality. The AWLP and AWPC methods give similar results; however, the AWPC approach provides a better QNR index on the whole. The results obtained using our fusion method show the best spectral/spatial values and QNR index. The GSA and LMVM fusion results also perform well on the spatial distortion index, and the LMVM spatial distortion indices are higher than those of our proposed algorithm. However, the LMVM fusion result does
not have the visually clear detail of our proposed algorithm. Therefore, our result is the closest to the original MS image overall for the IKONOS sensor.

B. QuickBird Data Sets

The general results obtained using QuickBird imagery are visually similar to those derived from the IKONOS images.

Fig. 6. False color (NIR, R, and G) compositions of the IKONOS data set of site 2 ( regions). (a) MS. (b) PAN. (c) GIHS. (d) AWLP. (e) AWPC. (f) LMVM. (g) UNB-Pansharp. (h) GSA. (i) Proposed method.

TABLE II. COMPARATIVE IKONOS FUSION RESULT

LMVM fusion has a higher QNR index than the other algorithms, except for our method; however, it suffers from blurring effects and a lack of spatial detail in vegetated areas and on building roofs, as shown in Fig. 7(f). The AWPC fusion result also shows areas of blurring. The results of AWLP, GSA, and our method have better spatial/spectral quality than those of the other fusion methods. However, the backgrounds of the AWLP and GSA fusion results appear darker or brighter than the background in our result. UNB-Pansharp renders the soil area of Fig. 7(g) with distorted color compared with Fig. 7(a), and its low QNR index is caused by the overinjection of high-frequency information from the PAN image. Our fusion results show good color preservation of grass areas and good spatial sharpness overall. The strong spectral distortion index shows that our proposed algorithm efficiently handles the global/local dissimilarity between MS bands. Most results are

Fig. 7. True color (R, G, and B) compositions of the QuickBird data set ( regions). (a) MS. (b) PAN. (c) GIHS. (d) AWLP. (e) AWPC. (f) LMVM. (g) UNB-Pansharp. (h) GSA. (i) Proposed method.

similar to those for IKONOS in that every spectral estimation index for the proposed algorithm quantitatively outperforms those of every other fusion algorithm, as shown in Table III and Fig. 7.

C. LANDSAT ETM+ Data Sets

The LANDSAT ETM+ image has different properties from those of the IKONOS or QuickBird imagery. The spatial resolution ratio between the PAN and MS images is lower than that of the high-spatial-resolution sensors, and the wavelengths of some bands are not overlapped by that of the PAN image. Because of these attributes, the spatial distortion of the EXP image is lower than that for the IKONOS or QuickBird sensor, and most fusion algorithms produce spectrally distorted images. Except for GSA fusion and our method, most algorithms distort the spectral information of the vegetated area, as in Fig. 8. In GIHS fusion, most of the area does not match the color of the resampled MS image. The MRA-based fusion and UNB-Pansharp results show distorted color for the golf course and mountain areas. Fig. 8(f) shows the LMVM fusion, in which the spatial characteristics of the urban regions are less sharp than for the other algorithms. GSA fusion and our method show almost no color distortion and render accurate spatial details in the fused image. Table IV shows the QNR index results for the corresponding fusion results. The MRA-based fusion results have good D_s values; however, their D_λ values are low compared with those of the IKONOS and QuickBird fusion results. AWPC gives a notably lower spectral distortion value than AWLP because the AWPC method considers
the dissimilarity between the PAN wavelength and each MS band through the PC transform rather than through the weight factor of AWLP. LMVM, UNB-Pansharp, and GIHS fusion show the least satisfactory spatial distortion; it can be concluded that overinjection or a lack of spatial detail occurred during fusion processing. GSA fusion and our method yield results close to the spatial resolution of the original PAN image and the spectral information of the MS image. Both GSA and our

TABLE III. COMPARATIVE QUICKBIRD FUSION RESULT

Fig. 8. False color (mid-IR, R, and G) compositions of the LANDSAT ETM+ data set ( regions). (a) MS. (b) PAN. (c) GIHS. (d) AWLP. (e) AWPC. (f) LMVM. (g) UNB-Pansharp. (h) GSA. (i) Proposed method.

TABLE IV. COMPARATIVE LANDSAT ETM+ FUSION RESULT

algorithms indicate the best QNR value quantitatively and the most similar spectral information for vegetated areas, as seen in Fig. 8(h) and (i). Both methods consider the properties of all MS bands when generating a low-resolution synthetic image. However, the QNR index of our method is slightly higher than that derived by GSA fusion.
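The QNR protocol of (14)–(18) used throughout these comparisons is straightforward to reproduce. The sketch below computes a global Q index rather than the N × N sliding-window average used in the paper; the function names, the equal default weights, and the assumption of nonzero variances and means are ours.

```python
import numpy as np

def q_index(x, y):
    # Universal image quality index Q (Eq. 14), computed globally;
    # assumes both inputs have nonzero variance and nonzero mean.
    x = x.ravel().astype(float)
    y = y.ravel().astype(float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 4.0 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

def d_lambda(fused, ms, p=1):
    # Spectral distortion (Eq. 16): change in interband Q after fusion.
    n = len(fused)
    s = sum(abs(q_index(fused[l], fused[r]) - q_index(ms[l], ms[r])) ** p
            for l in range(n) for r in range(n) if r != l)
    return (s / (n * (n - 1))) ** (1.0 / p)

def d_s(fused, ms, pan, pan_low, q=1):
    # Spatial distortion (Eq. 17): change in band-to-PAN Q across scales.
    n = len(fused)
    s = sum(abs(q_index(fused[l], pan) - q_index(ms[l], pan_low)) ** q
            for l in range(n))
    return (s / n) ** (1.0 / q)

def qnr(fused, ms, pan, pan_low, a=1, b=1):
    # QNR (Eq. 18): equals 1 when no spectral/spatial distortion is measured.
    return ((1.0 - d_lambda(fused, ms)) ** a
            * (1.0 - d_s(fused, ms, pan, pan_low)) ** b)
```

When the "fused" bands equal the low-resolution MS bands and the PAN equals its degraded version, both distortions are zero and QNR is exactly one, matching the no-distortion case described above.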

Fig. 9. False color (NIR, R, and G) compositions of the SPOT-5 data set ( regions). (a) MS. (b) PAN. (c) GIHS. (d) AWLP. (e) AWPC. (f) LMVM. (g) UNB-Pansharp. (h) GSA. (i) Proposed method.

TABLE V. COMPARATIVE SPOT-5 FUSION RESULT

D. SPOT-5 Data Sets

As with LANDSAT ETM+, the wavelength of the PAN image does not cover all MS bands. The GIHS fusion result in Fig. 9(c) has an excessively bright background. LMVM fusion results in blurring of the urban area. The UNB-Pansharp and AWLP results are of low quality because of the overinjection of high-frequency information in urban and grass areas, as with the QuickBird and LANDSAT ETM+ results. From Fig. 9 and Table V, we see that the images obtained using AWPC, GSA, and our method have good visual quality and QNR values, although the AWPC image shows slight blurring in grass areas. This trend arises because these methods reflect the dissimilarity between the PAN and MS images when processing the synthetic images and injecting spatial details, as was the case with the LANDSAT ETM+ result. Through the adjustment of local/global instabilities, our method efficiently preserves the spectral information, as seen in Fig. 9(i), whereas algorithms including GSA fusion produce color distortions, such as darker urban and grass areas. In spite of slight distortion in some grass areas, our fusion result has the best color match to the resampled MS image and sharpness similar to that of the GSA and GIHS results. Table V gives good QNR values

for our method and GSA fusion. Although the GSA method provides a better result in terms of the spectral distortion index and the AWPC result has spatial preservation similar to that of our result, the QNR index indicates that our method achieves more efficient fusion when applied to SPOT-5 sensor imagery.

VI. CONCLUSION

In this paper, a new adaptive CS-based fusion algorithm has been proposed to minimize spectral distortion while preserving the spatial resolution of the PAN image. The color distortion that occurs during high-frequency information injection is minimized by generating high-/low-spatial-resolution synthetic component images. To minimize global/local instability, we have developed an adaptive fusion framework using a statistics-based ratio and a local instability adjustment parameter. To examine the performance of the proposed method more objectively, we conducted a set of comparisons in which the proposed method was compared with fusion methods that are currently widely accepted in many applications. In the experiments on different sensors, namely IKONOS, QuickBird, LANDSAT ETM+, and SPOT-5, the images obtained with the proposed algorithm had higher fusion quality than the images produced using other fusion methods. Therefore, our method, with its automatic process, is more efficient in achieving high-spatial-resolution satellite image fusion. In addition, the proposed method should be efficiently applicable to the various sensors considered in our experiments.

REFERENCES

[1] Y Zhang, Understanding image fusion, Photogramm Eng Remote Sens, vol 70, no 6, pp , Jun 2004 [2] F Nencini, A Garzelli, S Baronti, and L Alparone, Remote sensing image fusion using the curvelet transform, Inf Fusion, vol 8, no 2, pp , Apr 2007 [3] Y Zhang and G Hong, An IHS and
wavelet integrated approach to improve pan-sharpening visual quality of natural colour IKONOS and QuickBird images, Inf Fusion, vol 6, no 3, pp , Sep 2005 [4] W J Carper, T M Lillesand, and R W Kiefer, The use of intensity hue saturation transformations for merging SPOT panchromatic and multispectral image data, Photogramm Eng Remote Sens, vol 56, no 4, pp , Apr 1990 [5] T M Tu, P S Huang, C L Hung, and C P Chang, A fast intensity hue saturation fusion technique with spectral adjustment for IKONOS imagery, IEEE Geosci Remote Sens Lett, vol 1, no 4, pp , Oct 2004 [6] M Choi, A new intensity hue saturation fusion approach to image fusion with a tradeoff parameter, IEEE Trans Geosci Remote Sens, vol 44, no 6, pp , Jun 2006 [7] A Garzelli and F Nencini, Fusion of panchromatic and multispectral images by genetic algorithms, in Proc IEEE Int Geosci Remote Sens Symp, 2006, pp [8] M González-Audícana, X Otazu, O Fors, and J A Alvarex-Mozos, A low computational-cost method to fuse IKONOS images using the spectral response function of its sensors, IEEE Trans Geosci Remote Sens, vol 44, no 6, pp , Jun 2006 [9] W Dou, Y Chen, X Li, and D Z Sui, A general framework for component substitution image fusion: An implementation using the fast image fusion method, Comput Geosci, vol 33, no 2, pp , Feb 2007 [10] C A Laben and B V Brower, Process for enhancing the spatial resolution of multispectral imagery using pan-sharpening, US Patent , Jan 4, 2000 [11] B Aiazzi, S Baronti, and M Selva, Improving component substitution pansharpening through multivariate regression of MS+Pan data, IEEE Trans Geosci Remote Sens, vol 45, no 10, pp , Oct 2007 [12] B Aiazzi, L Alparone, S Baronti, and A Garzelli, Context-driven fusion of high spatial and spectral resolution data based on oversampled multiresolution analysis, IEEE Trans Geosci Remote Sens, vol 40, no 10, pp , Oct 2002 [13] J Nunez, X Otazu, O Fors, A Prade, V Pala, and R Arbiol, Multiresolution-based image fusion with additive wavelet 
Copyright (c) 2010 IEEE. Personal use is permitted. For any other purposes, permission must be obtained from the IEEE by emailing pubs-permissions@ieee.org.

Jaewan Choi received the B.S. and M.S. degrees in civil, urban, and geo-system engineering from Seoul National University, Seoul, Korea, in 2006, where he is currently working toward the Ph.D. degree in the Department of Civil and Environmental Engineering. His research interests include image processing of remote sensing data, image fusion, and image registration.

Kiyun Yu received the B.S. and M.S. degrees in civil engineering from Yonsei University, Seoul, Korea, and the Ph.D. degree in geographic information systems from the University of Wisconsin, Madison, in 1998. He was a Director in the Ministry of Construction and Transportation until 2000. He is currently an Associate Professor with the Department of Civil and Environmental Engineering, Seoul National University, Seoul. His research interests include geographic information systems, digital photogrammetry, location-based services, and application services and system designs based on digital maps.

Yongil Kim (M'06) received the B.S. degree in urban engineering and the M.S. and Ph.D. degrees in remote sensing from Seoul National University, Seoul, Korea, in 1986, 1988, and 1991, respectively. He joined Seoul National University in 1993, where he is currently a Professor with the Department of Civil and Environmental Engineering. During the past 20 years, he has been a Project Leader for several large projects, such as the standardization of digital road map databases and the development of feature extraction algorithms for remote sensing. His major research interests include remote sensing, global positioning systems, and geographic information systems. Dr. Kim is currently a member of the Surveying Committee of the National Geographic Institute and is also the Director and an Editor of the Journal of the Korean Society for Geo-Spatial Information System, the Journal of the Korean Society of Surveying, Geodesy, Photogrammetry, and Cartography, and the Journal of the Korean Society of Remote Sensing.


More information

Detection of Compound Structures in Very High Spatial Resolution Images

Detection of Compound Structures in Very High Spatial Resolution Images Detection of Compound Structures in Very High Spatial Resolution Images Selim Aksoy Department of Computer Engineering Bilkent University Bilkent, 06800, Ankara, Turkey saksoy@cs.bilkent.edu.tr Joint work

More information

Enhancement of Multispectral Images and Vegetation Indices

Enhancement of Multispectral Images and Vegetation Indices Enhancement of Multispectral Images and Vegetation Indices ERDAS Imagine 2016 Description: We will use ERDAS Imagine with multispectral images to learn how an image can be enhanced for better interpretation.

More information

Chapter 4 Pan-Sharpening Techniques to Enhance Archaeological Marks: An Overview

Chapter 4 Pan-Sharpening Techniques to Enhance Archaeological Marks: An Overview Chapter 4 Pan-Sharpening Techniques to Enhance Archaeological Marks: An Overview 1 2 3 Rosa Lasaponara and Nicola Masini 4 Abstract The application of pan-sharpening techniques to very high resolution

More information

Comparison between Mallat s and the à trous discrete wavelet transform based algorithms for the fusion of multispectral and panchromatic images

Comparison between Mallat s and the à trous discrete wavelet transform based algorithms for the fusion of multispectral and panchromatic images International Journal of Remote Sensing Vol. 000, No. 000, Month 2005, 1 19 Comparison between Mallat s and the à trous discrete wavelet transform based algorithms for the fusion of multispectral and panchromatic

More information

A Spatial Mean and Median Filter For Noise Removal in Digital Images

A Spatial Mean and Median Filter For Noise Removal in Digital Images A Spatial Mean and Median Filter For Noise Removal in Digital Images N.Rajesh Kumar 1, J.Uday Kumar 2 Associate Professor, Dept. of ECE, Jaya Prakash Narayan College of Engineering, Mahabubnagar, Telangana,

More information

MULTIRESOLUTION SPOT-5 DATA FOR BOREAL FOREST MONITORING

MULTIRESOLUTION SPOT-5 DATA FOR BOREAL FOREST MONITORING MULTIRESOLUTION SPOT-5 DATA FOR BOREAL FOREST MONITORING M. G. Rosengren, E. Willén Metria Miljöanalys, P.O. Box 24154, SE-104 51 Stockholm, Sweden - (mats.rosengren, erik.willen)@lm.se KEY WORDS: Remote

More information

MULTISPECTRAL IMAGE PROCESSING I

MULTISPECTRAL IMAGE PROCESSING I TM1 TM2 337 TM3 TM4 TM5 TM6 Dr. Robert A. Schowengerdt TM7 Landsat Thematic Mapper (TM) multispectral images of desert and agriculture near Yuma, Arizona MULTISPECTRAL IMAGE PROCESSING I SENSORS Multispectral

More information

Texture characterization in DIRSIG

Texture characterization in DIRSIG Rochester Institute of Technology RIT Scholar Works Theses Thesis/Dissertation Collections 2001 Texture characterization in DIRSIG Christy Burtner Follow this and additional works at: http://scholarworks.rit.edu/theses

More information

A self-adaptive Contrast Enhancement Method Based on Gradient and Intensity Histogram for Remote Sensing Images

A self-adaptive Contrast Enhancement Method Based on Gradient and Intensity Histogram for Remote Sensing Images 2nd International Conference on Computer Engineering, Information Science & Application Technology (ICCIA 2017) A self-adaptive Contrast Enhancement Method Based on Gradient and Intensity Histogram for

More information

GE 113 REMOTE SENSING. Topic 7. Image Enhancement

GE 113 REMOTE SENSING. Topic 7. Image Enhancement GE 113 REMOTE SENSING Topic 7. Image Enhancement Lecturer: Engr. Jojene R. Santillan jrsantillan@carsu.edu.ph Division of Geodetic Engineering College of Engineering and Information Technology Caraga State

More information

Urban Feature Classification Technique from RGB Data using Sequential Methods

Urban Feature Classification Technique from RGB Data using Sequential Methods Urban Feature Classification Technique from RGB Data using Sequential Methods Hassan Elhifnawy Civil Engineering Department Military Technical College Cairo, Egypt Abstract- This research produces a fully

More information

EVALUATION OF SATELLITE IMAGE FUSION USING WAVELET TRANSFORM

EVALUATION OF SATELLITE IMAGE FUSION USING WAVELET TRANSFORM EVALUATION OF SATELLITE IMAGE FUSION USING WAVELET TRANSFORM Oguz Gungor Jie Shan Geomatics Engineering, School of Civil Engineering, Purdue University 550 Stadium Mall Drive, West Lafayette, IN 47907-205,

More information

AN ASSESSMENT OF SHADOW ENHANCED URBAN REMOTE SENSING IMAGERY OF A COMPLEX CITY - HONG KONG

AN ASSESSMENT OF SHADOW ENHANCED URBAN REMOTE SENSING IMAGERY OF A COMPLEX CITY - HONG KONG AN ASSESSMENT OF SHADOW ENHANCED URBAN REMOTE SENSING IMAGERY OF A COMPLEX CITY - HONG KONG Cheuk-Yan Wan*, Bruce A. King, Zhilin Li The Department of Land Surveying and Geo-Informatics, The Hong Kong

More information

A Study of Slanted-Edge MTF Stability and Repeatability

A Study of Slanted-Edge MTF Stability and Repeatability A Study of Slanted-Edge MTF Stability and Repeatability Jackson K.M. Roland Imatest LLC, 2995 Wilderness Place Suite 103, Boulder, CO, USA ABSTRACT The slanted-edge method of measuring the spatial frequency

More information

TRUESENSE SPARSE COLOR FILTER PATTERN OVERVIEW SEPTEMBER 30, 2013 APPLICATION NOTE REVISION 1.0

TRUESENSE SPARSE COLOR FILTER PATTERN OVERVIEW SEPTEMBER 30, 2013 APPLICATION NOTE REVISION 1.0 TRUESENSE SPARSE COLOR FILTER PATTERN OVERVIEW SEPTEMBER 30, 2013 APPLICATION NOTE REVISION 1.0 TABLE OF CONTENTS Overview... 3 Color Filter Patterns... 3 Bayer CFA... 3 Sparse CFA... 3 Image Processing...

More information

Remote Sensing Image Fusion Based on Enhancement of Edge Feature Information

Remote Sensing Image Fusion Based on Enhancement of Edge Feature Information Sensors & Transducers, Vol. 167, Issue 3, arch 014, pp. 175-181 Sensors & Transducers 014 by IFSA Publishing, S.. http://www.sensorsportal.com Remote Sensing Image Fusion Based on Enhancement of Edge Feature

More information

Remote Sensing Instruction Laboratory

Remote Sensing Instruction Laboratory Laboratory Session 217513 Geographic Information System and Remote Sensing - 1 - Remote Sensing Instruction Laboratory Assist.Prof.Dr. Weerakaset Suanpaga Department of Civil Engineering, Faculty of Engineering

More information

High Resolution Satellite Data for Mapping Landuse/Land-cover in the Rural-Urban Fringe of the Greater Toronto Area

High Resolution Satellite Data for Mapping Landuse/Land-cover in the Rural-Urban Fringe of the Greater Toronto Area High Resolution Satellite Data for Mapping Landuse/Land-cover in the Rural-Urban Fringe of the Greater Toronto Area Maria Irene Rangel Luna Master s of Science Thesis in Geoinformatics TRITA-GIT EX 06-010

More information

Image Filtering. Median Filtering

Image Filtering. Median Filtering Image Filtering Image filtering is used to: Remove noise Sharpen contrast Highlight contours Detect edges Other uses? Image filters can be classified as linear or nonlinear. Linear filters are also know

More information

IMPROVEMENT IN THE DETECTION OF LAND COVER CLASSES USING THE WORLDVIEW-2 IMAGERY

IMPROVEMENT IN THE DETECTION OF LAND COVER CLASSES USING THE WORLDVIEW-2 IMAGERY IMPROVEMENT IN THE DETECTION OF LAND COVER CLASSES USING THE WORLDVIEW-2 IMAGERY Ahmed Elsharkawy 1,2, Mohamed Elhabiby 1,3 & Naser El-Sheimy 1,4 1 Dept. of Geomatics Engineering, University of Calgary

More information

Fast, simple, and good pan-sharpening method

Fast, simple, and good pan-sharpening method Fast, simple, and good pan-sharpening method Gintautas Palubinskas Fast, simple, and good pan-sharpening method Gintautas Palubinskas German Aerospace Center DLR, Remote Sensing Technology Institute, Oberpfaffenhofen,

More information

Pixel-based Image Fusion Using Wavelet Transform for SPOT and ETM+ Image

Pixel-based Image Fusion Using Wavelet Transform for SPOT and ETM+ Image Pixel-based Image Fusion Using Wavelet Transform for SPOT and ETM+ Image Hongbo Wu Center for Forest Operations and Environment Northeast Forestry University Harbin, P.R.China E-mail: wuhongboi2366@sina.com

More information

GE 113 REMOTE SENSING

GE 113 REMOTE SENSING GE 113 REMOTE SENSING Topic 8. Image Classification and Accuracy Assessment Lecturer: Engr. Jojene R. Santillan jrsantillan@carsu.edu.ph Division of Geodetic Engineering College of Engineering and Information

More information

ADAPTIVE INTENSITY MATCHING FILTERS : A NEW TOOL FOR MULTI-RESOLUTION DATA FUSION.

ADAPTIVE INTENSITY MATCHING FILTERS : A NEW TOOL FOR MULTI-RESOLUTION DATA FUSION. ADAPTIVE INTENSITY MATCHING FILTERS : A NEW TOOL FOR MULTI-RESOLUTION DATA FUSION. S. de Béthune F. Muller M. Binard Laboratory SURFACES University of Liège 7, place du 0 août B 4000 Liège, BE. SUMMARY

More information

Chapter 4 SPEECH ENHANCEMENT

Chapter 4 SPEECH ENHANCEMENT 44 Chapter 4 SPEECH ENHANCEMENT 4.1 INTRODUCTION: Enhancement is defined as improvement in the value or Quality of something. Speech enhancement is defined as the improvement in intelligibility and/or

More information