Comparison of various image fusion methods for impervious surface classification from VNREDSat-1


International Journal of Advanced Culture Technology Vol.4 No.2 1-6 (2016) http://dx.doi.org/10.17703/ijact.2016.4.2.1 IJACT-16-2-1

Hung V. Luu*, Manh V. Pham, Chuc D. Man, Hung Q. Bui, Thanh T.N. Nguyen
Center of Multidisciplinary Integrated Technologies for Field Monitoring
E-mail: hunglv@fimo.edu.vn

Abstract

Impervious surfaces are important indicators for urban development monitoring. Accurate mapping of urban impervious surfaces with Earth observation satellites such as VNREDSat-1 remains challenging, because the spectral diversity of urban materials is not captured by a single PAN image. In this article, five multi-resolution image fusion techniques were compared for the task of classifying urban impervious surfaces. The results show that, for the VNREDSat-1 dataset, the UNB and Wavelet transform methods are the best techniques at preserving the spatial and spectral information of the original MS image, respectively. However, the UNB technique gives the best results for impervious surface classification, especially when shadow areas are included in the non-impervious surface group.

Key words: image fusion, classification, impervious surface.

Manuscript Received: Apr. 5, 2016 / Revised: Apr. 17, 2016 / Accepted: May 5, 2016
Corresponding Author: hunglv@fimo.edu.vn, Tel: +84-437-549-727, Fax: +84-437-549-727
Center of Multidisciplinary Integrated Technologies for Field Monitoring

1. INTRODUCTION

Impervious surfaces are mainly artificial structures covered by impenetrable materials, and they have been recognized as an important indicator in urban development monitoring [1][2]. The increasing availability of Very High Resolution (VHR) imagery provides a great opportunity for detailed impervious surface mapping in urban areas. Due to cost and complexity issues, recently launched VHR satellites often provide a PANchromatic (PAN) image with finer spatial resolution than the MultiSpectral (MS) images. However, MS images have higher spectral resolution than PAN images and are therefore more suitable for pixel-based classification. Since combining the output from different sensors makes the best use of data obtained from existing satellites [3], a good fusion of the MS and PAN images can exploit the advantages of both: a fused image preserves the spectral resolution of the MS image and the spatial resolution of the PAN image, which helps avoid the mixed-pixel problem encountered when classifying coarse-resolution MS images.

The goal of this article is to investigate a fusion method that increases spatial resolution without degrading spectral discrimination for mapping impervious surfaces in urban areas. Five widely used multi-resolution merging methods are compared. In our experiments, we used VNREDSat-1 images over the Saigon Port area in Ho Chi Minh City, Vietnam. VNREDSat-1 was launched in May 2013 as the first satellite of Vietnam, aiming to capture high resolution imagery for natural resources, environment and disaster monitoring and management. In the next section, image fusion is discussed. Results of impervious surface classification using the different fused images are given in Section 3. Finally, conclusions and future work are presented.

2. MULTI-RESOLUTION IMAGE FUSION

The study area is the Saigon Port area in Ho Chi Minh City, Vietnam. It is one of the most urbanized areas in the city, characterized by various types of impervious artificial construction, including small roads, large roads, bridges and rooftops. Non-impervious surfaces, including trees, parks and water, are also present, adding to the diversity of the area. The dataset is a VNREDSat-1 image recorded on Jan 30th, 2014, with detailed information presented in Table 1. VNREDSat-1 provides a PAN band at 2.5 m spatial resolution and an MS image of four bands, Red, Green, Blue, and Near InfraRed (NIR), at 10 m spatial resolution.

Table 1. The spectrum of the VNREDSat-1 bands

| Band          | Name  | Spatial resolution (m) | Central wavelength (nm) | Wavelength (µm) |
|---------------|-------|------------------------|-------------------------|-----------------|
| Multispectral | Blue  | 10                     | 490                     | 0.45-0.52       |
|               | Green | 10                     | 550                     | 0.53-0.59       |
|               | Red   | 10                     | 660                     | 0.625-0.695     |
|               | NIR   | 10                     | 830                     | 0.76-0.89       |
| Panchromatic  | -     | 2.5                    | 600                     | 0.45-0.75       |

Image fusion can be performed at three levels: pixel, feature and decision. In this study, we consider fusion at the pixel level with five widely used algorithms: IHS [4], PCA [5, 6], Gram-Schmidt (GS) [7], Wavelet transform [8, 9] and the University of New Brunswick (UNB) method [10].

For each fused image, true and false color composites were produced and visually inspected. The visual analysis includes the following checks: existence of color distortion, locally or globally, in the image; existence of color tonality differences; detection of linear distortion in roads, buildings, bridges and soil; and the general appearance of the image (brightness, contrast, etc.). Visual comparison (qualitative assessment) of the five fused images shows that UNB best preserves the representation of object details, followed by the IHS and GS fused images, while the images produced by PCA and Wavelet transform show blurred details (Figure 1).

For quantitative assessment, a series of quality metrics was used to evaluate the spatial and spectral fidelity between the fused images and the original MS data. We consider Bias, Difference In Variance (DIV), Correlation Coefficient (CC), ERGAS, the Universal Image Quality Index (UIQI), Relative Average Spectral Error (RASE), Root Mean Square Error (RMSE) and Entropy, following the definitions in [11-15]. These metrics were calculated for each of the four bands and then averaged (Table 2). The Wavelet transform is the best technique at keeping the spectral characteristics of the original MS image (i.e., DIV, ERGAS and RMSE are smallest while CC and UIQI are closest to 1), followed by IHS, PCA, UNB and GS.
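As a concrete illustration of the pixel-level fusion considered here, the sketch below implements the classical fast IHS substitution scheme, one of the five methods compared. It is only a minimal sketch, not the implementation used in this study: it assumes the three visible MS bands have already been co-registered, resampled to the 2.5 m PAN grid and scaled to [0, 1], and the array names are placeholders.

```python
import numpy as np

def ihs_fusion(red, green, blue, pan):
    """Minimal fast-IHS pan-sharpening sketch.

    Inputs are 2-D float arrays on the same (PAN) grid, scaled to [0, 1].
    Returns the sharpened red, green and blue bands.
    """
    # Intensity component of the upsampled MS composite.
    intensity = (red + green + blue) / 3.0

    # Match the PAN band to the MS intensity (mean and standard deviation)
    # so that substituting it does not shift the overall brightness.
    pan_matched = (pan - pan.mean()) * (intensity.std() / pan.std()) + intensity.mean()

    # Fast IHS: adding (matched PAN - intensity) to each band is equivalent
    # to replacing the intensity component in IHS space.
    delta = pan_matched - intensity
    return red + delta, green + delta, blue + delta

# Tiny synthetic example with placeholder data (not VNREDSat-1 values).
rng = np.random.default_rng(0)
blue, green, red = rng.random((3, 64, 64))   # MS bands resampled to 2.5 m
pan = rng.random((64, 64))                   # panchromatic band
r_sharp, g_sharp, b_sharp = ihs_fusion(red, green, blue, pan)
print(r_sharp.shape)
```

The GS and PCA methods follow the same substitution idea but replace the first Gram-Schmidt component or the first principal component rather than the simple mean intensity, while the wavelet approach injects spatial detail extracted from the PAN band into each MS band.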

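The spectral-fidelity metrics reported in Table 2 can be computed per band roughly as follows. This is a sketch using the common definitions of bias, correlation coefficient, RMSE and ERGAS (with the 2.5 m / 10 m resolution ratio); the variable names are illustrative, not the exact code behind the table.

```python
import numpy as np

def band_metrics(ref, fused):
    """Bias, correlation coefficient and RMSE between one original MS band
    (`ref`) and the corresponding fused band, both 2-D arrays on one grid."""
    bias = ref.mean() - fused.mean()
    cc = np.corrcoef(ref.ravel(), fused.ravel())[0, 1]
    rmse = np.sqrt(np.mean((ref - fused) ** 2))
    return bias, cc, rmse

def ergas(ref_bands, fused_bands, ratio=2.5 / 10.0):
    """ERGAS over all bands; `ratio` is the PAN/MS pixel-size ratio."""
    terms = []
    for ref, fused in zip(ref_bands, fused_bands):
        rmse = np.sqrt(np.mean((ref - fused) ** 2))
        terms.append((rmse / ref.mean()) ** 2)
    return 100.0 * ratio * np.sqrt(np.mean(terms))
```

Each single-band metric is then averaged over the four MS bands, as in Table 2; smaller ERGAS and RMSE values and CC and UIQI values closer to 1 indicate better spectral preservation.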
Table 2. Quantitative assessment of the VNREDSat-1 fused images

| Method  | Bias   | DIV    | CC    | ERGAS | RASE  | UIQI  | RMSE  |
|---------|--------|--------|-------|-------|-------|-------|-------|
| PCA     | 0.006  | -0.193 | 0.789 | 7.667 | 0.496 | 0.774 | 0.028 |
| GS      | -0.015 | -0.141 | 0.881 | 6.627 | 0.267 | 0.863 | 0.021 |
| IHS     | -0.014 | 0.058  | 0.911 | 5.056 | 0.198 | 0.900 | 0.016 |
| Wavelet | -0.008 | 0.049  | 0.958 | 3.576 | 0.233 | 0.957 | 0.013 |
| UNB     | -0.014 | 0.085  | 0.896 | 5.487 | 0.230 | 0.894 | 0.018 |

Figure 1. VNREDSat-1 2.5 m original PAN and the generated fused images: a) original PAN, b) IHS, c) PCA, d) GS, e) WL (Wavelet), f) UNB

3. IMPERVIOUS CLASSIFICATION

In this section, we investigate impervious surface classification from the different fused images, among which UNB, IHS and GS best keep the spatial consistency while Wavelet, IHS and PCA best preserve the spectral characteristics of the VNREDSat-1 image. Impervious surface (IS) and non-impervious surface (NIS) are each a combination of various land cover types. Impervious surface is made up of dark impervious surface (DIS) and bright impervious surface (BIS). Non-impervious surface consists of diverse materials including vegetation (VEG), water body (WAT), bare soil/sand (BSS) and shaded area (SHA).

In this study, a two-step approach was employed. First, the different land cover classes were grouped into impervious and non-impervious surfaces for classification; the PAN image and the five fused images were used to analyze the effectiveness of the different fusion methods for this classification task. Second, the best fused dataset was further analyzed for detailed land cover classification. Ground truth data for each land cover class in the Saigon Port area were selected manually from the VNREDSat-1 image. For each class, we selected 500 samples, of which half (i.e., 250 samples) were used for training and the rest for testing. For classification, a Support Vector Machine was used with a Gaussian Radial Basis Function kernel, and the training parameters were estimated by a grid search on each dataset.

Table 3 shows the overall accuracy of impervious/non-impervious classification using the different fused images. Most fused images improve the accuracy compared to the original PAN data (84.4%), with the exception of Wavelet (77.9%). The test accuracy improved significantly when using the UNB image (89.6%), followed by PCA (89.1%) and GS (87.1%). In fact, the different fused images differ less in their spectral representation (Table 2) than in their spatial representation (Figure 1), so the classification result appears more strongly related to the spatial than to the spectral aspects of the fused images and of the impervious and non-impervious classes themselves.

Table 3. Confusion matrix (number of pixels) and overall accuracy of impervious/non-impervious classification

| Image   | IS as IS | IS as NIS | NIS as IS | NIS as NIS | Accuracy (%) |
|---------|----------|-----------|-----------|------------|--------------|
| PAN     | 420      | 80        | 172       | 828        | 84.4         |
| PCA     | 451      | 49        | 128       | 872        | 89.1         |
| GS      | 417      | 83        | 114       | 886        | 87.1         |
| IHS     | 398      | 102       | 136       | 864        | 84.5         |
| Wavelet | 306      | 194       | 130       | 870        | 77.9         |
| UNB     | 464      | 36        | 139       | 861        | 89.6         |

A further investigation of the individual impervious and non-impervious classes was carried out on the UNB fused image and the PAN image; the results are listed in Table 4. Several important findings can be observed. First, several impervious and non-impervious classes are easily confused when using only the PAN image: for instance, 66 pixels of BIS are classified as BSS, 100 pixels of VEG are classified as DIS, 43 pixels of BSS are classified as BIS and 29 pixels of SHA are classified as DIS. Moreover, classes within the same group (impervious or non-impervious) are also confused: 34 pixels of DIS are classified as BIS, and 3 and 114 pixels of SHA are classified as VEG and WAT, respectively. However, after combining PAN with MS using the UNB method, these mistakes are dramatically reduced, since all pixels of DIS, VEG, WAT and BSS are classified correctly. The only weak performance of UNB occurs for the SHA class.
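Before turning to the detailed confusion matrices in Table 4, the classifier setup described above (an SVM with a Gaussian RBF kernel, parameters chosen by grid search, and 250 training / 250 test samples per class) could be reproduced roughly as follows with scikit-learn. The feature matrix and the parameter grid are placeholders, not the values used in the paper.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, confusion_matrix

# Placeholder samples: one row per labelled pixel, one column per fused band.
rng = np.random.default_rng(0)
X = rng.random((3000, 4))                  # 6 classes x 500 samples, 4 bands
y = np.repeat(np.arange(6), 500)           # 0..5 = BIS, DIS, VEG, WAT, BSS, SHA

# Half of the samples of each class for training, half for testing.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, stratify=y, random_state=0)

# Grid search over the RBF-SVM parameters (illustrative grid only).
param_grid = {"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0]}
clf = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print("Overall accuracy:", accuracy_score(y_test, y_pred))
print(confusion_matrix(y_test, y_pred))    # rows: reference, columns: predicted
```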

Table 4. Confusion matrix (number of pixels) for detailed classification using the UNB and PAN images (rows: reference class; columns: predicted class)

UNB fused image:

| Reference | BIS | DIS | VEG | WAT | BSS | SHA |
|-----------|-----|-----|-----|-----|-----|-----|
| BIS       | 240 | 0   | 0   | 0   | 0   | 10  |
| DIS       | 0   | 250 | 0   | 0   | 0   | 0   |
| VEG       | 0   | 0   | 250 | 0   | 0   | 0   |
| WAT       | 0   | 0   | 0   | 250 | 0   | 0   |
| BSS       | 0   | 0   | 0   | 0   | 250 | 0   |
| SHA       | 15  | 116 | 0   | 22  | 0   | 97  |

PAN image:

| Reference | BIS | DIS | VEG | WAT | BSS | SHA |
|-----------|-----|-----|-----|-----|-----|-----|
| BIS       | 184 | 0   | 0   | 0   | 66  | 0   |
| DIS       | 34  | 202 | 14  | 0   | 0   | 0   |
| VEG       | 0   | 100 | 134 | 16  | 0   | 0   |
| WAT       | 0   | 0   | 2   | 248 | 0   | 0   |
| BSS       | 43  | 0   | 0   | 0   | 207 | 0   |
| SHA       | 0   | 29  | 3   | 114 | 0   | 104 |

4. CONCLUSION

In this study, five different image fusion methods, PCA, GS, IHS, Wavelet and UNB, were first investigated to produce higher resolution MS images, which were then used for impervious surface classification. The fusion results show that, for the VNREDSat-1 dataset, the UNB and Wavelet transform methods are the best techniques at preserving the spatial and spectral information of the original MS image, respectively. The application of the fused images to impervious surface classification shows that the classification accuracy is more strongly related to the spatial than to the spectral aspects of the fused images. Therefore, UNB is the best candidate for impervious surface classification, especially when shadow areas are included in the non-impervious surface group.

ACKNOWLEDGEMENT

This work was supported by the Vietnamese Science and Space Program in 2014-2015.

REFERENCES

[1] Lu, D. and Weng, Q., "Extraction of urban impervious surfaces from an IKONOS image", International Journal of Remote Sensing, 30(5), pp. 1297-1311, 2009.
[2] Yuan, F. and Bauer, M. E., "Mapping impervious surface area using high resolution imagery: a comparison of object-oriented classification to per-pixel classification", in Proceedings of the American Society for Photogrammetry and Remote Sensing Annual Conference, May 1-5, Reno, NV, CD-ROM, 2006.
[3] Palsson, F., Sveinsson, J. R., Benediktsson, J. A. and Aanaes, H., "Image fusion for classification of high resolution images based on mathematical morphology", in 2010 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 492-495, 25-30 July 2010.
[4] Ehlers, M., Klonus, S. and Astrand, P. J., "Quality assessments for multi-sensor multi-date image fusion", in Proceedings of the XXIst ISPRS Congress, Beijing, pp. 499-506, 2008.
[5] Chavez, P. S., Sides, S. C. and Anderson, J. A., "Comparison of three different methods to merge multi-resolution and multispectral data: Landsat TM and SPOT panchromatic", Photogrammetric Engineering & Remote Sensing, 57(3), pp. 295-303, 1991.

[6] Pohl, C. and van Genderen, J. L., "Multisensor image fusion in remote sensing: concepts, methods and applications", International Journal of Remote Sensing, 19(5), 1998.
[7] Laben, C. A. and Brower, B. V., "Process for enhancing the spatial resolution of multispectral imagery using pan-sharpening", United States Patent No. 6011875, 2000.
[8] Choi, M., Kim, R. Y., Nam, M.-R. and Kim, H. O., "Fusion of multispectral and panchromatic satellite images using the curvelet transform", IEEE Geoscience and Remote Sensing Letters, 2(2), pp. 136-140, April 2005.
[9] Kalpoma, K. A. and Kudoh, J., "Image fusion processing for IKONOS 1-m color imagery", IEEE Transactions on Geoscience and Remote Sensing, 45, pp. 1-12, 2007.
[10] Zhang, Y. and Mishra, K., "From UNB PanSharp to Fuze Go - the success behind the pan-sharpening algorithm", International Journal of Image and Data Fusion, 2013.
[11] Gonzalez-Audicana, M., Otazu, X., Fors, O. and Seco, A., "Comparison between Mallat's and the 'à trous' discrete wavelet transform based algorithms for the fusion of multispectral and panchromatic images", International Journal of Remote Sensing, 26, pp. 595-614, 2005.
[12] Guo, Q. and Liu, S., "Novel image fusion method based on discrete fractional random transform", Chinese Optics Letters, 8(7), pp. 656-660, 2010.
[13] Yang, Q., Tan, K.-H., Culbertson, B. and Apostolopoulos, J., "Fusion of active and passive sensors for fast 3D capture", in 2010 IEEE International Workshop on Multimedia Signal Processing (MMSP), pp. 69-74, 4-6 Oct. 2010.
[14] Abdikan, S., Balik Sanli, F., Bektas Balcik, F. and Goksel, C., "Fusion of SAR images (PALSAR and RADARSAT-1) with multispectral SPOT image: a comparative analysis of resulting images", The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 37, pp. 1197-1202, 2008.