A SYNERGETIC USE OF REMOTE-SENSED DATA TO ASSESS THE EVOLUTION OF BURNT AREA BY WILDFIRES IN PORTUGAL


Teresa J. Calado and Carlos C. DaCamara
CGUL, Faculty of Sciences, University of Lisbon, Campo Grande, 1749-016 Lisbon, Portugal
mtcalado@fc.ul.pt; cdcamara@fc.ul.pt

Abstract

Pre-operational since January 2005, the Satellite Application Facility for Land Surface Analysis (LSA SAF) has been taking full advantage of data available from SEVIRI on board Meteosat-8 to monitor land surface properties and to derive a large set of land surface variables. A wide variety of human activities beyond those strictly related to meteorology and climatology will benefit from this kind of information, namely agriculture and forest management. In the forthcoming Continuous Development Operational Phase (CDOP), information from new sensors will be incorporated into the LSA SAF system, namely from the AVHRR and MODIS sensors, on board the NOAA and Metop series and the TERRA and AQUA satellites, respectively. In this new framework it is especially important to assess the potential of a synergetic use of satellite information from sensors with different spatial, temporal, spectral and orbital characteristics. We present a strategy to combine remote-sensed information from different satellite platforms with the aim of continuously monitoring areas burnt by wildfires in Continental Portugal. First we use maps of burnt scars based on visual interpretation and on-screen digitising of TERRA/MODIS red-green-blue (RGB) colour composites. This information is then downgraded to the NOAA/AVHRR scale and further downgraded to the Meteosat-8/SEVIRI scale. Finally we rely on neuro-fuzzy techniques to develop models that are able to discriminate burnt scars at the different spatial scales. The method was applied to the 2005 fire season and the results obtained suggest that a synergetic use of information from TERRA/MODIS, NOAA/AVHRR and Meteosat-8/SEVIRI allows for a proper spatial and temporal characterisation of burnt scars.

INTRODUCTION

Like the other Satellite Application Facilities (SAFs), the SAF for Land Surface Analysis (LSA SAF) is a specialised development and processing centre within the EUMETSAT Applications Ground Segment. Its overall objective is the provision of operational services, ensuring a cost-effective and synergetic balance between the central and distributed services (EUMETSAT, 2006). The LSA SAF system provides detailed information about the land surface, the interaction between land and atmosphere, and the biosphere, and relies on its capability to deliver services, namely by providing a framework of understanding that helps stakeholders to comprehend, select and apply the data supplied. Many human activities beyond meteorology and climatology benefit from this type of information, and the steadily growing number of LSA SAF research areas reflects user demands that have been evolving with the availability of improved and new data sources and the rise of potential applications for new products. These applications include agriculture and forestry, land use, and the broader topics of climate and environment monitoring.
The forthcoming Continuous Development Operational Phase (CDOP) of the LSA SAF will be devoted to consolidating operational activities (i.e., the generation, dissemination and archiving of land surface related products) as well as to maintaining and steering R&D activities, namely the validation and improvement of LSA SAF products and the development of algorithms to derive new products. Improvement of existing products may be related either to the need to follow new user requirements or to the advantages of adapting the algorithms to new sensors, e.g. MODIS. New products will be derived in the domains of Fire Detection and Monitoring (FD&M) and the Mapping of Fire Risk (MFR); in this respect, the LSA SAF will explore the capability of Meteosat/SEVIRI to detect and monitor active fires (especially in Africa) as well as to identify signals of vegetation water stress in SEVIRI channels over Europe.

The LSA SAF has also been making a significant contribution to the Global Monitoring for Environment and Security (GMES) initiative and is currently involved in GEOLAND, a GMES-related integrated project under the Sixth Framework Programme. GEOLAND aims to develop and integrate a range of geo-information services and products focusing on land cover and vegetation monitoring, with the purpose of helping authorities fulfil their environmental monitoring and reporting obligations and manage their natural resources more effectively. In this context, LSA SAF products may contribute to the tasks of forest monitoring, namely as regards the mapping of vegetation and burnt scars.

The aim of the present work is to present a strategy that combines remote-sensed information from different satellite platforms in order to perform a continuous monitoring of areas burnt by wildfires in Continental Portugal. For this purpose we have relied on information from TERRA/MODIS, NOAA/AVHRR and Meteosat-8/SEVIRI, and have adapted a previously developed Adaptive Neuro-Fuzzy Inference System (ANFIS) model (Calado and DaCamara, 2002) to explore the synergy between geostationary and polar-orbiting satellites. The technique was applied to the near-infrared (NIR) channels of AVHRR and SEVIRI and the performance of the method was tested on the 2005 fire season.

DATA

We make use of three different sets of remote-sensed data covering the month of August 2005:

1. TERRA/MODIS data (Figure 1), on which we have relied since the LSA SAF is planning to use MODIS in an operational mode. These consist of daily maps of burnt scars based on visual interpretation and on-screen digitising of RGB colour composites of channel 7 (2.105-2.155 μm), channel 2 (0.841-0.876 μm) and channel 1 (0.620-0.670 μm). The spatial resolution of channels 1 and 2 is 250 m, whereas that of channel 7 is 500 m.

Figure 1. Pseudo-colour MODIS burnt map for August 25, 2005. The colour bar identifies the label numbers of the fire scars.

2. NOAA/AVHRR data (Figure 2a), which consist of 585 × 345 pixel images of the morning and/or early afternoon orbits. The images were previously calibrated, geometrically corrected and geo-referenced, and pixels were re-sampled to a size of 1.1 × 1.1 km². The study area was restricted to Continental Portugal and the channel used to detect burnt areas was the NIR (channel 2, 0.72-1.10 μm).

3. Meteosat-8 data (Figure 2b), which consist of 146 × 86 pixel images covering August 2005, on an hourly basis from 09:00 until 15:00 UTC. The channel used to detect burnt areas was the NIR (channel 2, centred at 0.8 μm). Pixels were kept in the original satellite projection. The study area was again restricted to Continental Portugal.

Figure 2. RGB images for August 22, 2005 of NOAA (channels 10.3-11.3 μm, 0.72-1.10 μm, 0.58-0.68 μm) and Meteosat-8 (channels centred at 10.8 μm, 0.8 μm, 0.6 μm).

METHODOLOGY

The method consists of the following two steps (a sketch of the aggregation common to both steps is given after Figure 3):

1. MODIS burnt scars were degraded to the NOAA scale (Figure 3a) by computing the percentage of MODIS pixels inside a given NOAA pixel. The obtained burnt scars were then used as learning data in the neuro-fuzzy ANFIS model.

2. The output from the NOAA model was used as learning data to discriminate burnt areas with Meteosat. NOAA burnt scars were degraded to the Meteosat scale (Figure 3b) by computing the percentage of NOAA pixels inside each Meteosat pixel.

Figure 3. Pseudo-colour images of the MODIS burnt map degraded to the NOAA scale, for August 25, 2005, and of the burnt map obtained from the ANFIS model degraded to the Meteosat scale, for August 22, 2005. Colours correspond to the percentage of burnt pixels. Black pixels correspond to cloud systems and white pixels correspond to water bodies.
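The aggregation in both steps amounts to computing, for every coarse-resolution pixel, the fraction of finer-resolution pixels flagged as burnt that fall inside it. The following is a minimal sketch of this idea, assuming for simplicity an integer ratio between pixel sizes; in the actual processing the mapping is done through the geolocation of MODIS, AVHRR and SEVIRI pixels rather than through a fixed block size.

```python
import numpy as np

def fraction_burnt(fine_mask, block):
    """Aggregate a fine-resolution burnt mask (1 = burnt, 0 = not burnt) to a
    coarser grid by computing the percentage of burnt pixels inside each block."""
    ny, nx = fine_mask.shape
    ny_c, nx_c = ny // block, nx // block
    trimmed = fine_mask[:ny_c * block, :nx_c * block]   # drop incomplete border blocks
    blocks = trimmed.reshape(ny_c, block, nx_c, block)
    return 100.0 * blocks.mean(axis=(1, 3))              # percent burnt per coarse pixel

# Step 1 (illustrative): a MODIS-like burnt mask degraded to a coarser NOAA-like grid
modis_burnt = (np.random.rand(400, 400) > 0.95).astype(float)
noaa_percent = fraction_burnt(modis_burnt, block=4)
# Step 2 would degrade the (binary) NOAA ANFIS output to the Meteosat scale in the same way
```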

Cloud and water masks

Before applying the ANFIS model, a pre-processing of both NOAA and Meteosat images was required in order to remove pixels contaminated by clouds and water.

The cloud mask was obtained by means of a threshold technique. The rationale behind the adopted approach is that images useful for burnt area discrimination are characterised by a low amount of cloudy pixels. In this context, rather than delineating cloud-free pixels with a high degree of confidence, the aim is to reject all pixels whose radiative signatures differ from those considered characteristic of clear-sky pixels. Accordingly, in the case of NOAA images, any pixel fulfilling one of the following conditions was rejected (Figure 4a):

Reflective test: Ch(1) + Ch(2) > 25000
Thermal test: Ch(5) < 290 K

In the case of Meteosat-8 images, a similar test was applied (Figure 4b):

Reflective test: Ch(1) + Ch(2) > 850
Thermal test: Ch(5) < 298 K

The NOAA water mask (white pixels in Figure 4a) was obtained by applying a subtractive clustering method (Chiu, 1994) to a clear-sky image, which allows computing the membership values of each pixel to the two identified clusters (land and water). The possibility of a given pixel belonging to land or water was then evaluated by maximising the respective membership values to the clusters found. Defuzzification of the results was finally performed by applying a threshold of 0.5 (a value tuned by visual inspection). A few misclassified pixels were corrected by on-screen digitising. In the case of Meteosat, the water mask (white pixels in Figure 4b) was obtained by degrading the NOAA mask; the value 1 (0) was assigned to each Meteosat pixel containing any (no) NOAA water pixels. A sketch of these masking rules is given after Figure 4.

Figure 4. NOAA and Meteosat RGB images for August 24, after applying the water and cloud masks. Pixels in black correspond to cloud systems and pixels in white correspond to water bodies.
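A minimal sketch of the masking rules above, assuming the channel data are already available as arrays (array names and the fixed pixel-size ratio are illustrative; the thresholds are the ones quoted in the text).

```python
import numpy as np

def noaa_cloud_mask(ch1, ch2, ch5_bt):
    """Reject NOAA/AVHRR pixels fulfilling either test: reflective test on
    Ch(1) + Ch(2) (counts) or thermal test on Ch(5) (brightness temperature, K)."""
    return ((ch1 + ch2) > 25000) | (ch5_bt < 290.0)

def meteosat_cloud_mask(ch1, ch2, ch5_bt):
    """Same rationale for Meteosat-8/SEVIRI, with the thresholds quoted in the text."""
    return ((ch1 + ch2) > 850) | (ch5_bt < 298.0)

def meteosat_water_mask(noaa_water_mask, block):
    """Degrade the NOAA water mask: a Meteosat pixel is set to 1 if it contains
    any NOAA water pixel, 0 otherwise (illustrative integer pixel-size ratio)."""
    ny, nx = noaa_water_mask.shape
    ny_c, nx_c = ny // block, nx // block
    trimmed = noaa_water_mask[:ny_c * block, :nx_c * block].astype(bool)
    return trimmed.reshape(ny_c, block, nx_c, block).any(axis=(1, 3)).astype(int)
```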

Identification of burnt areas

Identification of burnt areas was performed by applying a previously developed neuro-fuzzy ANFIS model (Jang, 1993) to the NIR channels of NOAA/AVHRR and Meteosat-8/SEVIRI. The method essentially consists in building up a Fuzzy Inference System (FIS), i.e., given an input membership function, the output is a fuzzy inference curve that translates the concept that a burnt area is characterised by low values of reflectivity (dark areas).

The ANFIS model was applied to the NIR channel of NOAA/AVHRR, using as learning data the burnt pixels from the degraded TERRA/MODIS maps, i.e., all pixels with values greater than zero. In the case of Meteosat/SEVIRI, we used the NIR channel as input and, as learning data, the output of the NOAA model downscaled to the Meteosat scale. Figure 5 shows the FIS curves obtained from the ANFIS models for NOAA and Meteosat, respectively. The results confirm what is to be expected over burnt areas, i.e., low values of the NIR channel; a hypothetical curve with this behaviour is sketched after Figure 6.

Figure 5. FIS curves for NOAA/AVHRR and Meteosat/SEVIRI.

Figures 6 and 7 show an example, for August 22, of the membership degrees obtained with NOAA/AVHRR and Meteosat/SEVIRI, respectively. Results show that high values of membership (yellow and red areas in the right panels) are in agreement with areas covered by burnt scars (redder pixels in the left panels). It is worth noting that the maximum values of Meteosat membership are much smaller than those of NOAA, reflecting the fact that the much larger size of the Meteosat pixel diminishes the possibility of a pixel being completely burnt.

Figure 6. RGB (4,2,1) NOAA image of August 22 and corresponding membership values obtained from the ANFIS model. Pixels in black and white correspond to cloud systems and water bodies, respectively.
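The fitted FIS curves are shown only graphically in the paper; a purely hypothetical stand-in that reproduces the stated behaviour (membership to the burnt class close to 1 for dark NIR pixels, decaying smoothly for brighter ones) could look as follows. The midpoint and slope are illustrative parameters, not the ANFIS output.

```python
import numpy as np

def burnt_membership(nir, midpoint=0.12, slope=60.0):
    """Hypothetical FIS-like curve: high membership for low NIR reflectance
    (dark, likely burnt surfaces), decreasing smoothly for brighter pixels.
    In the paper the actual curve is learnt by the ANFIS model."""
    return 1.0 / (1.0 + np.exp(slope * (np.asarray(nir) - midpoint)))

print(burnt_membership([0.05, 0.10, 0.15, 0.25]))   # membership decreases with NIR reflectance
```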

Figure 7. As in Figure 6, but for the RGB (9,2,1) Meteosat image of August 22 at 11:00 UTC.

Thresholds of 0.55 and 0.4 were used to defuzzify the results for NOAA and Meteosat, respectively (Figure 8). However, in order to classify a pixel as burnt, the NOAA (Meteosat) pixel needed to be classified as burnt on a given day (hour) as well as on the next day (hour). We have also considered that a pixel, once classified as burnt, remains classified as burnt for the remaining days (hours). These rules are sketched in code after Figure 8.

Figure 8. Pseudo-colour images for August 22 of defuzzified burnt scars with thresholds of 0.55 for NOAA and 0.4 for Meteosat. Black, white and light yellow pixels identify cloud systems, water bodies and pixels classified as burnt, respectively.
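A sketch of the defuzzification and temporal consistency rules, assuming a time-ordered stack of membership maps (daily for NOAA, hourly for Meteosat); the thresholds are the values quoted above, while the array shapes and names are assumptions for illustration.

```python
import numpy as np

def classify_burnt(membership_stack, threshold):
    """membership_stack: array of shape (time, y, x) with ANFIS membership values.
    A pixel is classified as burnt only if its membership exceeds the threshold
    at a given step and at the next one; once burnt, it stays burnt afterwards."""
    above = membership_stack >= threshold
    n_t = above.shape[0]
    burnt = np.zeros_like(above)
    persistent = np.zeros(above.shape[1:], dtype=bool)
    for t in range(n_t):
        if t + 1 < n_t:
            persistent |= above[t] & above[t + 1]   # burnt now and on the next day (hour)
        burnt[t] = persistent                       # persistence of the burnt class
    return burnt

# e.g. noaa_burnt = classify_burnt(noaa_membership, threshold=0.55)
#      msg_burnt  = classify_burnt(msg_membership,  threshold=0.4)
```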

VALIDATION AND RESULTS

Figure 9 compares the burnt areas obtained from the ANFIS model using NOAA and Meteosat data with the burnt areas detected by MODIS and with the output of the NOAA model, respectively. The figure presents a rather large number of false alarms (blue pixels), but most of these errors are located at the borders of the burnt scars (red pixels), of clouds (black pixels) and of water bodies (white pixels).

The misclassified pixels near the borders of water bodies and clouds were corrected by simply stating that a pixel would only be classified as burnt if it was on the border of neither water nor clouds; otherwise it would be left unclassified. Misclassified pixels on the border of burnt areas were still classified as burnt. Figure 9 also shows that some of the omission errors (pixels in green) appear as burnt in the contiguous pixels. We have nevertheless considered such pixels as correctly classified, because these errors are likely due to errors in georeferencing.

Figure 9. Pseudo-colour images for August 22 of burnt scars identified by TERRA/MODIS (green pixels), by NOAA/AVHRR (blue pixels) and by both sensors (red pixels); and of burnt scars identified by NOAA/AVHRR (green pixels), by Meteosat/SEVIRI (blue pixels) and by both sensors (red pixels). Pixels in light yellow correspond to pixels not classified.

Another kind of error was due to some types of soil/vegetation presenting radiative signatures similar to those of burnt areas. In order to mitigate these errors we analysed the NDVI of the previous day (hour), and a given pixel was only classified as burnt if normalised values of NDVI were positive in that pixel or in its surroundings. Although this additional condition excluded some truly burnt pixels, the errors were overall smaller than when the NDVI was not taken into account.

Validation of the results was performed by means of confusion matrices. Table 1 refers to August 22; it is worth noting that the MODIS burnt scars and the NOAA output were taken as the truth when using NOAA and Meteosat data as inputs, respectively. Performance was assessed by means of Producer's Accuracy (PA) and User's Accuracy (UA). PA and UA values of 76% and 86% (76% and 63%) were obtained for NOAA (Meteosat); the computation of these figures from Table 1 is sketched below.

Table 1. Confusion matrices for August 22, using NOAA data as input (reference: MODIS burnt map) and using Meteosat data as input (reference: NOAA ANFIS output).

                        MODIS burnt map
NOAA ANFIS model        Burnt    Non-burnt
Burnt                     945          152
Non-burnt                 305        70174

                        NOAA ANFIS output
Meteosat ANFIS model    Burnt    Non-burnt
Burnt                      68           40
Non-burnt                  21         5417
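For reference, a short computation of the accuracy figures from Table 1, with PA = hits / (hits + omissions) and UA = hits / (hits + false alarms) for the burnt class; it reproduces the quoted 76%/86% for NOAA and 76%/63% for Meteosat.

```python
def producer_user_accuracy(hits, omissions, false_alarms):
    """Producer's and User's Accuracy for the burnt class.
    hits: burnt in both model and reference; omissions: burnt in the reference only;
    false_alarms: burnt in the model only."""
    pa = hits / (hits + omissions)       # fraction of reference burnt pixels detected
    ua = hits / (hits + false_alarms)    # fraction of detected pixels that are truly burnt
    return pa, ua

# NOAA ANFIS model vs MODIS burnt map (Table 1, top)
print(producer_user_accuracy(945, 305, 152))   # ~ (0.76, 0.86)
# Meteosat ANFIS model vs NOAA ANFIS output (Table 1, bottom)
print(producer_user_accuracy(68, 21, 40))      # ~ (0.76, 0.63)
```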

Figure 10 shows the spatial distribution of the results, where an overall good agreement between modelled and observed burnt areas (red pixels) may be seen, especially for the larger burnt areas.

Figure 10. Pseudo-colour images for August 22 of burnt scars identified by TERRA/MODIS (green pixels), by NOAA/AVHRR (blue pixels) and by both sensors (red pixels); and of burnt scars identified by NOAA/AVHRR (green pixels), by Meteosat/SEVIRI (blue pixels) and by both sensors (red pixels). Pixels in light yellow correspond to pixels not classified.

CONCLUDING REMARKS

The results obtained support that a synergetic use of information from NOAA/AVHRR and Meteosat/SEVIRI allows a proper identification of the evolution of burnt areas in space and time. They also suggest that, although the spatial resolution of Meteosat does not allow quantifying burnt areas, it gives good indications about the evolution of fires in time. Some problems of the applied methodology are related to certain types of soil/vegetation, but we expect to mitigate them by incorporating information on land cover, soil types and surface emissivity. Finally, it is worth mentioning that we intend to incorporate as input data the Surface Albedo and the Land Surface Temperature, which are currently pre-operational products of the LSA SAF.

REFERENCES

Calado, T.J. and DaCamara, C.C., 2002: Burnt area mapping in Portugal with a neuro-fuzzy approach. Proceedings of the EUMETSAT Meteorological Satellite Data Users' Conference, Dublin, Ireland, September 2002, 577-584.

Chiu, S., 1994: Fuzzy model identification based on cluster estimation. Journal of Intelligent and Fuzzy Systems, 2, 267-278.

EUMETSAT, 2006: EUMETSAT - What We Do - Monitoring Weather, Climate and the Environment: SAF Introduction. Available at http://www.eumetsat.int.

Jang, J.-S.R., 1993: ANFIS: Adaptive-Network-based Fuzzy Inference System. IEEE Transactions on Systems, Man and Cybernetics, 23(3), 665-685.