Generic Tools Description and User Manual


This project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 687289.

Coastal Waters Research Synergy Framework - Tools

Document Code: CORESYF-DME-TEC-SUM02-E-R
Date of delivery: SAR-1 meeting (T0+19)
Deliverable identifier: D3.05
Version of document: 1.1, last updated 05/12/2017
Dissemination level for document: PU

Table of Signatures
Prepared by: Stephen Emsley (Developer), Romain Serra (Developer), Ricardo Capote (Developer), Nuno Costa (Developer)
Reviewed by: Nuno Grosso (WP3 Leader)
Approved by: Miguel Terra-Homem (Executive Board Chair)
Signatures and approvals appear on original.

Project start date: 01/01/2016
Project duration: 36 months

Revision Records
Version 1.0 (03/11/2017): First issue of document. Authors: Nuno Grosso, Stephen Emsley, Romain Serra
Version 1.1 (05/12/2017): Second issue of document. Authors: Nuno Grosso, Stephen Emsley, Romain Serra

Table of Contents

1 Introduction
  1.1 Purpose and Scope
  1.2 Document Structure
2 TOOLS IN THE CO-RESYF FRAMEWORK
3 CO-RESYF TOOLS GENERAL DESCRIPTION AND USER INSTRUCTIONS
  3.1 Co-ReSyF Toolkit v1.0
    3.1.1 Image Inter-calibration
    3.1.2 Atmospheric correction and Flagging
    3.1.3 User instructions
    3.1.4 Radiometric Correction
    3.1.5 Point measurements to gridded maps
    3.1.6 Image crop
    3.1.7 Image mask
    3.1.8 SAR speckle filtering
    3.1.9 Image statistics
    3.1.10 Reprojection and Coordinate System Definition tool
    3.1.11 Error metrics
    3.1.12 Vector creation and edition
    3.1.13 Layer stack creation
  3.2 External Libraries and Third Party Tools
    3.2.1 Overall Description
    3.2.2 User instructions
4 References


Acronyms and Abbreviations

CMS: Corporate Management System
Co-ReSyF: Coastal Waters Research Synergy Framework
EO: Earth Observation
GUI: Graphical User Interface
OS: Operating System
SPR: Software Problem Report

1 Introduction

The Co-ReSyF project will implement a dedicated data access and processing infrastructure, with automated tools, methods and standards to support research applications using Earth Observation (EO) data for monitoring of Coastal Waters, leveraging the components deployed in SenSyF (www.sensyf.eu). The main objective is to facilitate access to Earth Observation data and pre-processing tools for the research community, towards the future provision of Coastal Waters services based on EO data.

Through Co-ReSyF's collaborative front end, even researchers with little EO experience will be able to upload their applications to the system to compose and configure processing chains for easy deployment on the cloud infrastructure. They will be able to accelerate the development of high-performing applications by taking full advantage of the scalability of resources available in the cloud framework. The system's facilities and tools, optimized for distributed processing, include EO data access catalogues, discovery and retrieval tools, as well as a number of pre-processing tools and toolboxes for manipulating EO data. Advanced users will also be able to go further and take full control of the processing chains and algorithms by having access to the cloud back-end, and to further optimize their applications for fast deployment for big data access and processing.

The Co-ReSyF capabilities will be supported and initially demonstrated by a series of early adopters who will develop new research applications in the coastal domain, guide the definition of requirements and serve as system beta testers. A competitive call will be issued within the project to further demonstrate and promote the usage of the Co-ReSyF release. These pioneering researchers will be given access not only to the platform itself, but also to extensive training material on the system and on Coastal Waters research themes, as well as to the project's events, including the Summer School and the Final Workshop.

1.1 Purpose and Scope

This document provides a generic description of the tools and the respective user manual. It details the architecture and the functionalities of the different proposed tools that support the research applications defined according to the System Requirements Document, which was based on the research application user stories. It also describes the interfaces between those tools and the other framework components.

1.2 Document Structure

The structure of the document is as follows:

Chapter 2: gives an overview of the Co-ReSyF tools and how they can currently be accessed in the framework;

Chapter 3: provides information on the tools' current development status and a user manual for their use, building on the information provided in the System Detailed Design Tools document and focusing on how the user can use each tool through the Co-ReSyF framework.

2 TOOLS IN THE CO-RESYF FRAMEWORK

Within the Co-ReSyF platform two major components can be identified that support the operation of the research activities performed within the platform. One component is the Framework, which is composed of all the features that support the environment where the applications are defined and executed; the other is the Tools, which are elements that can be used to build an application and to analyse/visualize its results.

The Framework includes the Cloud back-end, which is the infrastructure that runs the applications in the cloud and is in charge of coordinating and creating the VMs for distributed processing and for the collection of input and output data. It also includes the Data Access API, a set of tools that allows the query and retrieval of data within the Co-ReSyF catalogue as well as any open data catalogue available online. The other part of the framework is related to user interaction and directly interfaces with the user; this includes the Front-end, a GUI that provides the user with access to all the platform functionalities. Complementarily, there are user support systems such as the Expert Center, the Knowledge Base (a wiki with relevant information to help newcomers start using the platform), the User Discussion Forum and the User Support Service Desk. All these components are outside the scope of this document; more information on them can be found in the System Detailed Design Framework v1 (Co-ReSyF, 2016) and Generic Framework Description and User Manual v1 (Co-ReSyF, 2017) documents.

The Tools, on the other hand, live within the Framework and are a set of components (usually executables or libraries) that can be used by the researchers to build and manage their applications or handle the data. For instance, Image Inter-calibration, Atmospheric Corrections, Data Co-registration and Fusion are geospatial data processing tools that form the Co-ReSyF toolkit, available to all users of the platform. They can be used either independently to process the data or as part of applications/processing chains. A first description of these tools was given in the System Detailed Design - Tools v1 document (Co-ReSyF, 2016).

The Workflow Manager, or Automated Orchestration, is a different type of tool since it is not responsible for processing data. It is designed to configure and monitor the execution of the sequence of tasks that compose an application. It is considered a Tool in the scope of this document but can also be seen as part of the Framework, since it provides an interface for users to use the existing EO product processing tools to edit existing processing chains or create their own. In this deliverable, and in all that will follow, the Workflow Manager is not included since it was considered more appropriate to cover it in the Framework-related deliverables (Co-ReSyF, 2017).

Finally, there are the External Libraries and Third Party Tools, which include GDAL, Python, the SNAP toolbox and QGIS. These tools and libraries extend the capabilities of the Co-ReSyF toolkit by providing access to a wide range of additional geospatial data processing functions that the user can draw on when developing an application/processing chain. It should be noted that most tools of the Co-ReSyF toolkit are built using the functionalities of the listed external libraries and third party tools, customised to the needs of the research application users, including, for instance, image batch processing. The main difference between the Co-ReSyF toolkit and these external libraries and tools is that the toolkit is developed to take full advantage of the Co-ReSyF cloud architecture, while the external libraries are mainly desktop tools which still require customization before they can be integrated into that distributed processing environment.

How can these different types of tools be accessed in the Co-ReSyF framework? The Co-ReSyF toolkit can be accessed in two distinct ways, according to the user profile:

- the expert user can integrate the different tools in the processing chain when developing an application by:
  o installing the Co-ReSyF toolkit in the Sandbox development environment (further details can be found in the System Detailed Design Framework v1 (Co-ReSyF, 2016) and Generic Framework Description and User Manual v1 (Co-ReSyF, 2017) documents); or
  o accessing the Co-ReSyF GitHub repository directly (https://github.com/eccoresyf) and downloading the tools to the work environment;
- the intermediate user can include those tools through the Workflow Manager as an additional processing block, either in an existing chain being customized or in a new chain being put together. The developed processing chains can be accessed through either the Workflow Manager or the Geoportal (https://geoportal.coresyf.eu/).

The External Libraries and Third Party Tools are to be used mainly by expert users and can be accessed in a similar fashion to that described in the first bullet above (either through the Sandbox environment or the GitHub repository).

The next chapter provides a generic description of each tool and user instructions for each of these types of tools within the Co-ReSyF framework.

3 CO-RESYF TOOLS GENERAL DESCRIPTION AND USER INSTRUCTIONS

3.1 Co-ReSyF Toolkit v1.0

The Co-ReSyF Toolkit will include all image processing tools developed in the framework of this project during the implementation of the first version of the platform.

Those tools will be deployable in the cloud back-end through calls from either the Workflow Manager or the Developer's Sandbox. The following sections provide an overall description of each tool and user instructions on how to use it through a command line interface. The deployment and execution of each tool via the Workflow Manager interface is covered in the previous chapter, since the tools described here should be available in the Workflow Manager as workflow components to be used when composing or editing a workflow.

3.1.1 Image Inter-calibration

3.1.1.1 Overall Description

Time series of remotely sensed images acquired by different sensors under variable atmospheric conditions, solar illumination and view angles require normalization in order to make the images comparable. Effects of artefacts, surface directionality and atmosphere can be corrected in an absolute or a relative way. The purpose of the image inter-calibration tool is to perform relative normalization of a set of images, based on a well-characterized and stable reference sensor image. Relative calibration eliminates the need for both radiative transfer models and information about atmospheric optical properties.

Two approaches are considered for the tool:

- Histogram equalization and matching
- Relative radiometric normalization (RRN)

A pre-processing data product reader is also considered (if not provided by the framework). In version 1 the tool is applicable to images from the same sensor for the same target areas of interest (AOI); in version 2 it is extended to images from different sensors for the same target AOI. Histogram matching is implemented in version 1; relative radiometric normalization will be implemented in version 2.

Sentinel-2 MSI Level 1C, Sentinel-3 OLCI Level 1B and Landsat 8 Level 1T optical sensor images are compatible with the image inter-calibration module. Note that these three sensors use differing data formats and measurements:

- Landsat 8 records Digital Numbers (DN) stored as GeoTIFF files.
- Sentinel-2 MSI records TOA reflectance stored as JPEG2000, packaged in SAFE format.
- Sentinel-3 OLCI records TOA radiance stored as NetCDF, packaged in SAFE format.

In order to be compatible with data products from different sensors, the adapter pattern is used to convert the data structure of the input file to the data structure expected by the inter-calibration tool. The inter-calibration module exposes an interface that expects GeoTIFF files as input.

Histogram Equalization and Matching

The first step is to distribute the intensities of the reference image using histogram equalization, either to implement a palette or an image change. Note that image histogram equalization can be performed by third party tools, such as the MATLAB histeq function and/or the MATLAB adapthisteq function, which implements contrast-limited adaptive histogram equalization (CLAHE). Histogram matching aims to transform the input images such that the histogram of each output image matches the histogram derived from the reference image. This is achieved off-the-shelf using the MATLAB imhistmatch function.

Relative Radiometric Normalization

The algorithm is based on the assumption that the relationship between the TOA radiances acquired at two different times from regions of constant reflectance is spatially homogeneous and can be approximated by a linear function, such that normalization is performed by applying a linear regression for each spectral band. The main challenge for RRN is identifying image features whose reflectance can be assumed to be constant over time. For absolute sensor calibration, specific high-altitude target sites are used, the pseudo-invariant calibration sites (PICS), such as Lake Tuz Gölü. However, Co-ReSyF is focusing on coastal research applications whose AOIs are far from PICS sites. Invariant targets may be manually selected from images, but this approach is time-consuming, subjective and requires stalling the processing workflow for human interaction.

Both these modules will be available through the Workflow Manager, an automated orchestration tool implemented within the Co-ReSyF project.

3.1.1.2 User instructions

The module will be available through the Workflow Manager, an automated orchestration tool implemented within the Co-ReSyF project. This tool will be called via a simple command line which provides instructions about the required input files and the directory to write the outputs:

command-name <referenceimage> <inputfiles> <outputfolder>

The input images are assumed to have been cropped to cover the area of interest (AOI), and a pre-condition of the module is that the cropped image contains both land and ocean pixels. If raw images require processing, they must be cropped to correspond to the AOI as a pre-processing step.

<inputfiles> A series of GeoTIFF images that have been ortho-rectified and geolocated, have undergone radiometric and geometric corrections, have been spatially resampled to a reference grid and have associated metadata including illumination and observation geometry and classification flags (e.g. cloud masks).

<referenceimage> One of the images of the time series must be chosen as a reference to which all other scenes will be related. This image must be the least cloud-contaminated, timewise adequate for the application, and must have a good spectral dynamic range.

<outputfolder> The location for the inter-calibrated output images, stored as GeoTIFF files.

Histogram Equalization and Matching

Although the functions mentioned in the overall description above are included in the MATLAB Image Processing Toolbox, if that is unavailable Python and GDAL will be used. The implementation will be a script, whether in MATLAB or Python, that processes all target images in relation to the specified reference image.

Relative Radiometric Normalization

This module requires that both a reference image and two invariant targets (IT) are selected, one dark and the other bright (but not saturated). In version 1 of the module these operations require manual intervention. From a series of data from a reference sensor and target sensor(s), calibration coefficients are derived to align the radiometry of the target sensors with the reference sensor. By re-scaling all sensors to the same radiometric scale, a consistent set of acquisitions over a given site, covering a broad range of geometries, is generated.

select a reference image
select dark and light invariant targets
calculate linear relationship between targets
foreach image to be calibrated to the reference image
    if image viewing geometry matches reference image geometry
        if targets are free of cloud
            foreach reference sensor band
                use polynomial to recalibrate the target image
            endfor
        endif
    endif
endfor
save to filesystem

Ideally, images should only be separated by a few days and bands by a few nanometers. If these conditions are not met, the alternative is to perform a relative calibration for each sensor and then apply a least squares regression to fit a polynomial to the temporal differences between the calibration and reference sensor doublets for each intercomparable band, from which the entire calibration sensor time series can be recalibrated over a validation site to the same radiometric scale as the reference sensor.
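As an illustration of the per-band linear regression on which RRN relies, the following minimal Python/NumPy sketch derives a gain and offset from the dark and bright invariant targets and applies them to a target band; the function names, inputs and sample radiance values are illustrative only and not part of the tool's actual interface.

    # Minimal sketch of the per-band linear normalization step described above.
    # Assumption: the dark/bright invariant-target radiances have already been
    # extracted from the reference and target images as simple arrays.
    import numpy as np

    def rrn_gain_offset(ref_targets, tgt_targets):
        # Fit radiance_ref = gain * radiance_tgt + offset from the two
        # invariant targets (dark, bright) of a single spectral band.
        gain, offset = np.polyfit(tgt_targets, ref_targets, 1)
        return gain, offset

    def normalize_band(band, gain, offset, nodata=None):
        # Apply the linear normalization to a whole band (2-D array).
        out = gain * band.astype(np.float64) + offset
        if nodata is not None:
            out[band == nodata] = nodata
        return out

    # Example with hypothetical dark/bright target radiances for one band:
    gain, offset = rrn_gain_offset(ref_targets=np.array([12.1, 180.4]),
                                   tgt_targets=np.array([10.5, 172.0]))
    # normalized = normalize_band(target_band_array, gain, offset)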

3.1.2 Atmospheric correction and Flagging

3.1.2.1 Overall Description

ACRI-HE has developed the Land Atmospheric Corrections (LAC) algorithm, which combines land-side and sea-side atmospheric correction techniques. The only assumption made by the algorithm is that both water and dark targets are located within the image. No input or in-situ measurement of the atmosphere is required: the algorithm is fully image-dependent. The aim was to design an algorithm easily adaptable to every coastal area and without the need for in-situ measurements of the atmosphere. The state of the atmosphere is determined solely using the image and its spectral bands.

Rayleigh scattering, absorptions and aerosol optical depth are some of the parameters to correct during the atmospheric correction processing, each contributing a share of the signal measured over a water target. Rayleigh scattering can be interpolated using a tabulated text file, while the aerosol determination requires a deep image analysis to deduce the aerosol optical thickness and the Angstrom coefficient describing the impact on each wavelength. The way it is implemented is described below.

3.1.3 User instructions

The module will be available through the Workflow Manager, an automated orchestration tool implemented within the Co-ReSyF project. It is part of a processing chain starting from the input raw image to the delivery of a final product presenting sea bottom characteristics or bathymetry. This tool is called via a simple command line which provides instructions about the required input files and the directory to write the outputs.
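Purely as an illustration of the table look-up idea mentioned above, a Rayleigh reflectance value could be interpolated per band and solar zenith angle as in the sketch below; the file layout assumed here (first column = solar zenith angle, one column per band) is hypothetical and not the actual LAC ancillary file format.

    # Illustrative sketch only: interpolate Rayleigh reflectance for a given
    # band and solar zenith angle from a tabulated text file. The assumed
    # column layout is hypothetical.
    import numpy as np

    def rayleigh_reflectance(lut_path, band_index, solar_zenith_deg):
        table = np.loadtxt(lut_path)        # rows: zenith angles; cols: [sza, band1, band2, ...]
        sza = table[:, 0]
        values = table[:, band_index + 1]
        return np.interp(solar_zenith_deg, sza, values)

    # Example call (hypothetical file and band):
    # rho_r = rayleigh_reflectance("rayleigh_s2.txt", band_index=2, solar_zenith_deg=37.5)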

Ancillary data

Some ancillary data are required for this module. They mostly describe the absorption of the atmosphere depending on the wavelength (spectral band) and the solar zenith angle. These data are:

- A tabulated text file providing the Rayleigh scattering according to the spectral band and the solar zenith angle (there are two files, one for Sentinel-2 and the other for Landsat-8);
- A text file giving the ozone absorption depending on the wavelength.

It is to be noted that the impact of the atmosphere on the TOA signal also depends on the altitude. For that reason, a global topographic map must be stored in the system. The SRTM is a digital elevation database provided by NASA at 90 m resolution, which is sufficient for this purpose. The module must have easy access to these data.

Input

Landsat-8 and Sentinel-2 are both compatible with this module and are taken as input. Within a processing chain, this module is called either as the first step or right after the cropping of the raw image, so the module must adapt to both possibilities. It is to be noted that if the image is cropped, the scene must cover at least 80 km² and include both water and land areas; otherwise, it is highly likely that the corrected reflectance will be wrong. In order to be compliant with both cropped and raw images, the function must be called indicating the path and filename format of the images to process and the metadata filename. Finally, an output directory must be indicated to the module.

Output

As an output, the module delivers:

- A GeoTIFF file for each atmospherically corrected spectral band, at the exact same resolution and coverage as the input files;
- A GeoTIFF binary mask for clouds.

These files are written to the output directory indicated in the command line. In this way, they may be used as input for the following module described in the Workflow Manager.

3.1.4 Radiometric Correction

3.1.4.1 Overall Description

This module is responsible for handling all radiometric correction operations for both SAR and optical imagery. In the context of the Co-ReSyF toolkit, radiometric correction means:

- for optical imagery:
  o procedures to pass from Digital Numbers (DN) to radiance or reflectance, correcting for sun angle and sensor sensitivity and, in the case of reflectance, normalizing it taking into account solar irradiance and Earth-Sun distance;
- for SAR imagery:
  o providing imagery in which the pixel values can be directly related to the radar backscatter of the scene, by deriving σ0 (sigma nought), β0 (beta nought) and γ0 (gamma nought) values from the Level 1 DN values.

It therefore excludes:

- geometry correction;
- radiometric inter-calibration between different sensors, which is included in the Image Inter-calibration tool;
- atmospheric correction, which is included in the Atmospheric Correction and Flagging tool.

3.1.4.2 User instructions

This feature will be available to all users, as a Co-ReSyF tool, through the Workflow Manager, or via a command line interface for expert users developing, testing or using their application in the Development Sandbox. It is part of the SAR or optical image pre-processing modules, usually starting from a Level-1 GeoTIFF raster file containing Digital Numbers and outputting images containing σ0, β0 and γ0 values in the case of SAR imagery, or reflectances/radiances for optical imagery.

The implementation is based on the SNAP framework and the Sentinel-1 toolbox batch processing capabilities, using the command line Graph Processing Tool (GPT). The Co-ReSyF tool encapsulates this tool using a Python script customized to its users' needs, allowing it, for instance, to automatically process full image directories.

In the case of SAR imagery, the processing chain uses the Sentinel-1 toolbox Radar Radiometric Calibration tool via the aforementioned GPT tool. Therefore, it is operational for all sensors currently supported by the toolbox function:

- Sentinel-1 (IW, EW, SM, SLC and GRD) products are fully supported;
- ASAR (IMS, IMP, IMM, APP, APS, APM, WSM) products are fully supported;
- ERS products (SLC, IMP) are fully supported;
- Radarsat-2 products are fully supported.

As mentioned in the SNAP help documentation, other third party SAR missions may not be fully supported for all modes.

For optical imagery, the implementation uses a similar SNAP GPT tool/Python script setup but, since no specific Radiometric Calibration function is available for optical sensors, a band-math operator that allows arbitrary mathematical expressions to be applied to raster imagery is used instead.

In the first version of this tool only Sentinel-2 and Landsat 8 will be supported. The algorithms for Landsat 8, required for this operation, are documented in the Landsat 8 (L8) Data Users Handbook. The Landsat 8 algorithm requires several parameters that have to be read from the metadata file; the Python script reads that file and extracts the necessary values to include them as variable input parameters in the GPT command line. For Sentinel-2, since the Sentinel SciHub only offers Level 1C products, which contain TOA reflectance scaled to integers, the operator only applies the scale factor (10000).

The command line call to the Python script has the following general layout:

command-name <inputfiles or inputlocation> <auxfiles or AuxFolder> <auxfilenamepattern> <outputparameters> <outputfolder>

The main and auxiliary inputs and the outputs present in this command line are described below.

Ancillary Data

For some SAR imagery (e.g. ENVISAT ASAR), auxiliary files containing the necessary calibration parameters might be required. For Landsat optical imagery, metadata files containing all necessary calibration parameters will be needed to perform the conversion from DN to radiance or reflectance values.

Input

<inputfiles or inputlocation>: SAR or optical raster image filename or, alternatively, a location containing all those images;
<auxfiles or AuxFolder>: Auxiliary (for some SAR images) or metadata (required for Landsat imagery) file name, or the location of those files in case of batch processing;
<auxfilenamepattern>: Filename pattern for auxiliary or metadata files (e.g. <image filename>_mtd.xml);
<outputparameters>: in the case of SAR imagery, the user can choose whether to produce only the σ0 images or also the β0 and γ0 images; in the case of optical imagery, the user can opt for a radiance or reflectance product (or both).

Output

As an output, the tool generates a single raster file with the appropriate radiometric corrections, or a set of them, depending on the chosen output parameters or whether an input location containing more than one file was given.
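For reference, the Landsat 8 DN-to-TOA-reflectance conversion performed through the GPT band-math operator follows the standard formulas of the L8 Data Users Handbook (reflectance = M_rho * DN + A_rho, then division by the sine of the sun elevation angle). A minimal standalone sketch is given below; the file names and the simplified MTL parsing are illustrative, not the tool's actual implementation.

    # Sketch of the Landsat 8 DN-to-TOA-reflectance conversion documented in
    # the L8 Data Users Handbook; file names and MTL parsing are illustrative.
    import math
    import re
    import numpy as np
    from osgeo import gdal

    def read_mtl_value(mtl_path, key):
        # Read a single numeric value (e.g. REFLECTANCE_MULT_BAND_4) from the MTL file.
        pattern = re.compile(r"^\s*" + key + r"\s*=\s*(\S+)", re.MULTILINE)
        with open(mtl_path) as f:
            return float(pattern.search(f.read()).group(1))

    def landsat8_toa_reflectance(band_tif, mtl_path, band_number):
        dn = gdal.Open(band_tif).ReadAsArray().astype(np.float64)
        m_rho = read_mtl_value(mtl_path, "REFLECTANCE_MULT_BAND_%d" % band_number)
        a_rho = read_mtl_value(mtl_path, "REFLECTANCE_ADD_BAND_%d" % band_number)
        sun_elev = read_mtl_value(mtl_path, "SUN_ELEVATION")
        rho = m_rho * dn + a_rho                        # TOA reflectance without sun-angle correction
        return rho / math.sin(math.radians(sun_elev))   # corrected for solar elevation

    # Example (hypothetical file names):
    # refl_b4 = landsat8_toa_reflectance("LC08_B4.TIF", "LC08_MTL.txt", band_number=4)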

3.1.5 Point measurements to gridded maps

3.1.5.1 Overall Description

The aim of this tool is to create regular grids from scattered point measurements. The rasterize algorithm from GDAL enables the conversion of attribute fields of vector geometries (points, lines and polygons) into the bands of a new raster image (gridded map).

3.1.5.2 User instructions

This feature will be available through the Workflow Manager, as a Co-ReSyF tool. It is part of a processing chain starting from an input vector layer and ending in the delivery of the final product, which could be the converted raster layer or the product of a geo-processing operation that requires an input raster layer. This tool is called via a simple command line which provides instructions about the input and the desired output raster file (new or existing). Using the capabilities of the Workflow Manager, the tool will be able to perform batch processing of multiple input files.

Input

- Input vector layer;
- Attribute field(s) to be converted into raster bands;
- Output raster width;
- Output raster height;
- Output raster data type (Byte, Int, Float...).

Output

As an output, the tool generates a new raster file (or updates an existing one) with the data contained within the attribute fields of the input vector layer.

3.1.6 Image crop

3.1.6.1 Overall Description

Geospatial images are usually very large, with multidimensional arrays, which impairs the performance of geo-processing tasks. In order to reduce the amount of processing time, these images are usually clipped to a specific area of interest. The gdalwarp algorithm (GDAL) allows clipping a raster based on any polygon shapefile. The extent of the cropped raster is defined according to the bounding box or limits of the polygon.

3.1.6.2 User instructions

This feature will be available through the Workflow Manager, as a Co-ReSyF tool. It is part of a processing chain starting from raster and vector layers (inputs) and ending in the delivery of the final product, which could be the output of the tool or the product of a geo-processing operation that requires a cropped raster.
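Before the command-line details that follow, a minimal sketch of the gdalwarp-based clipping described above, through the GDAL Python bindings, is given here; the file names and the no-data value are illustrative assumptions.

    # Sketch: clip a raster to a polygon shapefile with gdalwarp via the GDAL
    # Python bindings; file names and the no-data value are illustrative.
    from osgeo import gdal

    def crop_to_polygon(input_raster, cutline_shp, output_raster):
        gdal.Warp(output_raster, input_raster,
                  cutlineDSName=cutline_shp,   # polygon layer used as cutline
                  cropToCutline=True,          # shrink the output extent to the polygon limits
                  dstNodata=0)                 # assumed no-data value outside the polygon
        return output_raster

    # Example (hypothetical files):
    # crop_to_polygon("scene.tif", "aoi.shp", "scene_aoi.tif")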

This tool is called via a simple command line which provides instructions about the inputs and the raster file to which the cropped image will be written. Using the capabilities of the Workflow Manager, the tool will be able to perform batch processing of multiple input files.

Input

- Input vector layer used to clip the geospatial image;
- Input raster image.

Output

As an output, the tool generates a new raster file with the same extent/size as the input vector layer.

3.1.7 Image mask

3.1.7.1 Overall Description

The aim of this tool is to take any raster and use it as a mask to extract values from a geospatial image. The mask layer is a raster in which each pixel contains either no-data or a single valid value. The no-data pixels on the mask are assigned no-data values on the output raster; for the remaining pixels, the output raster contains the values extracted from the input image. The QgsRasterCalculator (QGIS API) allows performing mathematical operations on each pixel in a raster, which can be used to mask out part of a raster. An example might be a raster with elevation values ranging from below sea level to mountain tops: the raster calculator can be used to create a mask (or use an existing one) and apply it to the input raster to create a masked output raster.

3.1.7.2 User instructions

This feature will be available through the Workflow Manager, as a Co-ReSyF tool. It is part of a processing chain starting from the input raster layers (the original raster and the one to be used as a mask) and ending in the delivery of the final product, which could be the output of the tool or the product of a geo-processing operation that requires only part of a raster image. This tool is called via a simple command line which provides instructions about the inputs and the desired output file. Using the capabilities of the Workflow Manager, the tool is able to perform batch processing of multiple input files.

Input

- Input raster image from which pixel values will be extracted;
- Input mask raster defining the areas to extract.

The following expression may be used to perform the mask processing:

(input_mask <= threshold) * input_image

Another possible expression would be:

(input_mask = value1 OR input_mask = value2) * input_image

The first part of both expressions (in parentheses) defines the values of the input mask that will be set to either 1 or 0; this creates the possibility of using the input image to create a mask on the fly. In the second part of the expression, the raster (input_image) is multiplied by the 0 or 1 mask values, creating a masked raster image.

Output

As an output, the tool generates a new raster file with the same extent/size as the input mask raster and the pixel values extracted from the input raster image.

3.1.8 SAR speckle filtering

3.1.8.1 Overall Description

Speckle noise is a common phenomenon in SAR imagery as well as in other coherent imaging systems such as laser or acoustic systems. It originates from random interference in the radar backscatter signal due to the presence of multiple sub-pixel surface scatterers. Its presence diminishes the ability to extract meaningful signal from SAR imagery, which is crucial for information extraction of surface features and the application of automatic scene analysis algorithms. Several speckle filters have been developed to minimise this noise through spatial filtering or multilook processing. The implementation of a tool dealing with speckle filtering was considered a very important pre-processing feature for most coastal research applications dealing with SAR imagery, and for that reason a Speckle Filtering tool, based on the speckle filters available in the Sentinel-1 Toolbox, will be included in this first version.

3.1.8.2 User instructions

Since the Speckle Filtering tool is based on the functionalities available in the Sentinel-1 toolbox, its implementation follows a similar setup to the one described for the Radiometric Correction tool. It uses the command line based Graph Processing Tool (GPT) and the XML graph files that define the image processing chain to apply the different speckle filters to the SAR input images. Those GPT calls are encapsulated in a Python script customized to its users' needs, allowing it, for instance, to automatically process full image directories.

The command line call to the Python script will have the following general layout:

command-name <inputfiles or inputlocation> <specklefiltermethod> <auxparameters> <outputfolder>

The main inputs and the outputs present in this command line are described below.

Input

Required:
o <inputfiles or inputlocation>: SAR raster image filename or, alternatively, a location containing all those images;
o <specklefiltermethod>: an identifier of the speckle filter method the user would like to use. According to the Sentinel Toolbox documentation the following methods are available:
  - Boxcar (mean)
  - Median
  - Frost
  - Lee
  - Refined Lee
  - Gamma-MAP
  - Lee Sigma
  - IDAN

Optional:
o <auxparameters>: input parameter values required for the chosen speckle filter method (if none are given, default values will be used).

Output

As an output, the tool generates a single raster file, or several files if a location containing more than one file was given, with the results of the speckle filter operation.

3.1.9 Image statistics

3.1.9.1 Overall Description

Image statistics are required for a raster file to perform certain tasks, such as applying contrast enhancement or classifying data. QGIS provides various analysis and geo-processing algorithms using mostly only the QGIS API; therefore, almost all algorithms work out of the box without the need for additional configuration. The raster layer statistics algorithm is one of these, specifically designed to calculate basic statistics of a raster layer. The statistics are calculated for each raster band and include the minimum and maximum pixel values as well as the mean and standard deviation of the pixel values. Note that the zonal statistics algorithm can also be used if more advanced features are required, such as calculating statistics for the pixels of the input raster inside certain zones (defined by a polygon layer) or per raster band.
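For illustration, equivalent per-band statistics can also be computed with the GDAL Python bindings; the tool itself relies on the QGIS raster layer statistics algorithm, so the sketch below is only an alternative formulation with illustrative names.

    # Sketch: per-band minimum, maximum, mean and standard deviation with the
    # GDAL Python bindings (illustrative alternative to the QGIS algorithm).
    from osgeo import gdal

    def band_statistics(raster_path):
        ds = gdal.Open(raster_path)
        stats = {}
        for i in range(1, ds.RasterCount + 1):
            band = ds.GetRasterBand(i)
            # GetStatistics(approx_ok, force): force=1 computes exact statistics
            minimum, maximum, mean, stddev = band.GetStatistics(0, 1)
            stats[i] = {"min": minimum, "max": maximum, "mean": mean, "stddev": stddev}
        return stats

    # Example (hypothetical file):
    # print(band_statistics("scene.tif"))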

3.1.9.2 User instructions

This feature will be available through the Workflow Manager, as a Co-ReSyF tool. It is part of a processing chain starting from an input raster layer and ending in the delivery of the final product, which could be the file containing the statistical results or the product of a geo-processing operation that requires statistical information. This tool is called via a simple command line which provides instructions about the input and the HTML file to which the results will be written. Using the capabilities of the Workflow Manager, the tool is able to perform batch processing of multiple input files.

Input

The raster file to be analyzed, in a file format supported by GDAL: GeoTIFF, ArcInfo Binary Grid, ArcInfo ASCII Grid, and many more.

Output

As an output, the tool delivers a file in HTML format with the analysis results, such as maximum and minimum pixel values, sum, mean value, standard deviation, and number of valid pixel values. This file is stored in the output directory indicated in the command line and will be available for consultation in the Workflow Manager.

3.1.10 Reprojection and Coordinate System Definition tool

3.1.10.1 Overall Description

This tool is responsible for performing reprojection between different coordinate systems for all vector and raster data formats present in Co-ReSyF, as well as for attributing a coordinate system when this information is absent from the data file.

3.1.10.2 User instructions

The tool will be available through the Workflow Manager, as others in the Co-ReSyF toolkit. It will be based on the GDAL Python library for raster formats, namely the gdalwarp function, and on the OGR Python library for shapefiles, specifically the ogr2ogr function. This tool is called via a simple command line which provides instructions about the inputs and the desired output file. Using the capabilities of the Workflow Manager, the tool will be able to perform batch processing of multiple input files.

Input

- Input raster or vector data file;
- Source coordinate system (defined by its EPSG code and name): the current coordinate system of the input data file;
- Output coordinate system (defined by its EPSG code and name): the coordinate system the user wishes to transform the data into.
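A minimal sketch of the two reprojection paths just described (gdalwarp for rasters, ogr2ogr for shapefiles) is given below; the EPSG codes and file names are illustrative assumptions, not the tool's actual interface.

    # Sketch of the two reprojection paths described above; EPSG codes and
    # file names are illustrative.
    import subprocess
    from osgeo import gdal

    def reproject_raster(src, dst, dst_epsg):
        # gdalwarp equivalent through the GDAL Python bindings
        gdal.Warp(dst, src, dstSRS="EPSG:%d" % dst_epsg)

    def reproject_vector(src, dst, dst_epsg):
        # ogr2ogr command-line call: -t_srs sets the target coordinate system
        subprocess.check_call(["ogr2ogr", "-t_srs", "EPSG:%d" % dst_epsg, dst, src])

    # Example (hypothetical files):
    # reproject_raster("scene.tif", "scene_wgs84.tif", 4326)
    # reproject_vector("aoi.shp", "aoi_wgs84.shp", 4326)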

Output

As an output, the tool generates a new raster or vector file of the same type as the input, in the coordinate system defined as the output.

3.1.11 Error metrics

3.1.11.1 Overall Description

The aim of this tool is to calculate different error metrics, such as the error difference, root-mean-square error, bias and accuracy, between a classified map and data considered to be ground truth (reference data). The error metrics can be calculated against gridded observations (raster layer) or point observations (vector layer). The QgsRasterCalculator (QGIS API) allows performing mathematical operations on each pixel in a raster, which can be used to calculate the different error metrics between an input raster and the reference data.

3.1.11.2 User instructions

This feature will be available through the Workflow Manager, as a Co-ReSyF tool. It is part of a processing chain starting from the input layers (the original dataset and the reference data) and ending in the delivery of the final product, which could be the output of the tool or the product of a geo-processing operation that requires a matrix/raster with the error metrics. This tool is called via a simple command line which provides instructions about the inputs and the desired output file. Using the capabilities of the Workflow Manager, the tool is able to perform batch processing of multiple input files.

Input

- Input raster image: the comparison layer;
- Input reference data: the reference layer.

The mathematical expression used in the calculation is set according to the desired error metric.

Output

As an output, the tool generates a new raster file with the error values.
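As an illustration of the metrics named above, the sketch below computes the bias and root-mean-square error between a comparison raster and a reference raster assumed to share the same grid; the tool itself uses the QGIS raster calculator, and the function and file names here are illustrative.

    # Sketch of the error metrics named above (difference, bias, RMSE) between
    # a comparison raster and a reference raster of identical geometry;
    # illustrative NumPy formulation, not the tool's implementation.
    import numpy as np
    from osgeo import gdal

    def error_metrics(comparison_path, reference_path, nodata=None):
        comp = gdal.Open(comparison_path).ReadAsArray().astype(np.float64)
        ref = gdal.Open(reference_path).ReadAsArray().astype(np.float64)
        diff = comp - ref
        if nodata is not None:
            diff = diff[(comp != nodata) & (ref != nodata)]  # exclude no-data pixels
        bias = diff.mean()
        rmse = np.sqrt(np.mean(diff ** 2))
        return {"bias": bias, "rmse": rmse}

    # Example (hypothetical files):
    # print(error_metrics("classified.tif", "ground_truth.tif", nodata=0))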

3.1.12 Vector creation and edition

3.1.12.1 Overall Description

Vector creation and editing allows the user to draw or modify basic shapes (points, lines, polygons) in vector format on top of the map, which can be exported in GeoJSON or Shapefile formats, compatible with most software. This type of operation allows the user to provide further information on their research application results, for example by using points to mark objects of interest, lines to define data transects or polygons to define areas of interest. By exporting the data using standard formats, the user can share this work with other software or users.

3.1.12.2 User instructions

The vector creation and edition tool will be available through the Geoportal, where features can be interactively modified on top of existing layers of information a user might have. There are two ways to use this tool: either by using existing data (for example, in a CSV file), which can then be modified in the portal, or by drawing new features using the drawing tools available in the portal.

The drawn features are initially stored in a GeoJSON object, a standard format based on the JavaScript Object Notation (JSON), which enables the representation of geographic data such as:

- Points
- Lines
- Polygons
- Features
- Feature Collections

The user can choose to download the data in GeoJSON or convert the GeoJSON object into a shapefile. Shapefiles are binary files and cannot be directly edited or generated in JavaScript. In order to generate these files, the GeoJSON data must be sent through an HTTP request to a web service, which uses tools from GDAL to convert the data and return the generated shapefile.

Input

There are two possible inputs for the tool:
1. A file containing a representation of the data
2. Geographic features interactively drawn by the user

Output

The output is either a GeoJSON object or a shapefile.

3.1.13 Layer stack creation

3.1.13.1 Overall Description

The aim of this tool is to create a raster with different bands from independent raster files. The gdal_merge algorithm from GDAL enables the creation of a raster from a stack of images.

3.1.13.2 User instructions

This feature will be available through the Workflow Manager, as a Co-ReSyF tool. It is part of a processing chain starting from a set of raster layers and ending in the delivery of the final product, which could be the generated raster layer or the product of a geo-processing operation that requires an input raster with several bands. This tool is called via a simple command line which provides instructions about the inputs and the desired output raster file. Using the capabilities of the Workflow Manager, the tool is able to perform batch processing of multiple input files.

Input

Input raster files: the stack of raster images to be merged into one single raster file. The gdal_merge option -separate must be used to place each input file into a separate band.

Output

As an output, the tool generates a new raster file, where each band corresponds to a separate input raster file.

3.2 External Libraries and Third Party Tools

3.2.1 Overall Description

In this first version, the platform provides access to the following external libraries and third party tools:

- GDAL
- Python 2.7 or Python 3.4
- SNAP Sentinel-1, -2 and -3 Toolboxes
- QGIS

These libraries and tools are available to the user in the Sandbox development environment, which is described in detail in the Generic Framework Description and User Manual v1 document (Co-ReSyF, 2017).
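To illustrate how a toolkit component typically wraps one of these external utilities, the following minimal sketch performs the layer-stacking step of section 3.1.13 by calling gdal_merge.py with its -separate option; the file names are illustrative and the script is assumed to be available on the system path.

    # Sketch: stack several single-band rasters into one multi-band file with
    # gdal_merge.py and its -separate option (see section 3.1.13); file names
    # are illustrative.
    import subprocess

    def stack_bands(input_rasters, output_raster):
        cmd = ["gdal_merge.py", "-separate", "-o", output_raster] + list(input_rasters)
        subprocess.check_call(cmd)

    # Example (hypothetical files):
    # stack_bands(["band2.tif", "band3.tif", "band4.tif"], "stack.tif")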

3.2.2 User instructions

The instructions to install and use these external libraries and third party tools are covered in the Generic Framework Description and User Manual v1 document (Co-ReSyF, 2017).

4 References

Co-ReSyF (2016). System Detailed Design - Tools, issue 1.0. European Commission, Research Executive Agency.

Co-ReSyF (2017). Generic Framework Description and User Manual, issue 1.0. European Commission, Research Executive Agency.

END OF DOCUMENT