Module 11 Digital image processing


Geo-Information Science Practical Manual, Module 11: Digital image processing

Contents

11. Introduction
    Start the program Erdas Imagine
Part 1: Displaying an image data file
    Display of DN-range 0...255 (no stretch)
    Display after linear stretch of DN-range minimum...maximum
    Display after linear stretch of DN-range 40...90
    Display after standard deviation stretch
    Display of color composites
Part 2: Supervised classification
    Examining land cover types using spectral profiles
    Digitizing training areas & estimation of signatures
    Collecting signatures
    Evaluating signatures
    Land cover classification
    Minimum distance classification
    Maximum likelihood classification
    Updating a color palette
    Exporting to ArcGIS file format
    LGN Database
Image sources
Related internet sites

11. Introduction

The aim of the exercises in this module is to gain first experience in understanding remote sensing data by handling multi-spectral image data with the GIS and remote sensing package Leica Erdas Imagine for Windows. For the exercises we will use image data of Wageningen and its surroundings (Figure 5). This is a subset of a much larger scene (185 x 185 km²) taken by the remote sensing satellite Landsat-5 TM (Thematic Mapper) on 11 July 1995. A spatial subset of the entire scene, with seven spectral bands, is available (Table 1).

TM band | Spectral band   | Color name
1       | 0.45...0.52 µm  | blue
2       | 0.52...0.60 µm  | green
3       | 0.63...0.69 µm  | red
4       | 0.76...0.90 µm  | near-infrared
5       | 1.55...1.75 µm  | mid-infrared
6       | 10.4...12.5 µm  | thermal-infrared
7       | 2.08...2.35 µm  | mid-infrared

Table 1. The seven spectral bands of the Landsat-5 TM sensor.

The image covers an area of 15.3 x 15.3 km² and consists of 510 columns x 510 rows. Each pixel represents an area of 30 x 30 m²; the sensor of band 6 observes pixels with a size of 120 x 120 m². In this module you will practice different image processing techniques, including several display methods, the use of color composites and supervised classification. The results of a digital image classification can be used as input in a GIS.

In this module:
- An introduction to the software package Erdas Imagine.
- Displaying an image data file: stretching and color composites.
- Selecting training sites for classification.
- Collecting spectral signatures of training sites.
- Three supervised classification methods.

Objectives
After having completed this module you will be able:
- to understand the principle behind various image display techniques;
- to perform a supervised classification with Erdas Imagine;
- to describe the differences between three supervised classification methods.

Erdas Imagine images: Wag95.img, Meris_wag.img, Quickbird_27032002_rd.img
Literature: Remote Sensing reader, Jan Clevers (Ed.)

Start the program Erdas Imagine

1. Start the Erdas Imagine package: click Start, select Programs > Leica Geosystems GIS & Mapping > Erdas Imagine 8.7 > Erdas Imagine 8.7, or click the Erdas Imagine icon on the desktop.
2. Click Session in the main menu bar (Figure 1), then click Preferences.
3. Set Default data folder to: D:\IGI\...*\Erdas_imagine\data (* morning or afternoon).
4. Set Default output folder to: D:\IGI\...*\Erdas_imagine\workspace (* morning or afternoon).

On top of the Erdas Imagine window you see the main menu bar (Figure 1). Clicking one of the items of the menu bar opens a pull-down menu with a number of options.

Figure 1. The Erdas Imagine menu bar.

Just below the menu bar you see the viewer (Figure 2). The menu and icons in the viewer can be used to open an image and to apply basic viewer functions. If you move the cursor over an icon, a short description of its function appears in the lower left corner of the viewer.

Figure 2. The Erdas Imagine Viewer, where images are displayed.

For questions about the tools you will use, press the context-sensitive Help button in the dialogue box of the selected tool. The dialogue boxes often offer default settings; in general these are used, and you will be notified when they are not.

PART 1: Displaying an image data file

In the Erdas package an image data file is usually stored in the unsigned 8-bit (1 byte) data type, which means that integer values from 0 to 255 can be stored. Pixel values are often called DN-values (Digital Numbers): simply values without a unit. They represent distinct levels of electromagnetic radiation received by the sensor. Speaking in terms of attribute scales, this type of data belongs to the ratio category. The image data files have the extension .img and are accompanied by a .rrd file in which the so-called pyramid layers are stored; these are used for fast zooming and panning in the image.

To get familiar with image processing and remote sensing data we start by displaying and processing one image data file. In this exercise we use the Landsat-5 TM recording of band 4 (see Table 1). This band contains spectral information of a near-infrared band: 0.76...0.90 µm. You will find that different stretching techniques applied to the same image data file produce different pictures on the screen. The following cases will be investigated:
- display of DN-range 0...255;
- display of DN-range minimum...maximum;
- display after linear stretch of DN-range 40...90;
- display after linear stretch with saturation;
- displaying color composites.

Keep the resulting pictures on the screen to notice the differences!

Display of DN-range 0...255 (no stretch)

1. Click in the viewer menu bar File > Open > Raster Layer, or click the Open Layer button.
2. Select image wag95.img. DO NOT OPEN THE IMAGE YET!
3. Click the Raster Options tab (Figure 3), click the Display as dropdown arrow and select Gray Scale, select Layer 4, and switch on the No Stretch option.
4. Press the OK button to open the image.

Figure 3. The Raster Options tab.

This way, the grey scale palette produces a picture on the screen with 256 grey tones. The range in grey tones is a linear scale from black (DN-value 0) to white (DN-value 255); each DN-value of the image basically has its own grey value (Figure 4).

Figure 4. Principle of no stretch of image values (DN) into display levels.
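The no-stretch mapping of Figure 4 is simply an identity lookup table. As an aside for readers who like to see the mechanics in code, here is a minimal NumPy sketch (the array values are made up; the manual itself only uses the Erdas GUI):

```python
import numpy as np

# A small synthetic 8-bit band standing in for layer 4 of wag95.img
# (assumption: real data would be read with a raster library).
dn = np.array([[12, 40, 67],
               [85, 120, 255]], dtype=np.uint8)

# "No stretch": the lookup table maps every DN-value to itself,
# so DN 0 displays as black and DN 255 as white.
lut = np.arange(256, dtype=np.uint8)
display = lut[dn]

# With no stretch the LUT-value equals the DN-value for every pixel.
assert np.array_equal(display, dn)
```

The stretch functions in the following sections only change this lookup table, never the stored DN-values.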

Since not all DN-values from 0 up to 255 are present in the original image, not all grey tones are used in the picture. Although you will recognize Wageningen and its surroundings, the picture can be made brighter. First, however, examine the DN distribution of the image by displaying its histogram.

1.
1. Click the ImageInfo button in the standard toolbar, or click Utility > Layer Info.
2. The ImageInfo window opens, showing file, layer, statistics and map information.
3. Select layer 4.
4. Click the Histogram tab, or the histogram button in the toolbar of the ImageInfo window.
5. If the cursor is placed inside the histogram, three vertical lines are displayed showing the minimum, maximum and mean values.

a. What is plotted on the horizontal axis and what on the vertical axis?
b. Write down the values for minimum, maximum, mean and standard deviation for band 4.

When an image is displayed on the screen, the DN-values (File Pixel values) are translated to grey tones (Lookup Table (LUT) values). When an image is displayed without stretch, the DN-value is the same as the LUT-value (Figure 4). You can view these values with the Inquire Cursor (click Utility > Inquire Cursor). It is important to zoom in to a level where you can distinguish the individual pixels.

2. a. Check the DN-values and LUT-values of water, grass, forest and heath land. You can find the locations of these objects in Figure 5. Write down the values in Table 2 in the "no stretch" columns.
b. Which cover types have a DN-value of less than 30 in band 4? Explain this in terms of absorption/reflectance.
c. Which two factors determine the grey tone of a pixel on the screen?
d. Where is the origin of the column/row coordinate system?

Land cover type | DN-value (no stretch) | LUT-value (no stretch) | DN-value (linear stretch) | LUT-value (linear stretch)
Water           |                       |                        |                           |
Grass           |                       |                        |                           |
Forest          |                       |                        |                           |
Heath land      |                       |                        |                           |

Table 2. DN-values of land cover types using different display techniques.
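The statistics and histogram shown in the ImageInfo window can be illustrated with a short NumPy sketch (the band below is synthetic; the real values for band 4 come from wag95.img via the GUI):

```python
import numpy as np

# Synthetic 510 x 510 band standing in for band 4 of wag95.img (assumption).
rng = np.random.default_rng(0)
band = rng.integers(20, 140, size=(510, 510)).astype(np.uint8)

# The same statistics the ImageInfo window reports per layer:
stats = {
    "min": int(band.min()),
    "max": int(band.max()),
    "mean": float(band.mean()),
    "std": float(band.std()),
}

# The histogram plots DN-value (horizontal axis) against the number of
# pixels having that value (vertical axis); one bin per possible DN.
counts, _ = np.histogram(band, bins=256, range=(0, 256))
assert counts.sum() == band.size  # every pixel falls in exactly one bin
```

This also answers question 1a in code form: DN-values along the horizontal axis, pixel counts along the vertical axis.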

Figure 5. Selected training fields in the Landsat TM band 5 scene of 11 July 1995 of the area around Wageningen.

Display after linear stretch of DN-range minimum...maximum

Within Erdas Imagine an option is available to stretch the original DN-values of the image for display. The minimum DN-value of the image is presented by the minimum grey tone (black) on the screen; the maximum DN-value gets the maximum grey tone (white) if the grey tone palette with 256 levels is used (Figure 6).

Figure 6. Principle of linear stretch of image values (DN) into display levels. The linear relationship between DN-value and LUT-value is in this example: LUT = 2.60*DN - 156.

3.
1. To apply a linear stretch to your image, click in the viewer menu bar: Raster > Data Scaling.
2. Make sure Linear is selected in the Binning field.
3. Replace the values for Min and Max with the minimum and maximum DN-values from the image info.
4. Click OK.

a. Check the DN-values and LUT-values of the four land cover types again. Add these values to Table 2 in the "linear stretch" columns.
b. Can you explain the changes in LUT-values? Explain why some land cover types get a higher LUT-value, while other land cover types get a lower one.

Display after linear stretch of DN-range 40...90

Linear stretch between the minimum and maximum DN-values improves the image somewhat compared to the image without stretch, but the contrast is still relatively low. The histogram shows that the majority of the DN-values lie between 40 and 90. You can gain more contrast in your image by emphasizing this DN-range on your screen.

4. Use the Data Scaling function to apply a linear stretch of the DN-range from 40 to 90.
a. Which land cover types can you distinguish now that more grey tones are spent on a smaller DN-range?
b. Investigate the DN- and LUT-values of the four land cover types. Explain the linear stretch principle.
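The linear stretch principle can be sketched in a few lines of NumPy. This is an illustration only, not the Erdas implementation; the chosen DN-ranges are assumptions that happen to reproduce the example relationship LUT = 2.60*DN - 156 from Figure 6:

```python
import numpy as np

def linear_stretch_lut(dn_min, dn_max):
    """Build a 256-entry lookup table that maps dn_min -> 0 (black) and
    dn_max -> 255 (white); DN-values outside the range saturate."""
    dn = np.arange(256, dtype=float)
    lut = (dn - dn_min) / (dn_max - dn_min) * 255.0
    return np.clip(np.round(lut), 0, 255).astype(np.uint8)

# The example LUT = 2.60*DN - 156 corresponds to stretching roughly the
# DN-range 60...158, since 255 / (158 - 60) ~= 2.60 and 2.60 * 60 = 156.
lut = linear_stretch_lut(60, 158)
assert lut[60] == 0 and lut[158] == 255

# The 40...90 stretch of exercise 4 spends all 256 grey tones on the
# range where most DN-values lie; values outside it saturate.
lut2 = linear_stretch_lut(40, 90)
assert lut2[40] == 0 and lut2[90] == 255
assert lut2[30] == 0 and lut2[200] == 255  # saturation at both ends
```

Displaying a stretched band is then just `lut[band]`, exactly as in the no-stretch sketch earlier, only with a steeper lookup table.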

Display after standard deviation stretch

The standard deviation stretch is based on the idea that stretching the DN-range from the minimum up to the maximum value may not give a good picture, because the minimum and/or maximum value may be an unfortunate extreme. This function results in a linear stretch between -2 and +2 standard deviations from the mean. In practice this means that on both sides of the histogram roughly 2.5% of the observations are skipped, so single observations with very low or very high values are ignored during the stretching. The standard deviation stretch is the default stretch function in Erdas Imagine.

1. Open a new viewer and select band 4 for display in grey scale, but do not switch on the No Stretch option this time. The image will now be opened with the standard deviation stretch.
2. This stretch function can also be accessed through the menu bar: click Raster > Contrast > Standard Deviation Stretch.
3. You can use Tile Viewers to arrange the viewers in one screen. Click in the viewer menu bar: View > Tile Viewers.

5. Open four viewers (click the Viewer button in the main menu bar) and put the four pictures with different image stretching next to each other.
a. Compare layer 4 of wag95.img with the different stretching options. Which stretch function gives, in your opinion, the best picture? Display layer 3 of wag95.img according to your answer to exercise 5a in a new viewer.
b. Which grey tone has grassland (see e.g. the meadows near the river) in band 3; is this different from band 4? In what way? Explain the difference (remember the typical spectral signature of green vegetation).

6. Landsat-5 TM band 6 contains the thermal-infrared image data. Display layer 6 of wag95.img. You can re-open the image, or change the band which is displayed.
a. Why is the image of band 6 so coarse?
b. Which cover type has a relatively low temperature and which one a relatively high temperature?

Close all viewers.
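The standard deviation stretch can likewise be sketched in NumPy. This is an illustrative implementation under the assumption of a mean +/- 2 sigma clip, as described above; the synthetic band and its outlier pixels are made up:

```python
import numpy as np

def std_dev_stretch(band, n_std=2.0):
    """Linear stretch between mean - n_std*sigma and mean + n_std*sigma,
    so that extreme outliers no longer compress the displayed grey range."""
    mean, std = band.mean(), band.std()
    lo, hi = mean - n_std * std, mean + n_std * std
    stretched = (band.astype(float) - lo) / (hi - lo) * 255.0
    return np.clip(np.round(stretched), 0, 255).astype(np.uint8)

# Synthetic band with two unfortunate extremes (assumption):
rng = np.random.default_rng(1)
band = np.clip(rng.normal(65, 12, (100, 100)), 0, 255).astype(np.uint8)
band[0, 0], band[0, 1] = 0, 255

out = std_dev_stretch(band)
# The outliers saturate to the ends of the grey scale instead of
# dictating the stretch, as a min...max stretch would let them do.
assert out[0, 0] == 0 and out[0, 1] == 255
```

Compare this with the min...max stretch: a single DN-value of 0 or 255 would drag the whole lookup table with it there, while here it is simply clipped.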

Display of color composites

Color composites of remote sensing data can be very helpful during investigation and interpretation in the field, or for presentation purposes. A color composite displays three spectral bands simultaneously.

1. Open a viewer.
2. Add a raster layer to the viewer and click the Raster Options tab.
3. Display as: True Color.
4. Attach bands to the Red, Green and Blue colors.
5. Click OK.
6. When a color composite is opened, you can always change the band combination: click Raster > Band Combinations.
7. Change the spectral bands for the three channels. If the Auto Apply box is ticked, band changes appear immediately on screen.

Note: the terminology used by Erdas Imagine may be confusing. Using the option True Color in the selection menu does not mean that you display a true-color image; that depends on the spectral bands you attach to the Red, Green and Blue channels respectively.

7. Open three color composites of image wag95.img with the band combinations described in Table 3.
a. Why are the composites called true, false or pseudo color?
b. Check the colors of the cover types water, forest and bare soil in each composite. Write your findings down in Table 4.
c. Which band combination (color composite) shows the largest contrast between the different land cover types? Why?

Close all viewers.

Composite    | Red    | Green  | Blue
True Color   | 3      | 2      | 1
False Color  | 4      | 3      | 2
Pseudo Color | e.g. 4 | e.g. 5 | e.g. 3

Table 3. Band combinations of three types of color composites.

Composite    | Water | Forest | Bare soil
True Color   |       |        |
False Color  |       |        |
Pseudo Color |       |        |

Table 4. Land cover colors in each composite.
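The note above has a compact code analogue: a composite is nothing more than stacking three bands into the red, green and blue channels, and the band-to-channel assignment, not the menu label, decides what kind of composite you get. A NumPy sketch with synthetic bands (assumption):

```python
import numpy as np

# Three synthetic 8-bit bands standing in for TM bands 2, 3 and 4.
rng = np.random.default_rng(2)
b2, b3, b4 = (rng.integers(0, 200, (64, 64)).astype(np.uint8)
              for _ in range(3))

# False-color composite as in Table 3: band 4 -> Red, 3 -> Green, 2 -> Blue.
false_color = np.dstack([b4, b3, b2])

# A true-color composite would use bands 3, 2, 1 in the same way;
# the stacking mechanics are identical.
assert false_color.shape == (64, 64, 3)
assert np.array_equal(false_color[..., 0], b4)  # Red channel holds band 4
```

In a false-color composite the near-infrared band drives the red channel, which is why healthy vegetation appears bright red.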

PART 2: Supervised classification

Supervised classification is one of the techniques to transform remote sensing data into useful thematic information that can be used as input to a geographic information system. In preparation for a supervised classification, you decide beforehand which cover types must be classified and select proper training areas. These training areas are known cover types, based on field visits or general knowledge of parts of the area. Since we assume that you have some knowledge of the area around Wageningen, you will make several classifications without extensive fieldwork.

Statistical characteristics of the spectral data of the selected training areas are recorded in signature files. These signature files are then used by the classification method to derive the class boundaries for each cover type in the feature space; the actual classification of all pixels is performed in this feature space. The following activities will be executed:
- examining spectral profiles;
- digitizing training areas;
- estimation of signatures;
- classifications;
- updating a color palette (optional exercise).

8. a. Give a description of a 2-dimensional feature space. What is plotted on the axes of the feature space?

Examining land cover types using spectral profiles

You will start the classification procedure by examining the spectral profiles of several land cover types.

9.
1. Open the spectral profile tool. Click in the viewer menu bar: Raster > Profile Tools.
2. Click Spectral and click OK. The Spectral Profile window opens.
3. Click to activate the inquire tool.
4. Click with the inquire cursor on a land cover type in the image. The spectral profile of this pixel will be drawn in the graph; the line represents the value of the selected pixel in each band (Figure 7).
5. To display wavelength on the x-axis, click Edit > Use Sensor Attributes, click the Sensor type dropdown arrow and select landsattm.

Try to locate a few different land cover types (water, forest, agricultural land and town) and show their spectral profiles in the graph.
a. Which two bands show the largest difference in pixel value between water and vegetation?
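Conceptually, the Spectral Profile tool just reads one pixel position out of every band. A small NumPy sketch with a synthetic 7-band cube (assumption; real data would come from wag95.img via a raster library):

```python
import numpy as np

# Synthetic image cube with shape (bands, rows, cols) standing in for
# the seven TM bands of wag95.img (assumption).
rng = np.random.default_rng(3)
cube = rng.integers(0, 256, size=(7, 510, 510)).astype(np.uint8)

def spectral_profile(cube, row, col):
    """Return the DN-value of one pixel in every band - the line the
    Spectral Profile tool draws when you click that pixel."""
    return cube[:, row, col]

profile = spectral_profile(cube, 100, 200)
assert profile.shape == (7,)  # one DN-value per spectral band
```

Plotting such a profile against the band centre wavelengths gives exactly the curves of Figure 7.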

Figure 7. Spectral profiles of three land cover types.

Digitizing training areas & estimation of signatures

During the first phase of the classification process you choose a band combination that clearly discriminates between most land cover types, in order to digitize training fields of the cover types you are going to classify: grass, bare soil, deciduous forest, pine forest, heather, maize, town and water. Representative examples of these cover types are shown in Figure 5. You will use user-defined polygons in the image to select the training samples.

Note: training areas are in general small areas of at least 25 pixels. They should be chosen as pure (homogeneous) as possible; if you digitize e.g. a training site of water in the river, do not include the river borders!

Collecting signatures

1. Open a new viewer, display your most expressive composite (see your answer to exercise 7a) and zoom in to get a more detailed look at the picture during digitizing.
2. Click in the main menu bar and click Signature Editor... A new window opens; move it so the area with the training fields can be seen clearly.
3. Click in the viewer menu bar AOI > Tools...
4. Click the AOI Tool palette button to create a polygon. Draw a polygon in one of the training areas (see Figure 5). Digitize polygon points by clicking the LMB (left mouse button) and finish by double-clicking the LMB.

5. Click in the Signature Editor the button to add the signature of the digitized training area to the signature list.
6. Give this signature a name according to the land cover (e.g. Water, Beets, Town, etc.). Notice that the color assigned to this class is the same as the color inside the AOI in the default display (R=4; G=3; B=2). You can change the color combination if you wish.

10. a. Digitize the 8 training areas (7 indicated in Figure 5, plus the class town) according to the steps described above, and add the signatures to the signature list. Save the signature file in the workspace folder located in the Erdas Imagine folder. Name the signature file wag95_your_name.sig.

Evaluating signatures

Before you perform a classification you need to study the signatures to get an accurate idea of the position and size of the classes in the feature space. You can present the results of the signature computation in a mean plot or a histogram, compare the signatures of the different cover types, and see whether they are well separated. If they are not, perhaps you did not choose the correct training area, or growth conditions differed, or a registration error was made during the field visit at the time of image recording. This way you can also get an idea whether it is useful to perform the classification with all available bands.

For this exercise you need a viewer with the source image wag95.img and the Signature Editor with wag95_your_name.sig. Mark the signature you want to investigate by clicking its row in the column with the > mark. In the Erdas Imagine package the signatures can be studied in different ways.

Add statistical data

1. Click in the Signature Editor window View > Columns; the View Signature Columns window opens.
2. Select all rows except red, green and blue, click Statistics and click min, max and mean in the Column Statistics window.
3. Click Apply in the View Signature Columns window; close this window and the Column Statistics window.
4. If you move the slide bar in the Signature Editor window to the right, you will see that all statistical values appear.

11. a. Which spectral bands show the clearest (spectral) distinction between land use classes?

Show the mean value(s) in a graph

1. Click in the Signature Editor window View > Mean Plots...; the Signature Mean Plot window opens. Depending on the option you choose, you can display the marked signature, selected signatures or all signatures. You can select more than one signature by keeping the Shift key down during selection in the Signature Editor.

Show histograms

1. Click in the Signature Editor window View > Histograms...; the Histogram Plot Control Panel opens and the histogram of the first band of the marked signature appears.
2. Select in the Signature Editor the classes you want to display if you want to visualize multiple classes in one plot.
3. The options chosen in the Histogram Plot Control Panel are activated when you click the Plot... button.

12. a. Check the separability of the classes in all spectral bands by examining the histograms.
b. Which bands can be used to differentiate between deciduous and pine forest?
c. Which land use classes will be hard to distinguish?
d. What is the consequence of poorly distinguishable spectral signatures during classification?
e. Suppose you could only use three spectral bands for land use classification. Which three bands would you choose?
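What a signature actually stores can be illustrated with a short NumPy sketch: per class, the statistics over its training pixels. The class names and pixel values below are hypothetical stand-ins for your digitized AOIs:

```python
import numpy as np

# Hypothetical training pixels per class: (n_pixels, n_bands) arrays
# standing in for the pixels inside a digitized AOI polygon (assumption).
rng = np.random.default_rng(4)
training = {
    "water":  rng.normal([30, 20, 15, 10, 8, 100, 6], 3.0, (30, 7)),
    "forest": rng.normal([40, 35, 30, 90, 60, 95, 40], 5.0, (30, 7)),
}

# A signature holds, per class, the statistics the Signature Editor
# shows (min, max, mean) plus the band covariance matrix, which the
# maximum likelihood rule will need later on.
signatures = {
    name: {
        "min": px.min(axis=0),
        "max": px.max(axis=0),
        "mean": px.mean(axis=0),
        "cov": np.cov(px, rowvar=False),
    }
    for name, px in training.items()
}
assert signatures["water"]["cov"].shape == (7, 7)
```

Comparing the per-band means and spreads of two classes is exactly what the mean plots and histograms let you do visually: classes whose distributions overlap in every band will be hard to separate.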

Land cover classification

For classification of a remote sensing image, the Erdas Imagine package is equipped with parametric and non-parametric decision rules. The difference between these decision rules will be treated in more detail during the course Remote Sensing (GRS 20306). For the classifications in this module you will use the parametric decision rules Minimum Distance and Maximum Likelihood.

1. The supervised classification is started from the Signature Editor. Click Classify > Supervised... The Supervised Classification window opens (Figure 8).
2. Select the Input Raster File: the image you want to classify.
3. Select the Input Signature File: the file in which you stored the spectral signatures of the training areas.
4. Give the output image a name in the Output File box.
5. Select the classification decision rules: Non-parametric Rule, Overlap Rule, Unclassified Rule and Parametric Rule.
6. Click OK.

Figure 8. The Supervised Classification window, where you name the output files and set the decision rules.

13. a. Open the Supervised Classification window. Which Parametric Rules are available?

Minimum distance classification

The minimum distance classification method assigns a pixel to the land cover class whose mean signature value lies closest to the pixel in the feature space.

14. Carry out a supervised classification of wag95.img with the Minimum Distance (MD) rule. Name the output image wag95-md.img. Use the following classification settings:
Non-parametric Rule: None
Overlap Rule: -
Unclassified Rule: -
Parametric Rule: Minimum Distance

a. Display the classification result in a new viewer and notice that all pixels are classified. Note that this image has nothing to do with spectral reflectance: you are looking at a land cover map, where pixel values indicate a land cover class.
b. Table 5 lists four control points. Write down the land cover class of each control point in the MD column. Use the inquire cursor tool to retrieve the value at the control point; the value corresponds to a land cover class.

Location | Map X | Map Y | MD | MLHD
1        | 435   | -109  |    |
2        | 217   | -267  |    |
3        | 454   | -206  |    |
4        | 443   | -395  |    |

Table 5. Results of different classification methods.

Maximum likelihood classification

The maximum likelihood classification method is based on the probability that a pixel belongs to a particular class.

15. Carry out a supervised classification of wag95.img with the Maximum Likelihood (MLHD) rule. Name the output image wag95-mlhd.img. Use the following classification settings:
Non-parametric Rule: None
Overlap Rule: -
Unclassified Rule: -
Parametric Rule: Maximum Likelihood

a. Display the classification result in a new viewer.
b. Write down the land cover class of each control point in the MLHD column of Table 5. Use the inquire cursor tool to retrieve the value at the control point; the value corresponds to a land cover class.
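The two parametric decision rules used above can be sketched compactly in NumPy. This is an illustrative implementation of the textbook versions of both rules (Euclidean minimum distance; Gaussian maximum likelihood with per-class mean and covariance), not the Erdas code; the two-band toy classes are assumptions:

```python
import numpy as np

def minimum_distance(pixels, means):
    """Assign each pixel to the class whose signature mean is nearest
    in the feature space (Euclidean distance).
    pixels: (n, bands); means: (classes, bands)."""
    d = np.linalg.norm(pixels[:, None, :] - means[None, :, :], axis=2)
    return d.argmin(axis=1)

def maximum_likelihood(pixels, means, covs):
    """Assign each pixel to the class with the highest Gaussian
    log-likelihood, using each signature's mean and covariance."""
    scores = []
    for mu, cov in zip(means, covs):
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        diff = pixels - mu
        quad = np.einsum("ij,jk,ik->i", diff, inv, diff)
        scores.append(-0.5 * (logdet + quad))
    return np.argmax(np.stack(scores, axis=1), axis=1)

# Two toy 2-band classes (assumption): class 0 centred near (10, 10),
# class 1 near (50, 60), with different spreads.
means = np.array([[10.0, 10.0], [50.0, 60.0]])
covs = [np.eye(2) * 4.0, np.eye(2) * 9.0]
pixels = np.array([[11.0, 9.0], [52.0, 58.0]])

assert list(minimum_distance(pixels, means)) == [0, 1]
assert list(maximum_likelihood(pixels, means, covs)) == [0, 1]
```

The sketch also shows why the two rules can disagree: minimum distance ignores the class covariances entirely, while maximum likelihood lets a class with a large spread claim pixels that lie further from its mean.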

Updating a color palette

It may be that the colors of the different land cover classes are not well chosen. You can change these colors; note that changing colors this way is only possible for thematic data.

1. Click in the viewer menu bar Raster > Attributes. The Raster Attribute Editor opens.
2. Click a colored cell in the Color column and select a color from the list.

16. Change the colors of one of the classification results to land use map colors (town = red, forest = dark green, water = blue, etc.).

Exporting to ArcGIS file format

The classification result can be used for further analysis, which is usually done with GIS software such as ArcGIS. To avoid processing problems it is advised to export Imagine files in .img format to the ArcGIS raster (GRID) format.

17.
1. Click Import in the main menu bar. Select the classification result you want to use for further analysis and export it to GRID format. Save it in the proper location (the ArcGIS workspace folder).
2. Open the exported file in ArcGIS (ArcMap or ArcCatalog).

LGN Database

Many remote sensing satellites collect remote sensing data. Image data from the Landsat-5 TM satellite is used, in a way comparable to what you just did, to make land-use classifications for the "Landelijke Grondgebruiksclassificatie van Nederland" (LGN). The LGN database covers The Netherlands and is created and updated at the Centre for Geo-Information of Wageningen University and Research Centre (WUR); the data can be obtained from the Geodesk of the Centre for Geo-Information. The LGN database is updated on a regular basis and is very useful for all kinds of applications, e.g. planning and environmental scenario studies. A part covering the surroundings of Wageningen has been copied from LGN4 (2000) and is available as waglgn4.img.

18. Give a few reasons why waglgn4.img and your classification results are not exactly the same.

Image sources

In the previous part of the exercise you investigated a Landsat TM image, with a pixel size of 30 meters and 7 spectral bands. There are, however, numerous sensors, each with their own specifications; the applications for which the recorded images can be used depend on the spatial, spectral and temporal resolution of the sensor. Compare three images of different sensors and describe the strong and weak points of each data set. The available images are a MERIS, a Landsat TM and a Quickbird image, all covering the area around Wageningen.

MERIS has a high spectral and radiometric resolution and a dual spatial resolution: 1200 m and 300 m.

MERIS band | Band centre (nm) | MERIS band | Band centre (nm) | MERIS band | Band centre (nm)
1          | 412.5            | 6          | 620              | 11         | 760
2          | 442.5            | 7          | 665              | 12         | 775
3          | 490              | 8          | 681.25           | 13         | 865
4          | 510              | 9          | 705              | 14         | 890
5          | 560              | 10         | 753.75           | 15         | 900

Table 6. The spectral bands of the MERIS sensor.

Quickbird images are either panchromatic, with a spatial resolution of 0.61 m, or multi-spectral, with a pixel size of about 2.5 m.

Quickbird band | Spectral band   | Colour name
1              | 0.45...0.52 µm  | blue
2              | 0.52...0.60 µm  | green
3              | 0.63...0.69 µm  | red
4              | 0.76...0.90 µm  | near-infrared

Table 7. The spectral bands of the Quickbird sensor.

19.
1. Open a false color composite of each of the three images (Meris_wag.img, wag95.img and Quickbird_27032002_rd.img) in separate viewers.
2. Investigate the spectral profiles of several land cover types in the three images.
3. To display wavelength on the x-axis, click Edit > Use Sensor Attributes, click the Sensor type dropdown arrow and select MERIS, landsattm or QuickbirdMS respectively.

a. Which bands did you select for each false color composite?
b. Are the images geometrically and atmospherically corrected? How did you determine this?
c. Discuss the strong and weak points of the three images in terms of spatial, spectral and temporal resolution.

d. Why is it not possible to build a space-borne remote sensing sensor that has a good spatial, spectral and temporal resolution at the same time?

Related internet sites

More concerning LGN is available at http://www.lgn.nl/. The following paper gives an elaborate description of the creation of the LGN database: http://www.dow.wur.nl/internet/webdocs/internet/geoinformatie/lgn/isprs_2000_lgn3.pdf

For more on the basic principle and display of color composites, check: http://chesapeake.towson.edu/data/all_composite.asp
