Making NDVI Images using the Sony F717 Nightshot Digital Camera and IR Filters and Software Created for Interpreting Digital Images.

Draft 1
John Pickle, Museum of Science
October 14, 2004

Digital Cameras

Digital cameras that have a CCD (Charge Coupled Device) sensor are sensitive to a broad range of the electromagnetic spectrum, from the ultraviolet to the near infrared (for more information on how digital cameras work, see http://www.howstuffworks.com/digital-camera2.htm). Scientists and artists take advantage of this sensitivity to make images of nature that appear to be from other worlds. A valuable Internet site that describes how to make such pictures, with many fascinating examples, is http://www.naturfotograf.com/uv_ir_rev00.html. For brevity, IR or infrared will refer only to near infrared light, not thermal infrared (also commonly referred to as heat). Wavelengths of visible light range from 400 to 700 nm, while the infrared ranges from 700 to 300,000 nm, a much greater range than what we sense with our eyes! A good discussion of infrared light is at http://imagers.gsfc.nasa.gov/ems/infrared.html.

Most digital cameras with a CCD have an IR-blocking filter placed in front of the sensor so that only visible light is used to make the picture. Ultraviolet light is almost completely filtered out by the camera's glass lens. Some Sony cameras can remove the IR-blocking filter with the flip of a switch, so the camera can take pictures in both visible and IR light; this option is called NightShot. Using filters that block visible light yet are transparent to the near infrared, you can make high-quality digital images using only near infrared light. Other digital cameras may still take IR pictures through a visible-light-blocking filter, but exposure times are often quite long (on the order of several seconds), which requires very calm wind conditions when photographing plants and landscapes outdoors. Also, the glass lenses are designed to focus visible light onto the CCD, not infrared, so the focus must be adjusted to a slightly shorter distance to make a crisp image at the longer wavelengths. Most digital cameras are not designed to do this easily, which often results in poor image quality.

Light Reflected from Plants

Plants absorb red and blue light for photosynthesis, yet reflect large amounts of green and infrared. The following gray-scale images illustrate the amount of red, green, blue, and infrared light various land covers reflect. The lighter the gray, the greater the amount of light being reflected; the darker the object, the greater the amount being absorbed.

[Figure panels: Color Image; Green Light in Shades of Gray; Red Light in Shades of Gray; Blue Light in Shades of Gray]

Notice that plants absorb more red and blue light, which are used in photosynthesis, and reflect larger amounts of green light. Hence, plants appear green! Three of the Interpreting Digital Images software programs can be used to create these images: ColorPicture, ImageAnalysis (currently being enhanced as the MVHimage version), and SurfaceTypeRGB. The latter two programs are designed for this task: open the software, select an image, change the display to the proper enhancement, and save the new image (don't forget to rename the image from its original name). ColorPicture requires that you understand how to make a grayscale (black and white) image.
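For readers who prefer to see the channel separation spelled out, the following is a minimal sketch, not part of the Interpreting Digital Images suite, assuming Python with the Pillow and NumPy libraries; the file names are hypothetical.

# Minimal sketch (not part of the Interpreting Digital Images programs):
# split a color photo into red, green, and blue grayscale images.
# Assumes Python with Pillow and NumPy; "garden_visible.jpg" is a hypothetical file name.
import numpy as np
from PIL import Image

rgb = np.asarray(Image.open("garden_visible.jpg").convert("RGB"))

for index, name in enumerate(("red", "green", "blue")):
    band = rgb[:, :, index].copy()                 # one channel, values 0-255
    Image.fromarray(band, mode="L").save(f"garden_{name}_gray.png")

Each saved file is a grayscale image of a single color band, comparable to the panels shown above.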

For the NASA-funded Measuring Vegetation Health project, we have been experimenting with collecting pairs of visible and infrared images of plants and landscapes and combining the information in both images to visualize and analyze vegetation stress. The camera was placed on a tripod on a relatively sunny, calm day, and images were made both with the visible-light-blocking filter in NightShot mode and without the filter in regular visible-light mode (with the internal IR-blocking filter in front of the CCD sensor). The pairs of images were taken within seconds of each other so the sun angle and lighting conditions did not change appreciably. The corresponding IR image of the scene above follows:

[Figure: Near Infrared Light in Shades of Gray]

Notice that the vegetation reflects most of the incoming infrared light, yet the pavement behind the fountain absorbs much more. Water absorbs most of the incoming IR, but in this example the mirror-like surface of the water is reflecting infrared that is reflected from the plants.
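The article does not spell out how the NightShot photograph is reduced to the single "Near Infrared Light in Shades of Gray" band. One simple approach, sketched below under the same Python/Pillow/NumPy assumptions and with hypothetical file names, is to average the three channels, since the filtered NightShot image is nearly monochrome.

# Sketch only: reduce the NightShot (IR) photo to one grayscale band by averaging
# its three channels. The original article does not specify this step;
# "garden_ir.jpg" and the output name are hypothetical.
import numpy as np
from PIL import Image

ir_rgb = np.asarray(Image.open("garden_ir.jpg").convert("RGB")).astype(float)
ir_gray = ir_rgb.mean(axis=2).round().astype(np.uint8)   # average of R, G, B
Image.fromarray(ir_gray, mode="L").save("garden_ir_gray.png")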

Vegetation Index

An additional software program, SatelliteImageMaker, was created so that these sets of grayscale images could be brought together to create one of the standard color composites used for Landsat images: infrared displayed as red, red light displayed as green, and green light displayed as blue on the computer screen. This color composite, which is completely false in color with respect to what humans see, is often referred to as NRG (Near-infrared, Red, Green color scheme). Healthy vegetation reflects much of the IR and absorbs large amounts of the red for photosynthesis, so in this color composite plants appear bright red to pink. Notice that the originally red flowers appear yellow in this composite: infrared (now displayed as red) and red light (now displayed as green) are both reflected, while green (displayed as blue) is absorbed, and red and green light make yellow. To make these images, open SatelliteImageMaker and sequentially select which grayscale image you want to view in each layer of the displayed image:

IR grayscale image -> Red layer of displayed image
Red grayscale image -> Green layer of displayed image
Green grayscale image -> Blue layer of displayed image
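The same layer assignments can be mimicked outside of SatelliteImageMaker. Below is a minimal sketch, under the same Python/Pillow/NumPy assumptions and with hypothetical file names, that stacks the three grayscale images into an NRG composite.

# Sketch of the NRG composite that SatelliteImageMaker builds interactively:
# IR -> red layer, red -> green layer, green -> blue layer.
# File names are hypothetical; the three grayscale images must share the same dimensions.
import numpy as np
from PIL import Image

ir    = np.asarray(Image.open("garden_ir_gray.png").convert("L"))     # shown as red
red   = np.asarray(Image.open("garden_red_gray.png").convert("L"))    # shown as green
green = np.asarray(Image.open("garden_green_gray.png").convert("L"))  # shown as blue

nrg = np.dstack([ir, red, green])                # (height, width, 3) array
Image.fromarray(nrg, mode="RGB").save("garden_nrg.png")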

One of the standard vegetation indices is based on a comparison of the amount of infrared to red light being reflected: IR - Red. This image was created by comparing the red (representing IR measurements) and green (representing red measurements) values at each pixel (a pixel is a picture element, the smallest piece of the image with uniform color) in the NRG image. Pure red in the image would be produced where there was maximum intensity of IR light measured and no red light. Where IR and red light were of equal intensities, regardless of their magnitudes, the pixel would be displayed as black, and pure green would be produced where all of the red light had been reflected and all of the infrared absorbed. This image was created using the SurfaceTypeRGB software by selecting the NRG image created earlier with SatelliteImageMaker and displaying the Red versus Green of the image (remember that the red of this image represents the IR and the green represents the red light measured in the original picture).
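As a rough stand-in for the SurfaceTypeRGB "Red versus Green" display, the difference image described above can be sketched as follows, again assuming Python with Pillow and NumPy and using hypothetical file names: positive IR - Red differences are drawn in red, negative differences in green, and equal values in black.

# Sketch of the simple difference index (IR - Red) display described above:
# red where IR exceeds red light, green where red light exceeds IR, black where equal.
# A rough stand-in for SurfaceTypeRGB's display; file names are hypothetical.
import numpy as np
from PIL import Image

nrg = np.asarray(Image.open("garden_nrg.png").convert("RGB")).astype(int)
ir, red = nrg[:, :, 0], nrg[:, :, 1]        # red layer holds IR, green layer holds red light

diff = ir - red                             # ranges from -255 to +255
out = np.zeros(nrg.shape, dtype=np.uint8)
out[:, :, 0] = np.clip(diff, 0, 255)        # positive differences shown as red
out[:, :, 1] = np.clip(-diff, 0, 255)       # negative differences shown as green
Image.fromarray(out, mode="RGB").save("garden_ir_minus_red.png")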

To compensate for shadowing and varying sun-surface-camera angles, the vegetation index above (IR - Red) is divided (normalized) by the sum of the infrared and red light measurements: NDVI = (IR - Red) / (IR + Red). After normalization, a difference of 5 units counts for much more in a dimly lit, shaded area than the same 5-unit difference on a brightly lit object. This Normalized Difference Vegetation Index (NDVI) has been used for decades to identify surface cover and indicate the health of plants from satellite data. Dense, healthy vegetation produces NDVI values near +1.0, shown as pure red; as NDVI values decrease from one, the intensity of red in the image decreases proportionately. Bare soil and rock reflect similar levels of infrared and red light, so these surfaces produce NDVI values near 0 and are shown as black. Clouds, water, and snow reflect more visible light than infrared, which is the opposite of vegetation, and so produce negative NDVI values, shown in increasing intensities of green. Comparing the two vegetation index images, notice that the effects of shadowing appear to be reduced in the NDVI image. The NDVI image can be created with the ImageAnalysis, MVHimage, or SurfaceTypeRGB programs by changing the display to Red vs Green (normalized).
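The NDVI calculation and its red/green display can be sketched the same way, again assuming Python with Pillow and NumPy and hypothetical file names; a small constant in the denominator guards against division by zero where both bands are dark.

# Sketch of the NDVI image described above: NDVI = (IR - Red) / (IR + Red) per pixel,
# with positive values shown in red and negative values in green.
# File names are hypothetical; the tiny epsilon avoids division by zero.
import numpy as np
from PIL import Image

nrg = np.asarray(Image.open("garden_nrg.png").convert("RGB")).astype(float)
ir, red = nrg[:, :, 0], nrg[:, :, 1]        # red layer holds IR, green layer holds red light

ndvi = (ir - red) / (ir + red + 1e-6)       # values range from about -1.0 to +1.0

out = np.zeros(nrg.shape, dtype=np.uint8)
out[:, :, 0] = (np.clip(ndvi, 0.0, 1.0) * 255).astype(np.uint8)    # red for positive NDVI
out[:, :, 1] = (np.clip(-ndvi, 0.0, 1.0) * 255).astype(np.uint8)   # green for negative NDVI
Image.fromarray(out, mode="RGB").save("garden_ndvi.png")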