The Airborne Optical Systems Testbed (AOSTB)


Dr. Marius Albota, Dr. Rajan Gurjar, Dr. Anthony Mangognia, Mr. Daniel Dumanis, Mr. Brendan Edwards
Massachusetts Institute of Technology Lincoln Laboratory, Intelligence, Surveillance and Reconnaissance Division
244 Wood St., Lexington, MA 02421 USA
ALBOTA@LL.MIT.EDU
Distribution A: Public Release

ABSTRACT

Over the last two decades MIT Lincoln Laboratory (MITLL) has pioneered the development of enabling technologies and systems for high-sensitivity, photon-counting, scanning three-dimensional imaging laser radar (3D ladar). Examples include the ALIRT mapping and MACHETE foliage-penetrating ladars. While these and other systems have been transitioned to operation, there remains a need for a testbed dedicated to novel phenomenology investigation and validation of new sensor architectures. To that end, MITLL has developed the Airborne Optical Systems Testbed (AOSTB), a re-configurable system with roll-on/roll-off capability that can accommodate multiple sensors on a low-operating-cost Twin Otter aircraft. AOSTB mission areas include wide-area down-looking high-resolution imaging, side-looking and up-looking laser ranging and tracking, and sensor fusion with EOIR cameras and hyperspectral payloads. We describe the AOSTB ladar system, present recently collected airborne down- and side-look 3D data, and discuss testbed configurations that can support various defense and non-DoD applications.

Key Words/Topics: Laser Applications, LIDAR, Imaging 3D LADAR, Remote Sensing, Photon Counting, Active Optics

1.0 INTRODUCTION

We report the development of an airborne 3D ladar utilizing a single-photon-sensitive avalanche photodiode (APD) array and a short-pulse laser. The detector technology is based on arrays of short-wave infrared (SWIR) APDs operating in Geiger mode; the laser technology is based on diode-pumped, passively Q-switched lasers. We describe the ladar system performance when operated from a Twin Otter aircraft and present high-resolution daytime and nighttime 3D imagery. We also discuss the implications of these developments and the future direction of imaging 3D ladars as stand-alone payloads and as part of a fused suite of active/passive EOIR sensors.

The 3D ladar concept is straightforward. Light from a short-pulse laser illuminates a scene of interest, and the reflected light is imaged onto a two-dimensional (2D) grid of detectors. Rather than measuring intensity, as in a conventional camera, these detectors measure the photon time of flight and therefore the object distance [1, 2]. With each pixel encoded with range information, photon-counting ladars can produce an angle-angle-range, or 3D, image on a single laser pulse. Because the receiver uses a Geiger-mode avalanche photodiode (GMAPD) array, which is sensitive to single photons, the requirement on transmitted laser power can be relaxed, reducing the overall system size, weight, and power (SWaP) while still allowing closure of the optical link budget. This is important for airborne applications, where payloads are constrained by numerous practical factors. When coupled with a fast scanning beam-pointing mechanism, photon-counting laser radar systems can provide area coverage rates in excess of 100 km²/hr. Various ground-based and airborne systems have been demonstrated at MITLL over the last two decades, such as JIGSAW, ALIRT (Airborne Ladar Imaging Research Testbed), MACHETE (Multi-look Airborne Collector for Human Encampment and Terrain Extraction), and the Airborne Optical Systems Testbed (AOSTB) described herein. These systems have logged thousands of operational hours over the years, demonstrating their utility in open-terrain 3D mapping and foliage-penetrating (FOPEN) imaging for a variety of commercial and military applications [4, 5].
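To make the angle-angle-range idea concrete, the minimal Python sketch below converts a per-pixel photon time of flight into a range estimate via R = c·t/2. It is illustrative only, not the AOSTB flight or ground software, and the timer bin width used is an assumed value.

```python
# Minimal sketch (not the AOSTB processing code): converting a photon
# time-of-flight measurement into a range estimate.  The 0.5 ns timer
# resolution below is an illustrative assumption, not a published value.

C = 299_792_458.0  # speed of light, m/s

def range_from_time_of_flight(t_round_trip_s: float) -> float:
    """Range to target from the round-trip photon time of flight."""
    return C * t_round_trip_s / 2.0

def range_from_timer_counts(counts: int, bin_width_s: float = 0.5e-9) -> float:
    """Range from a per-pixel timer value (assumed 0.5 ns bins)."""
    return range_from_time_of_flight(counts * bin_width_s)

if __name__ == "__main__":
    # A target 2300 m below the aircraft returns photons after ~15.3 us.
    t = 2 * 2300.0 / C
    print(f"round trip: {t * 1e6:.2f} us -> range {range_from_time_of_flight(t):.1f} m")
```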

1.1 SYSTEM OVERVIEW

The Airborne Optical Systems Testbed (AOSTB) leverages technologies and systems developed for the ALIRT open-terrain mapping system and the MACHETE FOPEN 3D ladar. The AOSTB engineering team took advantage of important improvements in detector technology and implemented unique scanning modalities, resulting in a relatively low-cost airborne ladar system. The hardware components have a flexible roll-on/roll-off capability, and the testbed is suitable for operation as a single sensor or as part of a fused-sensor suite that can combine down- and side-looking ladar with various passive imaging modalities. A low-operating-cost Twin Otter (TO) aircraft was selected to host the payload for our airborne demonstration. This platform offers flight endurance in excess of 4 hrs at ground speeds of 100 knots and all the necessary space and power for various research and development activities. Figure 1 shows a picture of MITLL's AOSTB on the tarmac at Hanscom Air Force Base (HAFB), Bedford, MA.

1.1.1 Laser and Detector Modules

The laser is a 1 W, Q-switched, Nd:YAG laser originally built at MITLL for operation on the ALIRT platform [3, 4]. The laser is constructed by diffusion bonding a short piece of Nd:YAG (the laser gain medium) to a similar piece of Cr4+:YAG (the saturable absorber). The pump-side face is coated to transmit the pump light (808 nm) and reflect the laser light (1064 nm). The output pulses have a transform-limited frequency spectrum, a fundamental transverse mode with diffraction-limited divergence, and linear polarization. The laser has a pulse width of approximately 500 ps and operates in the SWIR spectral region at a wavelength of 1064 nm.

The electro-optical receiver is a state-of-the-art 256x64-pixel, 50-micron-pitch GMAPD array optimized for operation at the 1064 nm laser transmitter wavelength. The array was built on a framed ROIC, designed and fabricated in the Microelectronics group at MITLL. Lincoln-built 256x64 arrays of photon-counting APD detectors are integrated with on-chip CMOS digital timing circuits. The APDs are biased above the breakdown voltage in Geiger mode, in which an electron-hole pair generated by the absorption of a single photon initiates an avalanche process, causing the APD to break down. These Geiger-mode devices are advantageous because, in response to a single photoelectron, they yield a fast, high-amplitude, high-precision digital pulse. At 3 V of overbias and 0 °C, the detector exhibits 25% detection efficiency and a 10 kHz dark count rate with negligible crosstalk and afterpulsing. Dark count rates are kept low by cooling the APD array to -10 °C. Background noise is minimized through a combination of narrowband spectral filtering via a bandpass filter and temporal filtering via low-duty-cycle, short-duration range-gate operation. Further details on the fabrication of large-format GMAPD array receivers and passively Q-switched microchip lasers can be found elsewhere [2, 3].

As soon as the beam is pulsed, a flash pick-off fiber triggers the timers behind each GMAPD pixel to start counting. The circularly symmetric, near-transform-limited beam exiting the laser is shaped by a pair of COTS cylindrical lenses to match the GMAPD receiver's 4-to-1 aspect ratio. Each individual detector pixel has an IFOV of ~31 µrad, resulting in a ground sample distance (GSD) of ~7 cm from a 7.5 kft flight altitude above ground level (AGL).
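As a quick back-of-the-envelope check of the quoted geometry (a sketch, not part of the sensor software), the ground sample distance follows from the small-angle relation GSD ≈ IFOV × altitude:

```python
# Back-of-the-envelope check of the quoted ~7 cm GSD from a ~31 urad IFOV
# at 7.5 kft AGL (small-angle approximation; nadir viewing assumed).

FT_TO_M = 0.3048

ifov_rad = 31e-6              # per-pixel instantaneous field of view
altitude_m = 7500 * FT_TO_M   # 7.5 kft above ground level ~ 2286 m

gsd_m = ifov_rad * altitude_m
print(f"GSD ~ {gsd_m * 100:.1f} cm")   # ~7.1 cm, consistent with the text
```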
The transmit-path optical elements adjust the far-field divergence to 8x2 mrad to match the receiver FOV for optimal utilization of the photons in the 3D imaging process. The laser beam exits the enclosure via a gold-coated broadband mirror and is directed downward by a periscope to the fast-scan mirror (FSM). The returning light is collected by a custom-built, centrally obscured telescope assembly and focused by a pair of adjustable achromatic lenses onto the detector array, after first passing through a 2-in.-diameter, 3-nm FWHM narrowband optical filter. This filter reduces the amount of background light incident on the GMAPD during daytime measurements. The system focus can be adjusted by varying a micrometer stage, which in turn changes the achromatic lens-pair spacing. This gives us the flexibility to image targets at close range (~90 m) for ground testing. The laser and detector subassemblies are situated inside light-tight boxes instrumented with temperature and humidity sensors. Dry air is directed into these boxes through a tube of desiccant to control the dew point inside the enclosures and reduce the chance of condensation.

The scanning system utilizes a high-performance mirror made by BAE Systems. The hardware was originally deployed on the ALIRT 3D mapping system. For operation on AOSTB, we rotated the scan mirror assembly by 90 deg. about the vertical axis in order to optimize its operation for FOPEN. In this configuration, we take advantage of the increased maximum acceleration about the mirror's short axis, along the direction of platform motion, allowing the sensor to achieve a larger number of views, or looks, over a given target. Each look at the target is defined as a single sinusoidal scan pattern that travels once across the imaged area on the ground, which typically ranges from 250x250 m² to 1x1 km². We can program the mirror to scan any area, straight below or to either side of the aircraft, at the desired speed and with various numbers of looks, as dictated by the collection CONOPS. With an increased number of looks over a given area, we achieve greater angular diversity of photons reflected back toward our detector. This translates into better imaging capability at the human activity layer (HAL) in FOPEN imaging scenarios.

The scan mirror command generator (SMCG) records the GPS position from an Applanix GPS/IMU and the precise mirror-angle encoder values corresponding to where the scan mirror is pointing during a detection. To match timestamps between the scan system and the detector system, we use the GPS pulse-per-second (PPS) signal as a synchronization pulse and rely on internal oscillator clocks on the SMCG and the APD tower FPGA for further precision.

The angle-angle-range (3D) data is then processed through a variety of geometric transformations and coincidence-processing algorithms to create high-resolution 3D point clouds [5]. Each pixel contains range information, and the 3D image can be displayed in a variety of ways. One visualization method is to assign a different color to each range and apply the appropriate color to each pixel displayed in a two-dimensional array. Another method is to render a 3D model from the data and display the model as if it were photographed from a user-defined aerial perspective. Yet another method is to create a point cloud and project these points onto a plane representing the viewing screen. In a point cloud, each pixel is assigned a point with x and y coordinates corresponding to the pixel position in the array and a z coordinate corresponding to its range (a minimal sketch of this construction is shown below). Due to the kinetic depth effect, the human brain better perceives the 3D nature of the point-cloud image when the projected image plane is rotated, as if the observer were moving around the recorded scene or, equivalently, as if the scene were rotating in front of the observer. The 3D movies generated after processing readily reveal man-made shapes and targets, providing useful information and actionable intelligence.
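The fragment below is an illustrative Python sketch of that point-cloud construction for a single 256x64 frame. The array format and IFOV come from the text; the nadir-viewing, small-angle scaling and the NaN convention for non-detecting pixels are simplifying assumptions, and the real pipeline applies full geometric transformations and coincidence processing [5].

```python
import numpy as np

# Illustrative construction of a point cloud from a single angle-angle-range
# frame (256x64 pixels).  This is not the AOSTB geometric/coincidence
# processing of [5]; it only shows the x, y, z bookkeeping described above.

N_ROWS, N_COLS = 64, 256     # GMAPD array format
IFOV_RAD = 31e-6             # per-pixel IFOV quoted in the text

def frame_to_points(range_m: np.ndarray, altitude_m: float) -> np.ndarray:
    """Convert a (N_ROWS, N_COLS) array of ranges (NaN = no detection)
    into an (N, 3) array of x, y, z points in meters."""
    assert range_m.shape == (N_ROWS, N_COLS)
    rows, cols = np.nonzero(~np.isnan(range_m))
    r = range_m[rows, cols]
    scale = IFOV_RAD * altitude_m          # approximate ground sample distance
    x = (cols - N_COLS / 2) * scale        # cross-track position
    y = (rows - N_ROWS / 2) * scale        # along-track position
    z = altitude_m - r                     # height above ground (nadir view)
    return np.column_stack([x, y, z])

# Example: a flat scene 2286 m below the aircraft with one raised object.
ranges = np.full((N_ROWS, N_COLS), 2286.0)
ranges[30:34, 100:110] = 2281.0            # something ~5 m tall
points = frame_to_points(ranges, altitude_m=2286.0)
print(points.shape, points[:, 2].max())
```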

1.1.2 Laser Radar Link Budget

In general, the laser power requirements for a ladar imager depend upon several parameters, including the target range, target reflectivity, two-way propagation path loss, receive aperture area, and detection efficiency of the receiver. For a 1-W laser operating from an altitude of 2300 m, with a 10-cm-diameter receiver aperture, 80% optics transmission, a receiver telescope with 45% central obscuration, an 85%-transmission narrowband filter, and a 25%-efficient 256x64-pixel GMAPD array, and assuming 80% two-way atmospheric transmission and 30% target reflectivity, we expect to detect on the order of 0.15 photons per pixel per pulse. This signal level is sufficient for 3D imaging of targets in the open and under foliage during both daytime and nighttime conditions. Moreover, the laser is invisible, covert, and eye-safe for all observers.
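A rough numerical version of this link budget is sketched below. The pulse repetition frequency (and hence the per-pulse energy) is not given in the text, and the Lambertian-target and aperture-obscuration treatments are simplifying assumptions, so the result should be read as order-of-magnitude only; with the illustrative values chosen here it lands near the quoted ~0.15 photons per pixel per pulse.

```python
import math

# Order-of-magnitude ladar link budget using the parameters quoted in the
# text.  The pulse repetition frequency, the Lambertian-target model, and the
# treatment of the central obscuration as a 45% area loss are assumptions
# made for illustration only.

H = 6.626e-34                 # Planck constant, J*s
C = 2.998e8                   # speed of light, m/s
WAVELENGTH = 1064e-9          # m

avg_power_w = 1.0
prf_hz = 20e3                 # ASSUMED repetition rate (not given in the paper)
pulse_energy_j = avg_power_w / prf_hz
photons_tx = pulse_energy_j / (H * C / WAVELENGTH)

range_m = 2300.0
aperture_d_m = 0.10
optics_transmission = 0.80
obscuration_loss = 0.55       # assumed: 45% of the aperture area is obscured
filter_transmission = 0.85
detection_efficiency = 0.25
atmosphere_two_way = 0.80
target_reflectivity = 0.30    # assumed Lambertian: radiance factor rho/pi
n_pixels = 256 * 64

aperture_area = math.pi * (aperture_d_m / 2) ** 2 * obscuration_loss
receiver_solid_angle = aperture_area / range_m ** 2

photons_per_pixel = (photons_tx
                     * optics_transmission
                     * atmosphere_two_way
                     * (target_reflectivity / math.pi)
                     * receiver_solid_angle
                     * filter_transmission
                     * detection_efficiency
                     / n_pixels)

print(f"~{photons_per_pixel:.2f} detected photons per pixel per pulse")
# -> roughly 0.17 with these assumptions, of the same order as the ~0.15
#    quoted in the text.
```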
1.1.3 Results and System Performance

The spatial resolution of the AOSTB ladar system was measured using a custom-built modulation transfer function (MTF) bar, or panel, target. The MTF target has varying panel widths and spacings, as shown in the diagram on the left of figure 2. The ladar data shown on the right of figure 2 indicate proper optical focus, as all of the MTF panels are spatially resolved.

Examples of two images, from a flight over Maynard, MA (target-scan mode) and over the Boston shore (line-of-communication scan), are shown in figures 3 and 4, respectively. The target scan covers an area of over 500x500 m² and took on the order of 10 s to collect. The image of the littoral area south of Boston, including downtown, covers an area ~500 m wide by 10 km long and took only a few minutes and a single pass over the designated area to collect. The coloring is set according to height, while the brightness is set according to reflectivity. Fusing range with intensity data offers additional possibilities for a variety of EOIR applications. The viewing perspective can be rotated because multiple angled looks are incorporated into the 3D view.

To demonstrate our typical FOPEN capabilities, figure 5 shows two images taken over a forested area near Burlington, VT. Figure 5(a) is a 3D point-cloud view with the foliage intact, while figure 5(b) has the foliage removed by applying a range cut in software (see the sketch below). This is possible because our system aggregates multiple looks through the foliage, which allows us to build up the signal from obscured objects as required for detection and identification. Multi-look image registration along with coincidence-processing techniques reveals pathways, roofs, vehicles, and other objects of interest hidden under the foliage. These capabilities are of interest for intelligence, surveillance, and reconnaissance missions.
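As a toy illustration of such a range (height) cut, the fragment below drops all points above an assumed height threshold from a point cloud like the one built in the earlier sketch. The actual AOSTB foliage removal is more sophisticated, relying on multi-look registration and coincidence processing [5].

```python
import numpy as np

# Toy "range cut": keep only points at or below a chosen height above ground,
# discarding the canopy returns.  The 3 m threshold is illustrative only.

def remove_foliage(points_xyz: np.ndarray, max_height_m: float = 3.0) -> np.ndarray:
    """points_xyz: (N, 3) array of x, y, z with z = height above ground."""
    return points_xyz[points_xyz[:, 2] <= max_height_m]

# Usage with the frame_to_points() sketch shown earlier:
# ground_and_targets = remove_foliage(points, max_height_m=3.0)
```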

1.1.4 Conclusion and Future Work

Over the last two decades, MITLL has pioneered the development of enabling technologies and systems for high-sensitivity scanning 3D imaging laser radars. Here, we reported the development of an airborne optical testbed that is re-configurable, allows for roll-on/roll-off capability, and can accommodate active laser radar and passive imaging sensors on a low-cost Twin Otter aircraft. The system is capable of area coverage rates of more than 100 km² per hour, with spatial and depth (range) resolution on the order of a few centimeters. It uses a covert, eye-safe, invisible laser that can operate day and night. Current and future AOSTB mission areas include wide-area down-looking high-resolution 3D imaging, side-looking and up-looking laser ranging and tracking, and sensor fusion with conventional EOIR cameras and hyperspectral payloads. In this paper we described the AOSTB ladar system, presented a sample of the recently collected high-resolution 3D data, and discussed testbed configurations that can support various defense and non-DoD applications.

Fig. 1. AOSTB DHC-6 Twin Otter aircraft on the tarmac at Hanscom Air Force Base.

Fig. 2. Model of our MTF optical resolution table (left). Airborne ladar image of the same MTF table, showing good spatial resolution of ~7 cm (right).

Fig. 3. Target scan of a 500x500 m² area of Maynard, MA. The number of looks employed was ~10, and the scan took about 10 s to collect. Each pixel is color-coded with range. Intensity information can be derived from and fused with the angle-angle-range data.

Fig. 4. Large-area 3D ladar image: South Boston (lower left) and downtown Boston (upper right). The area scanned is 500 m wide by 10 km long. The dark areas at the bottom are the Atlantic Ocean and coastal waterways, which reflect back very little light at our SWIR operating wavelength of 1064 nm.

Fig. 5. Ladar target scan of a forested area in northern Vermont. On the left (a), we show a 3D point-cloud image of the foliated area. On the right (b) is the same scene with the foliage removed in processing, revealing a tent, a bench, a car, and a truck that were hidden under the trees.

Acknowledgments. The authors wish to thank P. Consalvo (tech support), T. Shih (engineering), K. Ingersoll and S. Gorsky (software), A. Vasile and L. Skelly (data processing), B. Willard (optics), the MITLL Flight Facility (pilots and aircraft support), and K. Schultz, J. Kyung, and J. Khan (management).

References

1. M. A. Albota, R. M. Heinrichs, D. G. Kocher, D. G. Fouche, B. E. Player, M. E. O'Brien, B. F. Aull, J. J. Zayhowski, J. Mooney, B. C. Willard, and R. R. Carlson, Three-dimensional imaging laser radar with a photon-counting avalanche photodiode array and microchip laser, Appl. Opt. 41(36), pp. 7671-7678 (2002).
2. B. F. Aull, A. H. Loomis, D. J. Young, R. M. Heinrichs, B. J. Felton, P. J. Daniels, and D. J. Landers, Lincoln Laboratory Journal 13, 335 (2002).
3. J. J. Zayhowski and C. Dill III, Diode-pumped passively Q-switched picosecond microchip lasers, Opt. Lett. 19, pp. 1427-1429 (1994).
4. R. Knowlton, Airborne Ladar Imaging Research Testbed, MIT Lincoln Laboratory, Lexington, MA, Tech. Notes (2011). [Online]. Available: www.ll.mit.edu/publications/technotes/technote_alirt.pdf.
5. A. N. Vasile, L. J. Skelly, M. E. O'Brien, D. G. Fouche, R. M. Marino, R. Knowlton, M. J. Khan, and R. M. Heinrichs, Advanced Coincidence Processing of 3D Laser Radar Data, in Advances in Visual Computing: 8th International Symposium (ISVC 2012), G. Bebis et al. (Eds.), Part I, LNCS 7431, pp. 382-393 (2012).
6. R. M. Marino and W. R. Davis, Lincoln Laboratory Journal 15, 23 (2005).
