Unmanned Aircraft Systems for Remote Building Inspection and Monitoring


6th European Workshop on Structural Health Monitoring - Th.2.B.1 (http://www.ndt.net/?id=14139)

Unmanned Aircraft Systems for Remote Building Inspection and Monitoring

C. ESCHMANN, C.-M. KUO, C.-H. KUO and C. BOLLER

ABSTRACT

The paper reports on an investigation made at Fraunhofer IZFP in which a rotary-wing octocopter micro air vehicle (MAV) system has been used to scan buildings with a high resolution digital camera for inspection and monitoring purposes. The MAV has been equipped with a microcontroller-based flight control system and different sensors for navigation and flight stabilization. Pictures have been taken at high speed and frequency and stored onboard before being downloaded once the MAV completed a mission. The pictures have then been stitched together to obtain a full 2D image at a resolution that still allows damage and cracking in the millimeter range to be observed. In a follow-on step, image processing software has been developed that allows cracking patterns to be specifically filtered out; these may be further analyzed from a statistical pattern recognition point of view in a future step.

INTRODUCTION

A growing number of civil infrastructure buildings has become an issue with regard to the ageing process and hence life cycle management. The conventional means of monitoring the condition of those buildings is man-driven visual inspection only, possibly supported by some tap testing. This way of monitoring mainly provides integral information about the cracking condition and possibly the detachment of the covering layers of concrete or stone based structures. The effort required to provide this information can become laborious for structures such as a dam, a cooling tower, a church or even a simple multi-storey building, since significant lifting equipment is required for inspection.

A means to circumvent this effort is to use unmanned aerial vehicles (UAV), even at small scales as micro aerial vehicles (MAV), as an airborne sensor system to capture the required data. The potential applications for such unmanned aircraft in non-destructive testing (NDT)

Christian Eschmann, Fraunhofer IZFP, Campus E3.1, 66123 Saarbrücken/Germany, christian.eschmann@izfp.fraunhofer.de
Licence: http://creativecommons.org/licenses/by-nd/3.0

focus on the tasks of state detection, damage analysis and condition monitoring.

The field of unmanned aircraft is categorized according to different classifications, defined by size restrictions, weight limits or the respective area of operation (operating radius, flight duration). These common categories cover fixed-wing aircraft and helicopters as well as any other type of aerial vehicle. For insurance-related reasons, the use of unmanned aerial vehicles outside of registered (model) airfields is limited to a maximum take-off weight of 5 kilograms. Accordingly, the UAVs used for building inspection shall not exceed the classification category micro [1] (µUAV, MAV), which also has a weight limit of 5 kilograms. Given both the requirements and goals of selective damage detection and the high building density especially in urban areas, the choice falls on VTOL (Vertical Take-Off and Landing) capable platforms. Due to this high building density and the associated traffic, the risks of personal injury and infrastructural damage have to be kept as low as possible. Consequently, the redundancy of the sub-systems as well as of the complete system, and thus the reliability of flight-critical functions, is of particular significance. Another major aspect is the need for stable hovering characteristics in order to ensure a planned and detailed damage inspection.

SYSTEM COMPONENTS

Based on the conditions mentioned above, Fraunhofer IZFP has been using a Mikrokopter MAV platform [2] since early 2010. Having emerged over the last few years, the multi-rotor concept offers, in comparison to a conventional helicopter, a very simple mechanism and a highly effective design in terms of different payload concepts (to accommodate additional sensors, etc.). Figure 1 shows the MAV platform used at Fraunhofer IZFP. The chosen concept is an octocopter, i.e. a configuration with eight rotors.

The advantage of this arrangement is that flight control is preserved even if one or several electric motors fail, which means a considerable level of safety.

Figure 1. Octocopter MAV inspection platform.

Table 1. Octocopter technical data.
Diameter: 1.02 m
Take-Off Weight: 2.5 kg
Endurance: < 20 min
Max. Payload: 2 kg

The Fraunhofer IZFP octocopter has a diameter of about 1 meter and a mass of approximately 2.5 kilograms, so that it does not exceed the legal weight limit even with maximum payload (Table 1). The octocopter is equipped with various sensors such as gyroscopes, accelerometers and a barometric altitude sensor, which are used by a microprocessor-controlled flight control system for attitude stabilization. Navigation is mostly done by GPS, with additional support from the sensors of the flight control system and a 3D magnetic sensor. For safety reasons, the

MAV platform has to be flown within visual line of sight under the pilot's permanent supervision, both in manual mode controlled by the pilot and in semi-autonomous mode with GPS-guided waypoint navigation, so that the pilot can intervene at any time. To control and monitor the flight, all relevant telemetry data is sent directly to the pilot as well as to the ground station, on which the entire flight path planning can be followed. In addition, a real-time video transmission from the MAV is available, either displayed on a laptop or projected into the pilot's video eyeglasses, in order to get a better picture of the object to be inspected.

Applicable Payload

As part of the state detection, building inspection can be performed with numerous NDT methods, e.g. visual inspection, thermography, radar or laser. These techniques can be divided into three main categories: the optical and the infrared methods offer, assuming monoscopic operation, only a two-dimensional picture, while radar and laser applications provide additional depth information useful for a more detailed damage diagnosis. The third method is the broadly used ultrasonic measurement, which however requires coupling the flight platform to the building, making this technique rather unsuitable.

Although all these techniques can be considered established in NDT in general and are also applied within the civil engineering sector, their implementation on a flying platform is still a significant challenge. The challenge is especially high for radar sensors, while optical techniques, including thermographic ones, have become achievable due to improvements in off-the-shelf technology in terms of cost, size and performance. The availability of digital information in today's camera systems has made building inspection possible even at very high resolution.

The latest camera used is a Canon PowerShot SX220 HS with a resolution of 12 megapixels and a 14x optical zoom (focal length: 5 to 70 millimeters). To provide the highest picture quality, the digital zoom function has been permanently disabled.

BUILDING INSPECTION

The method of visual building inspection using an MAV is generally divided into two process steps: data acquisition (by aerial survey, in-flight) and digital post-processing (post-flight). The stages of this process are schematically shown in Figure 2.

Data Acquisition

The main focus of using UAVs is clearly on the data acquisition of the infrastructure to be inspected. To fly around an object, a preliminary flight track planning is needed, which is usually done with common software based on GPS waypoint navigation. However, for inspecting a building, GPS navigation becomes insufficient due to the precision required relative to the façade to be monitored and the threat of shadowing effects from nearby buildings. Moreover, GPS does not allow accurate flight altitude control, which is an essential factor under flight

planning aspects. A combination of collision and navigation sensors therefore has to be developed to make an autonomous flight program (under pilot control) feasible in the long term. Hence, manual flight control is currently still the only option when flying close to a building.

Figure 2. 2-step process for façade modeling.

In order to have the images easily allocated to the real object in a structured way, two flight pattern options are available when using a UAV for on-site building inspection (Figure 3). On the one hand, the flight path can be allocated horizontally as a storey-wise scanning of the building; on the other hand, it can follow vertically aligned slices.

Figure 3. Options for on-site flight pattern.

With regard to the usability of the aerial photos, flight pattern option 2 was eliminated for UAV inspection, as the predominantly vertical movement increased lens-induced effects that are detrimental to stitching. Additionally, the horizontal speed has to be quite limited while recording images, so that fast bank angle changes affecting the images are reduced, as these cannot be leveled out by the automatic stabilization of the camera pod. For data recording, the integrated digital camera is controlled by an automatic photo-firing sequence, which can be set to a frequency of up to 3 pictures per second.
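The horizontally allocated, storey-wise scanning of flight pattern option 1 can be sketched as simple waypoint generation. This is only an illustration of the pattern geometry; all parameters (façade width, storey height, waypoint spacing) are made-up example values, not figures from the paper:

```python
def storey_scan_waypoints(facade_width, storey_height, n_storeys, step):
    """Generate a horizontal, storey-wise scan path over a facade.

    Waypoints are (x, z) positions in the facade plane: the MAV flies
    along one storey, climbs one storey height, then flies back the
    other way (a boustrophedon pattern), so consecutive images overlap
    along each horizontal pass.
    """
    waypoints = []
    for storey in range(n_storeys):
        z = storey * storey_height
        xs = [i * step for i in range(int(facade_width / step) + 1)]
        if storey % 2 == 1:          # reverse direction on odd storeys
            xs = xs[::-1]
        waypoints.extend((x, z) for x in xs)
    return waypoints

# example: 20 m wide facade, two storeys, a waypoint every 5 m
path = storey_scan_waypoints(facade_width=20.0, storey_height=3.0,
                             n_storeys=2, step=5.0)
```

The same generator with vertical and horizontal axes swapped would produce the eliminated option 2 (vertically aligned slices).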

Optionally, the camera can be controlled manually to set zoom, focus and shutter release if necessary. For optimal in-flight detection of damages, both in automatic recording and in manual focusing mode, the real-time video link (low definition) can be used by the pilot or another person for camera orientation. It is not necessary for this application to also transmit the high quality images in real time, as the digital building reconstruction is still quite time-consuming. Therefore, the data stored on the camera is read out after landing.

Due to the automatic triggering of the camera, each flight generates a large amount of data; a 15-minute flight normally yields more than 1200 photos. This amount is far more than what is needed for a subsequent inspection, but the not completely stable hover produces a relatively high incidence of unusable image data, a consequence of vibrations from the platform that are not fully filtered out, or of external influences such as wind gusts. Additionally, there is often a very high overlap of the area captured in each image, which varies depending on the hover speed parallel to the building façade. Accordingly, records with too high an overlap are eliminated to avoid double or multiple information within the images and to keep the image data base as small as possible without loss of quality.

Image Processing

After completion of the aerial survey, the second step is the digital post-processing of the images selected in step 1. There is nowadays a variety of experimental and commercial software solutions available to reassemble the individual images, programmed for applications such as airborne terrestrial mapping [3] or panoramic photography [4]. These stitching or mosaicking methods are based on pattern recognition techniques which analyze similar image content structures, called matching points, in two or more images and link the images together based on these points.
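The matching-point idea behind these stitching tools can be illustrated with a minimal sketch: a small patch taken from one image is located in a second, shifted image by normalized cross-correlation, recovering the offset between the two views. The images here are synthetic, and the brute-force search is for illustration only; real stitchers detect many such points with much faster feature detectors and then estimate a transform from them:

```python
import numpy as np

def find_patch(image, patch):
    """Locate `patch` in `image` by normalized cross-correlation.

    Returns the (row, col) of the best-matching top-left corner. This
    is the elementary "matching point" step; a stitcher repeats it for
    many patches and links the images based on the found positions.
    """
    ph, pw = patch.shape
    p = (patch - patch.mean()) / (patch.std() + 1e-12)
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - ph + 1):
        for c in range(image.shape[1] - pw + 1):
            w = image[r:r + ph, c:c + pw]
            w = (w - w.mean()) / (w.std() + 1e-12)
            score = (p * w).mean()      # correlation of normalized patches
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

# two synthetic views of the same texture, offset by (3, 5) pixels
rng = np.random.default_rng(0)
scene = rng.random((40, 40))
img_a = scene[0:30, 0:30]
img_b = scene[3:33, 5:35]          # second view shifted down/right
patch = img_a[10:18, 10:18]        # feature patch from view A
row, col = find_patch(img_b, patch)
offset = (10 - row, 10 - col)      # recovered shift between the views
```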
Panorama creation software [5] analyzes the input data under the assumption that the images were recorded only by pivoting the camera without changing its position. Since an aerial survey with UAVs generates images taken each from a different position, these algorithms are not suitable for this application. In contrast, software for the mapping of landscapes or similar mainly 2D objects can handle images from different locations. However, these algorithms rely on a precise geo-referencing procedure, which is possible due to noticeable GPS location changes together with high-accuracy inertial measurement units (IMU) [6].

In summary, several stitching and mosaicking programs have been analyzed to test their practicability for the reconstruction of building façades. Problems have been found regarding the transitions between images, the unknown geometry of the object to be rebuilt by the software, and the rearrangement of the available image series. The stitching of up to 15 images was successful, but generating a full façade reconstruction consisting of several hundred images was only possible by stitching the pre-stitched parts, resulting in a multi-level stitching process. Since particularly the edge areas of these software-created façade parts are currently not suitable for fully automated stitching, the stitching work so far has to be done mainly by hand, using programs such as Corel Photo-Paint, with which every single image is distorted and resized until it fits and is integrated into the collage.
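The multi-level stitching process can be outlined as simple batching logic: stitch small groups of neighboring images first, then stitch the pre-stitched parts the same way. Here `stitch_group` is a hypothetical placeholder for whatever stitching backend is used, not an API from the paper; the sketch shows only the hierarchy, with string concatenation as a toy stand-in:

```python
def multi_level_stitch(images, stitch_group, batch=15):
    """Hierarchically stitch a long image series.

    `stitch_group` is a placeholder callable that merges a list of
    overlapping images into one. Batches of up to `batch` images are
    stitched first (around the size that worked reliably), and the
    resulting parts are merged in further levels until one mosaic
    remains.
    """
    while len(images) > 1:
        images = [stitch_group(images[i:i + batch])
                  for i in range(0, len(images), batch)]
    return images[0]

# toy stand-in: "stitching" strings just concatenates them,
# so the result order can be checked easily
mosaic = multi_level_stitch([f"img{i}" for i in range(45)],
                            stitch_group="".join, batch=15)
```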

Image-based Digital Inspection and Monitoring

For the building shown in Figure 4, more than 12,000 images were taken in total during four days of flight, while only several hundred images were finally used for the 2-dimensional model shown. The digital façade reconstruction has an overall resolution of about 1.27 gigapixels. However, a picture of this size is very hard to handle. To make the inspection more user-friendly, the model is separated into floors, which in turn are separated into parts of 10 window frames each.

Figure 4. Digital façade reconstruction.

The inspection can then be done directly on these window sections. In areas of special interest, high resolution detail photos can be linked to the sections for even better monitoring. In terms of damage inspection, tests have been done on high resolution areas, allowing crack sizes down into the millimeter range to be characterized (Figure 5). The aim is finally to have this feature integrated into the full building reconstruction wherever significant damage appears. To this end, some filtering algorithms have been programmed in order to detect cracks automatically.

Figure 5. High resolution crack inspection in the submillimeter range.
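Separating the reconstruction into floors and window sections, as described above, amounts to simple tiling of the mosaic. A sketch with numpy follows; the array dimensions and section counts are illustrative stand-ins, not the real gigapixel model:

```python
import numpy as np

def split_into_sections(facade, n_floors, n_sections):
    """Cut a facade mosaic into floors, then each floor into sections.

    Returns a dict keyed by (floor, section), so inspection can address
    e.g. "floor 2, window group 3" directly instead of loading the
    whole (in practice gigapixel-sized) image.
    """
    floors = np.array_split(facade, n_floors, axis=0)
    return {(f, s): part
            for f, floor in enumerate(floors)
            for s, part in enumerate(np.array_split(floor, n_sections, axis=1))}

facade = np.zeros((600, 4000))      # toy stand-in for the mosaic
tiles = split_into_sections(facade, n_floors=3, n_sections=4)
```

High resolution detail photos can then be associated with the same (floor, section) keys wherever areas of special interest appear.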

AUTOMATED CRACK DETECTION

Since the walls of inspected buildings are mostly of light color, cracks appear as black lines in visual inspection. For highlighting and extracting them, two different methods can be used:

Adding additional color value: This method analyzes a threshold value to decide whether each pixel should be pushed towards black or towards white, so that crack areas end up black. However, this method only works on walls of grey or white color.

Edge detection: This method is based on applying a Gaussian blur (Equation 1) [7] to the original image and then subtracting the blurred copy from the original. After this step, edges appear almost black, while all other areas appear almost pure white.

G(x, y) = 1 / (2πσ²) · exp(−(x² + y²) / (2σ²))   (Equation 1. Gaussian blur)

Due to unconvincing test results with the color-adding method, the edge detection method was further investigated for crack inspection. A typical damage pattern found in buildings can be seen on the left of Figure 6. The original photo has been processed with the edge detection method to enhance the damaged areas. The four results on the right show the effect of varying the value of σ (sigma) used in the Gaussian blur (sigma increases to the right). Too small a value of sigma can result in insufficient damage enhancement, but it can also be seen that a higher sigma does not necessarily mean better crack extraction.

Figure 6. Influence of sigma value: original image (left) and results with increasing sigma values.

To analyze the cracking shown in Figure 5 in more detail, the image has been re-inspected with the edge detection software (Figure 7). After the extraction, the width of the long but thin surface cracks has been enhanced for clearer visualization. However, even with this good first approach, the edge detection method is still not perfect. It helps to extract larger damages, but tiny surface cracks in particular are still not very visible after image processing, whereas man-made edges resulting from the object's geometry may so far be mistaken for cracks in an automated filtering process.
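The blur-and-subtract edge detection described above can be sketched in plain numpy. The wall image below is synthetic, and truncating the Gaussian kernel at a radius of 3σ is an implementation assumption, not a detail from the paper:

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """Discrete 2D Gaussian of Equation 1, normalized to sum to 1."""
    radius = radius or int(3 * sigma)
    ax = np.arange(-radius, radius + 1)
    x, y = np.meshgrid(ax, ax)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return g / g.sum()

def edge_enhance(image, sigma):
    """Subtract the Gaussian-blurred image from the original.

    Flat wall areas end up near zero (near white after rescaling for
    display); cracks, being darker than their blurred surroundings,
    come out strongly negative (near black).
    """
    k = gaussian_kernel(sigma)
    r = k.shape[0] // 2
    padded = np.pad(image, r, mode="edge")
    h, w = image.shape
    blurred = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            blurred[i, j] = (padded[i:i + 2*r + 1, j:j + 2*r + 1] * k).sum()
    return image - blurred

# synthetic light wall (0.9) with a one-pixel dark "crack" column (0.1)
wall = np.full((21, 21), 0.9)
wall[:, 10] = 0.1
edges = edge_enhance(wall, sigma=1.0)
```

Rerunning `edge_enhance` with different `sigma` values reproduces the trade-off discussed around Figure 6: a larger sigma blurs a wider neighborhood and so changes how strongly thin dark lines stand out against the flat wall.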

Figure 7. Original image (left) and result after crack enhancement by edge detection.

CONCLUSION

The visual inspection of buildings using unmanned aircraft systems has shown that the use of an MAV, in this case an octocopter, represents an appropriate technique to create the first data base required for digital building monitoring. The high resolution camera attached to the MAV delivered good results even under non-optimal flight conditions and confirmed that visual recording methods already provide valuable information for infrastructure inspection purposes. Improvements are needed in data acquisition with respect to better stabilization of the flight platform, anti-collision and navigation systems, and route planning algorithms, in order to expand the automation of the process. The image post-processing also has to be improved by reducing the manual workflow through appropriate image stitching and mosaicking software, ideally with an integrated crack detection feature based on the results shown, which still remains to be developed.

REFERENCES

1. UVS International. 2011. UAS: The Global Perspective. 9th ed. Blyenburgh & Co, Paris.
2. Kurz, J., 2012. Das virtuelle Bauwerk - Kombinierte skalenübergreifende Visualisierung von ZfPBau-Ergebnissen. In: Deutsche Gesellschaft für Zerstörungsfreie Prüfung e.V. (DGZfP): Bauwerksdiagnose 2012. DGZfP, Berlin.
3. Remondino, F., Barazzetti, L., Nex, F., Scaioni, M., Sarazzi, D., 2011. UAV photogrammetry for mapping and 3D modeling - current status and future perspectives. Int. Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 38(1/C22). ISPRS Conference UAV-g, Zurich, Switzerland.
4. Ward, G., 2006. Hiding seams in high dynamic range panoramas. Proceedings of the 3rd Symposium on Applied Perception in Graphics and Visualization, p. 153. ACM International Conference Proceeding Series. ACM.
5. Brown, M., Hartley, R. and Nistér, D., 2007. Minimal solutions for panoramic stitching. International Conference on Computer Vision and Pattern Recognition (CVPR 2007), Minneapolis.
6. Krüger, T., Wilkens, C.-S., Reinhold, M., Selsam, P., Böhm, B., Vörsmann, P., 2010. Ergebnisse des ANDROMEDA-Projektes - Automatische Luftbildgewinnung mit unbemannten Kleinflugzeugen. Deutscher Luft- und Raumfahrtkongress, Hamburg. Paper ID 161314.
7. Beard, B. L., Jones, K. M., Chacon, C., and Ahumada, A. J., Jr., 2005. Detection of blurred cracks: a step towards an empirical vision standard. Final Report for FAA Agreement DTFA-2045.