UAV BASED MONITORING SYSTEM AND OBJECT DETECTION TECHNIQUE DEVELOPMENT FOR A DISASTER AREA


Afzal Ahmed 1, Dr. Masahiko Nagai 2, Dr. Chen Tianen 2, Prof. Ryosuke Shibasaki 3
The University of Tokyo, Shibasaki Lab., Institute of Industrial Science (IIS), 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505, Japan
Tel: 81-3-5452-6417, Fax: 81-3-5452-6414, e-mail: (afzal, nagaim, chentian)@iis.u-tokyo.ac.jp, shiba@csis.u-tokyo.ac.jp
1: Doctoral Student, Dept. of Civil Engineering; 2: Assistant Professor, Centre for Spatial Information Science (CSIS); 3: Professor, Dept. of Civil Engineering

Commission VIII, WG VIII/2

KEY WORDS: Hazards, Developing Countries, Monitoring, Semi-automation, Extraction, Software

ABSTRACT:

In the case of natural hazards, the spatial extent of the disaster area is usually large, and it is often difficult to conduct search and rescue operations from the ground. Such operations are often hindered by the inaccessibility of the locality due to damaged infrastructure. Yet it is crucial that the victims are identified and rescued as soon as possible so that timely help can reach them. Usually, manned aircraft equipped with special sensors for covering a wide area are used for this purpose. Since it is difficult to operate a manned aircraft at low altitude, a binocular telescope is usually employed to detect small targets from high altitude; in that case, the range of vision for searching becomes narrow and the possibility of oversight increases. Since a UAV can fly at very low altitude, high-resolution images can be acquired and then used to detect the victims. In this research, a UAV-based monitoring system and object detection technique are proposed in order to enhance search and rescue operations in a disaster area. The main focus is to develop a data processing system that provides a faster, logical and accurate means of handling and processing the large amount of data acquired by the UAV system. The data processing system, henceforth termed the Data Viewer, consists of several components that carry out different responsibilities, e.g. flight design, extraction of useful data, data quality checking, data integration, image viewing and object detection.

1. INTRODUCTION

Numerous applications require aerial monitoring. Civilian applications include resource exploration; monitoring forest fires, oil fields and pipelines; tracking wildlife; and search and rescue operations in disaster areas. Applications to homeland security include border patrol and monitoring the perimeter of nuclear power plants. Military applications are numerous. The current approach to these applications is to use manned vehicles for surveillance. However, manned vehicles are typically large and expensive, and hazardous environments and operator fatigue can threaten the life of the pilot. Therefore, there is a critical need for automating aerial monitoring using unmanned aerial vehicles (UAVs). UAVs provide a platform for intelligent monitoring in application domains ranging from security and military operations to scientific information gathering and disaster area monitoring. It is often crucial that the victims in a disaster area be identified and rescued as soon as possible. If the spatial extent of the disaster area is large, the search and rescue operation becomes more difficult with traditional methods.
The usual approach for this purpose is to use manned aircraft equipped with special sensors for covering a wide area and to assign the actual recognition task (surveillance) to the crew. However, manned aircraft are difficult to operate at low altitude. A binocular telescope is therefore usually employed in the aircraft to magnify and detect small targets from high altitude; in that case, the range of vision for searching becomes narrow and the possibility of oversight increases (SUMITOMO et al., 1997). Compared to manned aircraft, mini helicopters are highly manoeuvrable, owing to their capability to hover and to change flight direction around the centre of rotation. Any low-cost still or video camera can be fixed onboard a model helicopter (EISENBEISS, 2004). Furthermore, because of the small size of the system, it is possible to fly close to the area of interest and to capture high-resolution images using low-cost digital cameras. The decision to pursue UAVs to meet the growing demand for search and rescue is based on increased effectiveness and reduced cost. Although a UAV has no inherent rescue capability as a helicopter does, a properly equipped UAV can cover a very large area with various types of sensors. It can easily locate people in trees or on rooftops in floods, or infrared sources in wooded areas (lost or disaster-stranded campers, hikers, etc.). It can be deployed to disaster areas quickly, or flown there from a distant location if satellite-equipped. UAVs also have the advantage that they are more easily re-tasked, reconfigured and upgraded to take advantage of different payloads or new sensor technology. In a disaster response, incident commanders must coordinate personnel and resources effectively, often with delayed and inaccurate information on the hazards. Timely and accurate information collection is therefore critical to the central commander, since a delay in the received information could have catastrophic outcomes. Data processing, the step that follows information collection, is crucial in terms of time efficiency and accuracy in victim detection, and a logical workflow within the Data Viewer enhances the efficient handling of large amounts of data. The conceptual framework and the development of the Data Viewer are discussed in Section 3. The following section (Section 2) provides an overview of the UAV type and the sensors used for imaging. The data collected through the experiment were partially processed by the Data Viewer in order to assess its functionality.

2. UAV SYSTEM OVERVIEW

The UAV used in this experiment is the RPH2, a product of Fuji Heavy Industries Ltd., shown in Figure 1. It is 4.1 m long, 1.3 m wide and 0.8 m high. Table 1 shows the main specifications of the RPH2. Two operators were engaged in controlling the UAV in order to fly it along the predefined path for imaging the desired area.

Figure 1: RPH2 UAV

Table 1: Specification of the RPH2 UAV
  Weight: 330 kg
  Payload: 100 kg
  Motor: 83.5 hp
  Main rotor: 2 rotors, diameter 4.8 m
  Tail rotor: 2 rotors, diameter 0.8 m
  Range: 3 km or over
  Endurance: 1 hour
  Ceiling: 2,000 m

The data collection system consists of a digital still camera, a digital video camera and two GPS receivers, tightly attached to the UAV through a tailored steel platform as shown in Figure 2. The main specifications of these sensors are provided in Table 2. The still camera and one GPS are connected to a laptop computer, and the data collected by the GPS are downloaded to and saved on the laptop. Another GPS is connected to the laptop and synchronizes the still camera and the GPS. The synchronized camera-GPS imaging system was developed by Dr. Nagai (NAGAI et al., 2004). External batteries were used to supply power to the computer and the GPS receivers, while the cameras used their own batteries.

Figure 2: Sensor arrangement (GPS, still camera, video camera)

Table 2: Specification of cameras and GPS
  Digital still camera - Canon EOS 5D: image size 4368 x 2912 pixels, f: 24.0 mm, weight: 7 g, shutter speed: 9 sec
  Digital video camera - Canon iVIS HV20: frame size 1920 x 1080 pixels, f: 6.1 mm, weight: 6 g, frame rate: 24 fps
  GPS - Ashtech G12: differential accuracy: 4 cm, velocity accuracy: 0.1 (95%), weight: 4 g

The experiment was conducted on a riverside in order to map the riverside environment. Images of the surrounding area were captured from different elevations with the same configuration of the system.

3. FRAMEWORK OF DATA VIEWER

Time efficiency of the search and rescue operation in a disaster area, given the UAV type and the sensors used, is one of the most critical deciding factors for the success of the system. The entire process can be divided into various time components, among which the following are of particular interest for the present discussion:
- the time required to pre-process the raw GPS data;
- the time required to extract the useful flight time from the GPS data and to integrate it with the other sensor data;
- the time required for data quality checking;
- the time required to process the images for viewing;
- the time required for image processing and for identifying the target objects and their locations.
This step-by-step workflow from data acquisition to object detection reveals the need for an efficient data handling, processing and viewing system. This provides the basis for developing the Data Viewer, which encompasses the entire data processing task under one platform. Another important aspect of the system design is to define the flight of the UAV: efficient flight design is important for the efficient use of the limited flight duration of the UAV.
Based on the above requirements, the components of the Data Viewer are selected as follows:
- Flight Design
- Useful Flight Time Extraction
- Data Integration
- Data Quality Checking
- Image Processing for Object (Victim) Detection
- Image Browsing for Visual Inspection by the Rescue Team
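As a rough illustration of how these components could fit together, the following Python sketch wires them into a single processing pipeline. This is a minimal sketch under stated assumptions: the function names, signatures and stub bodies are illustrative placeholders and do not correspond to the actual module names or interfaces of the Data Viewer; only the ordering of the steps follows the list above.

from typing import Dict, List, Tuple

# Placeholder stubs; each would be replaced by the real Data Viewer component.
def design_flight(params: Dict) -> Dict:
    return {"plan": params}

def extract_useful_flight_time(gps_log: List[Tuple]) -> List[Tuple[float, float]]:
    return []

def integrate_gps_and_images(gps_log, images, useful_spans) -> List[Dict]:
    return []

def check_data_quality(records: List[Dict]) -> List[Dict]:
    return records

def detect_objects(records: List[Dict]) -> List[Dict]:
    return []

def run_data_viewer(gps_log, images, flight_parameters):
    """Run the Data Viewer components in the order listed above."""
    plan = design_flight(flight_parameters)                     # Flight Design
    spans = extract_useful_flight_time(gps_log)                 # Useful Flight Time Extraction
    records = integrate_gps_and_images(gps_log, images, spans)  # Data Integration
    records = check_data_quality(records)                       # Data Quality Checking
    detections = detect_objects(records)                        # Object (victim) detection
    return plan, records, detections                            # output feeds the image browser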

3.1 Flight Design

Accurate design of the flight path is important for UAV-based surveillance over a disaster area. A thorough understanding of the topography and the infrastructure of the target area is important for this purpose, and the success of the search and rescue operation largely depends on a successful flight design. A number of factors have been identified which influence this design:

For a low-cost UAV, flight duration and payload can be serious limitations, which should be considered while designing the system. A multi-camera system is often preferred over a single-camera system, since the former covers a larger area than the latter, but the number of cameras may be limited by the payload constraint. The choice of camera type and its physical dimensions are therefore important factors in designing the imaging system. For a multi-camera system, it is advantageous to use oblique cameras in order to increase the field of view and thereby the area coverage. The angle of inclination of such oblique cameras with respect to the vertical camera can be easily computed, provided that the required overlap (usually 10~15%) between the vertical and oblique images is supplied to the flight design computation. Figure 3 shows a possible imaging system with its camera arrangement.

After the camera arrangement has been selected, the area covered by one synchronized snap with this arrangement is computed for the given UAV flight height. Usually the flight height is determined according to the size of the UAV, since it cannot be flown so high that the operator can no longer see the orientation of the UAV. The flight height is also constrained by the required image resolution and the focal length of the camera. The area imaged in a single snap, together with the UAV flying speed, is then used to compute the time required to cover a unit area (e.g. 1 square kilometre) of the disaster scene. Such a time computation is important since the flight duration of the UAV is limited; it helps to decide where and when to land the UAV for refuelling. The required shutter speed (imaging interval) for the synchronized system can be determined from the above area coverage computation, based on the required overlap (usually 60~80%) between two consecutive images. One important output obtained from the above calculations is the size of the ground covered by one pixel. This information is necessary in order to compute the size of the target (disaster victims in this case) to be identified in the images. The workflow for the flight design computation is illustrated in Figure 4.

Figure 3: An example of camera arrangement in a test-bed onboard the UAV (vertical, forward oblique and side-looking images)

Figure 4: Step-by-step computation for the flight design (payload constraint, flight height, flight duration and required overlaps determine the number of cameras, the inclination of the oblique cameras, the area covered by the camera system during a synchronized snap, the required time to cover a unit area and the required shutter speed)
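To make the flight design arithmetic concrete, the sketch below computes, for a single nadir-looking camera, the ground size of one pixel, the footprint of one image, the exposure interval needed for a given forward overlap, and a rough time to cover one square kilometre. These are standard photogrammetric relations rather than the paper's own Flight Design module, and the example values (flight height, flying speed, pixel pitch) are assumptions, not figures reported for the experiment.

def flight_design(flight_height_m, focal_length_mm, pixel_pitch_um,
                  image_width_px, image_height_px,
                  forward_overlap, flight_speed_mps):
    """Ground pixel size, image footprint, exposure interval and time to cover 1 km^2
    for a nadir-looking camera flown at a constant height and speed."""
    # Ground sample distance: size of the ground covered by one pixel
    gsd_m = (pixel_pitch_um * 1e-6) * flight_height_m / (focal_length_mm * 1e-3)
    # Footprint of a single image on the ground
    footprint_along_m = gsd_m * image_height_px    # along-track
    footprint_across_m = gsd_m * image_width_px    # across-track
    # Distance advanced between exposures for the required forward overlap
    base_m = footprint_along_m * (1.0 - forward_overlap)
    exposure_interval_s = base_m / flight_speed_mps
    # Rough time to image 1 km^2 (single strip; turns and side overlap ignored)
    area_rate_m2_per_s = footprint_across_m * flight_speed_mps
    time_per_km2_s = 1e6 / area_rate_m2_per_s
    return gsd_m, footprint_along_m, footprint_across_m, exposure_interval_s, time_per_km2_s

# Assumed example: Canon EOS 5D geometry with a 24 mm lens, flown at 100 m and 5 m/s.
print(flight_design(flight_height_m=100, focal_length_mm=24, pixel_pitch_um=8.2,
                    image_width_px=4368, image_height_px=2912,
                    forward_overlap=0.6, flight_speed_mps=5))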
3.2 Useful Flight Time Extraction

After the UAV is deployed for data acquisition, it takes a while to reach the desired flight height and flight direction, and the UAV makes a few turns in the meantime in order to reach the desired orientation. Since the sensors start data acquisition as soon as the UAV starts flying, it is obvious that not all of the data are useful for the desired application. In other words, it is better to extract only the useful data before starting any heavy processing, such as object detection from the image sequences and calculation of the positions of the detected objects. This approach is time effective since it reduces the volume of data before the automated processing starts. The useful flight time is extracted from the GPS data alone (consisting of GPS time, latitude, longitude and height), which were acquired during the flight of the UAV. The graphs in Figures 5, 6 and 7, generated from the GPS data, are used in the development of the algorithm for useful flight time extraction.

Figure 5: Graph of latitude vs. longitude of the UAV track
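As a minimal sketch of how such an extraction might be implemented, the following Python function groups consecutive GPS records into segments of stable flight height (the "Point Flocks" discussed with Figures 5 to 8 below) and keeps only the segments that last long enough to correspond to an actual flight line. The record format and the tolerance and duration thresholds are assumptions for illustration; they are not the algorithm actually implemented in the Data Viewer.

def extract_useful_segments(gps_records, height_tolerance_m=3.0, min_duration_s=20.0):
    """gps_records: list of (gps_time_s, lat, lon, height_m) tuples sorted by time.
    Returns (start_time, end_time) spans during which the flight height stays within
    +/- height_tolerance_m of the running mean height of the current segment."""
    segments = []
    current = []  # candidate "point flock" of records at a stable height
    for rec in gps_records:
        if not current:
            current = [rec]
            continue
        mean_height = sum(r[3] for r in current) / len(current)
        if abs(rec[3] - mean_height) <= height_tolerance_m:
            current.append(rec)
        else:
            # Height left the band: close the candidate segment if it lasted long enough.
            if current[-1][0] - current[0][0] >= min_duration_s:
                segments.append((current[0][0], current[-1][0]))
            current = [rec]
    if current and current[-1][0] - current[0][0] >= min_duration_s:
        segments.append((current[0][0], current[-1][0]))
    return segments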

Figure 6: Graph of GPS time vs. flight height

Figure 7: Graph of latitude vs. longitude vs. height

Figure 8: Useful flight time extracted

Figure 5 shows the location of the UAV based on the GPS data (latitude and longitude) collected during the flight. From this graph it is obvious that the UAV took a few turns before reaching each actual flight position. This unnecessary manoeuvring needs to be eliminated in order to extract only the useful data. Figure 6 shows the graph of GPS time vs. flight height. It provides the clue that for each flight height there is a group of points, henceforth termed a Point Flock, which lie within a certain range of the average flight height. Figure 7 depicts a 3-D graph of latitude vs. longitude vs. height, which is very useful for understanding how the actual movement of the UAV took place. The UAV started from an arbitrary point and then rose to a predefined height before it started the actual flight. After completing one flight line, the UAV changed its height for the next flight. This process was repeated until all the predefined heights were covered, and the UAV then returned to the ground, completing the flight. Each flight line is more or less straight, with little fluctuation from the average flight height. The objective of extracting the useful flight time is to extract these straight lines with stable flight height. Figure 8 shows a sample result obtained from this module.

3.3 Data Integration

Since the camera and the GPS acquire data in a synchronized manner, only the images available within the useful flight time are considered for further processing, e.g. viewing and image processing for object detection. Data integration is carried out in such a manner that the GPS time is matched with the camera time. After the matching is complete, a new file is generated which contains the GPS data and the image number. Such integration is important from the photogrammetric point of view: since we want to compute the location of objects on the ground, we must know the time and the real-world position of the camera when the images were taken.

3.4 Data Quality Checking

Data quality checking is another important step before running heavy processing of the data, such as object detection from the image sequence or automated relative orientation of the images for viewing purposes. Here, data checking implies checking for the following missing data:
- GPS time: check whether any image was taken during a period of missing GPS data.
- Image interval: check the interval between consecutive images.
Due to mechanical faults or other reasons, the GPS sometimes fails to record data to the computer, and the shutter control system of the camera may fail for the same reason. If images are captured while GPS data are missing, those images cannot be used for computing the positions of the detected objects; in this sense they become useless and can be excluded before the image-processing module is applied. Usually the shutter speed is controlled in order to maintain the required overlap between consecutive images.
If, for some reason, the camera fails to acquire images within the predefined imaging interval, it may not be possible to run the relative orientation process.
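A minimal sketch of the time matching of Section 3.3 and the two checks described above is given below; it is illustrative only, and the record formats, the nominal 1-second GPS rate and the gap thresholds are assumptions rather than values taken from the Data Viewer implementation. Images are matched to the nearest GPS fix by time, images falling inside GPS gaps are excluded, and unusually long gaps in either the GPS log or the imaging sequence are reported.

def match_images_to_gps(gps_records, image_records, max_time_diff_s=0.5):
    """gps_records: list of (gps_time_s, lat, lon, height_m); image_records: list of
    (image_time_s, image_id). Returns integrated rows and the images left unmatched."""
    integrated, unmatched = [], []
    for img_time, img_id in image_records:
        nearest = min(gps_records, key=lambda r: abs(r[0] - img_time), default=None)
        if nearest is not None and abs(nearest[0] - img_time) <= max_time_diff_s:
            integrated.append((img_id, img_time) + nearest[1:])  # image number + camera position
        else:
            unmatched.append(img_id)  # taken inside a GPS gap: excluded from further processing
    return integrated, unmatched

def find_gaps(times, expected_interval_s, slack=1.5):
    """Report spans where consecutive records are further apart than expected,
    e.g. find_gaps(gps_times, 1.0) or find_gaps(image_times, planned_interval)."""
    return [(t0, t1) for t0, t1 in zip(times, times[1:])
            if t1 - t0 > slack * expected_interval_s]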

Figure 9 shows the temporal location of a GPS data error. The GPS captures data every 1 sec, but the figure shows that the GPS could not acquire data for more than 5 seconds. Any images taken during this missing-GPS-data period are excluded from further processing. Figure 10 shows that the camera failed to acquire images within the designated interval.

Figure 9: Check for GPS data (data acquisition interval vs. GPS time)

Figure 10: Check for the inconsistency in the imaging interval (image acquisition interval vs. image acquisition time)

3.5 Image Processing for Object Detection and Browsing Images for Human Interpretation

After the errors in the GPS data and images are eliminated, these data are used for:
- detecting objects (victims) from the image sequence;
- computing the locations of these objects;
- viewing by the human interpreter (rescue team commander) so that the resources for rescue can be allocated properly and in a timely manner.
This portion of the Data Viewer is yet to be developed; only the probable and useful functionalities of these modules have been defined so far. Since time effectiveness is one of the major issues in developing a successful search and rescue system, the image processing and object detection within the Data Viewer should be fast and robust. The processed images with the identified objects should also be presented in the viewer in such a manner that they are easy to browse and interpret.

4. CONCLUSION

The Data Viewer was still under development at the time of writing this paper. The following four parts have been developed and implemented for the flight design and partial processing of the experimental data: Flight Design, Useful Flight Time Extraction, Data Integration and Data Quality Checking. The applications of the image processing module and the image-viewing module are to be presented during the final presentation.

REFERENCES

EISENBEISS, H., 2004. A mini unmanned aerial vehicle (UAV): system overview and image acquisition. International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. XXXVI, Part 5/W1, on CD-ROM.

NAGAI, M., SHIBASAKI, R., MANANDHAR, D., ZHAO, H., 2004. Development of digital surface and feature extraction by integrating laser scanner and CCD sensor with IMU. ISPRS, Vol. XXXV, Part B5, Istanbul.

SUMITOMO, T., KURAMOTO, K., MIYAUCHI, H., YAMAMOTO, H., KUNISHI, T., 1997. Development of Image Processing Technique for Detection of the Rescue Target in the Marine Casualty. 9th International Conference (ICIAP '97), Florence, Italy, September 17-19, 1997, Proceedings, Volume II.
