Identifying Challenges for Aerial Robots Operating in Near-Earth Environments


Keith W. Sevcik, William E. Green and Paul Y. Oh
Drexel University, Philadelphia, PA
kws23@drexel.edu, weg22@drexel.edu, paul.yu.oh@drexel.edu

(IEEE Member; address all correspondence to this author. This work was supported in part by National Science Foundation CAREER award IIS 0347430.)

Abstract

Homeland security missions executed in near-earth environments are often time consuming, labor intensive and possibly dangerous. Aerial robots performing tasks such as bomb detection, search-and-rescue and reconnaissance could be used to conserve resources and minimize risk to personnel. Flying in environments heavily populated with obstacles poses many challenges, and little data exists to guide the design of vehicles and sensor suites operating in these environments. This paper explores the challenges encountered in implementing several different sensing technologies in near-earth environments. The results of applying these technologies to control a robotic blimp are presented to direct future work.

1 Introduction

Homeland security missions bring new and unfamiliar territories which must be patrolled and kept safe. Caves, forests and other near-earth environments, along with urban structures such as buildings and tunnels, are difficult and time consuming to safeguard. Furthermore, search-and-rescue missions are most often dangerous and require large, diverse task forces [2]. Robots offer a means to offset this demand on resources and personnel. Much of the research effort has been in applying ground-based robots [8]; however, flying or hovering offers capabilities unachievable by ground-based robots. Small-scale aerial platforms, such as micro air vehicles (MAVs), are capable of flying in environments heavily populated with obstacles and can assist in such missions. (Here, MAVs are defined as aerial vehicles capable of safe, controlled flight in near-earth environments; vehicles such as those used in [3], while small, move too fast to navigate areas densely populated with obstacles.) However, there are several constraints on MAVs and small unmanned aerial vehicles (UAVs) that conventional UAVs, such as the Predator, do not face. For example, equipping MAVs with larger-scale navigational sensor suites, such as inertial measurement units (IMUs), global positioning systems (GPS) and pressure sensors, is not feasible due to payload limitations. Furthermore, GPS-based methods will not work in buildings, tunnels or caves because satellite signals are occluded. The net effect is that small, lightweight (i.e. less than 100 g) alternative sensor suites are required for aerial vehicles flying in near-earth environments.

Figure 1: A 30 inch diameter blimp carrying a 14 gram mini wireless camera can provide surveillance images for use in disaster scenarios.

The assessment and evaluation of such sensor suites demands an aerial platform which is small and can fly safely and slowly in near-earth environments. Commercial vehicles currently being developed by Honeywell, BAE Systems and Piasecki Aircraft are capable of maneuvering in areas rich with obstacles, but they are not yet available as research platforms. Nonetheless, collision avoidance and autonomous navigation sensor suites will be needed and can be developed in parallel. A simple and safe platform, such as a blimp, can serve as a test bed for sensor suite evaluation.

Figure 1 shows a blimp with a 30 inch diameter (allowing it to fit through standard doorways) and a payload capacity of around 60 g. This is enough to carry a miniature wireless camera or stereo pair, compact sensors and other small electronic packages.

Prior work has demonstrated the ability to control and navigate aerial vehicles using a variety of sensing techniques. Vision-based guidance and control has been demonstrated by [3]. Optic flow sensors studied in [1] have been used to perform autonomous tasks with MAVs. Localization and guidance using wireless motes has been achieved in [12]. However, the difficulties faced in near-earth environments tend to segregate these sensing methods, making them effective for accomplishing only specific tasks. Little has been done to evaluate these technologies from a single, consistent platform.

This paper illustrates how these sensing techniques can be applied to a blimp. Section 2 discusses the blimp's platform characteristics and dynamics. Section 3 demonstrates the use of optic flow sensors, computer vision and wireless motes. Section 4 describes an indoor aerial robot competition conceived to identify the challenges these vehicles face. Finally, Section 5 concludes by summarizing and discussing future work.

2 Aerial Platform

Several aerial platforms have been experimented with and evaluated. Rotorcraft, such as helicopters or ducted fan units [7], can hover but are extremely difficult to control. Fixed-wing aircraft can be designed to fly at extremely slow speeds [9], but are limited by their payload capacities. Lighter-than-air vehicles, in contrast, are easy to fly, inexpensive, and capable of hovering.

2.1 Lighter-Than-Air Vehicles

Helium is the most common gas used in blimps today, with a lifting capacity of 1.02 kg/m^3 at standard temperature and pressure. The blimp holds roughly 0.17 m^3 of helium, giving it a theoretical lifting capacity of 174 g. Experimental results show an actual lifting capacity of 200 g. The total mass of the balloon, gondola, fins and mounting tape is 135.8 g. Therefore, the maximum payload that can be carried by the blimp is 64.2 g. This is substantially greater than typical near-earth MAVs, making it an ideal platform for testing a variety of sensors.

The blimp has two electric motors with attached propellers positioned on the gondola which allow forward and backward movement. These two motors can also pivot via a radio-controlled (RC) servo to provide an upward or downward angle to the thrust vector, as depicted in Figure 2, allowing the blimp to increase or decrease its altitude. Yaw (i.e. rotation about the vertical axis) is controlled by an electric motor and propeller placed in the blimp's rear fin.

Figure 2: Blimp diagram.

The general approach for modeling a blimp followed by [13], [14] and [15] assumes that:

1. The airship can be modeled as a rigid body, thereby neglecting aeroelastic effects.

2. The volume and mass of the airship can be considered constant.

This model is often applied to much larger blimps that use control surfaces to direct the craft. Since the system under investigation is much smaller, the following assumptions can be made to simplify the model:

3. The blimp is symmetric about the XZ plane.

4. The blimp is moving slowly enough, and is designed in such a way, that the aerodynamic forces are negligible.
Therefore, the dynamics of the blimp can be written as

M \dot{V} = F_d + F_g + F_p

where

V = [V_x, V_y, V_z, \omega_x, \omega_y, \omega_z]^T (velocities along, and angular rates about, the body axes)
M = 6 x 6 mass and inertia matrix
F_d = dynamic force vector (Coriolis and centrifugal terms)
F_g = gravity and buoyancy vector
F_p = propulsive force vector

The remainder of the system definition closely follows the derivation presented in [13]. All equations of motion are defined about a reference frame fixed to the body of the blimp, whose origin is located at the center of buoyancy, which is assumed to be coincident with the center of volume. The center of gravity of the airship is defined relative to the center of buoyancy. The mass matrix accounts for all masses and inertias present in the system, including virtual terms associated with the apparent added inertia of a blimp.

The dynamic force vector F_d is defined as follows:

F_d = \begin{bmatrix} -m_z V_z \omega_y + m_y V_y \omega_z \\ -m_x V_x \omega_z + m_z V_z \omega_x \\ -m_y V_y \omega_x + m_x V_x \omega_y \\ (J_z - J_y)\omega_z \omega_y + J_{xz}\omega_x \omega_y + (m_z - m_y)V_y V_z \\ (J_x - J_z)\omega_x \omega_z + J_{xz}(\omega_z^2 - \omega_x^2) + (m_x - m_z)V_x V_z \\ (J_y - J_x)\omega_y \omega_x - J_{xz}\omega_z \omega_y + (m_y - m_x)V_x V_y \end{bmatrix}

The gravity and buoyancy vector F_g is given by:

F_g = \begin{bmatrix} k_x(mg - B) \\ k_y(mg - B) \\ k_z(mg - B) \\ a_z k_y B \\ (-a_z k_x + a_x k_z)B \\ -k_y a_x B \end{bmatrix}

where k_x, k_y and k_z are the components of a unit vector in the direction of gravity, B is the buoyancy force, and a_x and a_z locate the center of gravity relative to the center of buoyancy.

Finally, the propulsive force vector F_p for this specific actuation scheme is given by:

F_p = \begin{bmatrix} T_p \cos\mu \\ T_t \\ T_p \sin\mu \\ -T_t d_{tz} \\ T_p(d_z \cos\mu - d_x \sin\mu) \\ T_t d_{tx} \end{bmatrix}

where

T_p = force from the thrust propellers
T_t = force from the turning propeller
\mu = angle of inclination of the thrust propellers
d_x, d_z = x and z locations of the thrust propellers
d_{tx}, d_{tz} = x and z locations of the turning propeller

Using these equations of motion, it is possible to apply an input force to the thrust propellers and the turning propeller and observe the resulting linear and angular velocities. By tuning the various constants used to characterize the system, the model can be made to closely approximate the reactions of the real-world system.
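To make the model concrete, the following minimal Python sketch integrates these equations of motion forward in time. It is an illustration only: the mass matrix is simplified to a diagonal, the gravity direction is held fixed in the body frame, and every numerical value (masses, inertias, propeller locations, thrust) is an assumed placeholder rather than a measured property of the blimp described here.

```python
# Minimal sketch of the blimp dynamics M * Vdot = F_d + F_g + F_p.
# All parameter values are illustrative placeholders.
import numpy as np

m = 0.2                 # total mass (kg), illustrative
mx = my = mz = 0.25     # apparent masses including added-inertia terms (kg)
Jx, Jy, Jz, Jxz = 0.010, 0.012, 0.011, 0.001  # inertias (kg m^2)
B = 1.96                # buoyancy force (N); near-neutral, B ~ m*g
ax, az = 0.0, -0.05     # CG offset from center of buoyancy (m)
dx, dz = 0.0, -0.1      # thrust propeller location (m)
dtx, dtz = -0.5, 0.0    # turning propeller location (m)
g = 9.81

M = np.diag([mx, my, mz, Jx, Jy, Jz])  # simplified diagonal mass matrix

def f_dynamic(V):
    """Coriolis and centrifugal terms F_d."""
    Vx, Vy, Vz, wx, wy, wz = V
    return np.array([
        -mz*Vz*wy + my*Vy*wz,
        -mx*Vx*wz + mz*Vz*wx,
        -my*Vy*wx + mx*Vx*wy,
        (Jz - Jy)*wz*wy + Jxz*wx*wy + (mz - my)*Vy*Vz,
        (Jx - Jz)*wx*wz + Jxz*(wz**2 - wx**2) + (mx - mz)*Vx*Vz,
        (Jy - Jx)*wy*wx - Jxz*wz*wy + (my - mx)*Vx*Vy])

def f_gravity(k):
    """Gravity and buoyancy terms F_g; k is the (fixed) gravity unit vector."""
    kx, ky, kz = k
    return np.array([kx*(m*g - B), ky*(m*g - B), kz*(m*g - B),
                     az*ky*B, (-az*kx + ax*kz)*B, -ky*ax*B])

def f_propulsion(Tp, Tt, mu):
    """Propulsive terms F_p for thrust force Tp, turning force Tt, tilt mu."""
    return np.array([Tp*np.cos(mu), Tt, Tp*np.sin(mu),
                     -Tt*dtz, Tp*(dz*np.cos(mu) - dx*np.sin(mu)), Tt*dtx])

# Forward-Euler integration of the body-frame velocities over 5 seconds.
V, dt = np.zeros(6), 0.01
for _ in range(500):
    F = f_dynamic(V) + f_gravity([0.0, 0.0, 1.0]) + f_propulsion(0.05, 0.0, 0.0)
    V = V + dt * np.linalg.solve(M, F)
print(V)  # resulting linear and angular velocities
```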

2.2 PC-to-RC

To allow the blimp to be autonomously controlled by a ground-based PC, a PC-to-RC circuit was constructed [10]. Figure 3 shows how the circuit is interfaced with the PC and a standard 4-channel RC transmitter. This setup allows digital commands sent from the PC to be converted into pulse width modulated (PWM) signals, which are then sent wirelessly to the blimp's onboard receiver.

Figure 3: A PC-to-RC circuit converts digital commands to RC signals. Commands are then sent wirelessly to the blimp through an RC transmitter.

The control software running on the PC generates 8-bit numbers for each of the 4 channels on the transmitter. The numbers correspond to the length of the PWM signal. Pulse lengths vary from 1 to 2 ms, where 1.5 ms usually represents the neutral position of an RC servo. The microcontroller, integrated into the PC-to-RC circuit, receives the numbers and generates the pulses to be sent to the RC transmitter. The pulses are grouped into frames, with a frame containing one pulse for each channel. Figure 5 shows the signal that would be sent to a 4-channel transmitter.

Figure 5: Signal from microcontroller to transmitter.

The frames sent from the microcontroller are received through the buddy port on the transmitter. Traditionally, the buddy port is used to allow a trainer to take over control of an amateur under their tutelage. This port can also be used to allow the computer to take control of the transmitter. Autonomous control can then be achieved based on information gathered about the surrounding environment.
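As an illustration of the encoding just described, the sketch below maps 8-bit channel values to 1-2 ms pulse widths and groups them into a four-channel frame. The function names and the exact linear mapping are assumptions for illustration; the actual microcontroller firmware is described in [10].

```python
# Sketch of the PC-side encoding: each of the 4 channels is an 8-bit
# number that maps linearly to a 1-2 ms pulse width; one frame carries
# one pulse per channel. Names and mapping endpoints are illustrative.

def channel_to_pulse_ms(value: int) -> float:
    """Map an 8-bit channel value (0-255) to a pulse width in ms."""
    if not 0 <= value <= 255:
        raise ValueError("channel value must fit in 8 bits")
    return 1.0 + value / 255.0  # 0 -> 1.0 ms, ~128 -> 1.5 ms (neutral), 255 -> 2.0 ms

def build_frame(channels: list[int]) -> list[float]:
    """Return the pulse widths (ms) for one frame of a 4-channel transmitter."""
    assert len(channels) == 4, "one pulse per channel"
    return [channel_to_pulse_ms(v) for v in channels]

# Example: throttle and yaw neutral, thrust-vector servo tilted slightly.
print(build_frame([128, 128, 160, 128]))  # -> [~1.5, ~1.5, ~1.63, ~1.5]
```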

3 Sensors

Intelligence obtained from sensors allows the robot's control system to make sophisticated decisions. In addition to traditional sensors such as sonar, infrared (IR) and vision, biomimetic sensors can be constructed as lightweight packages. Integrating such hardware can produce a robust sensor suite for near-earth environments.

3.1 Biomimetic Sensing

Insects make heavy use of vision, especially optic flow, for perceiving the environment [4]. Optic flow refers to the apparent movement of texture in the visual field relative to the insect's velocity. Insects perform a variety of tasks in complex environments by using their natural optic flow sensing capabilities. While in flight, for example, objects which are in close proximity to the insect have higher optic flow magnitudes. Thus, flying insects such as fruit flies [11] and dragonflies avoid imminent collisions by saccading (or turning) away from regions of high optic flow.

Capturing such sensing techniques in a packaged sensor is a vast research area. Neuromorphic chips have been available for many years [6]. However, to achieve the desired weight of 1-2 grams, mixed-mode and mixed-signal VLSI techniques [5] are used to develop compact circuits that directly perform the computations necessary to measure optic flow [1]. Centeye has developed the one-dimensional Ladybug optic flow microsensor based on such techniques. A lens focuses an image of the environment onto a focal plane chip which contains photoreceptors and other circuitry necessary to compute optic flow. Low-level feature detectors respond to different spatial or temporal entities in the environment, such as edges, spots, or corners. The elementary motion detector (EMD) is the most basic circuit that senses visual motion, though its output may not be in a ready-to-use form. Fusion circuitry fuses information from the EMDs to reduce errors, increase robustness, and produce a meaningful representation of the optic flow for specific applications. The resulting sensor, including optics, imaging, processing and I/O, weighs 4.8 grams. It grabs frames at up to 1.4 kHz, measures optic flow up to 20 rad/s (4-bit output), and functions even when texture contrast is just several percent.

Integrating insect flight patterns with Centeye's hardware, collision avoidance was demonstrated using the blimp (see Figure 4). Although Centeye's optic flow sensors are not yet available commercially, Agilent Technologies' ADNS-2051 optical sensor can be utilized to achieve similar results.

Figure 4: Optic flow is used to sense when an obstacle is within close proximity of the blimp. The blimp avoids the collision by giving full throttle to the yawing motor.
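The saccading behavior can be summarized in a few lines of control logic. The sketch below is a hypothetical illustration, not Centeye's firmware: the sensor readings, threshold value and command scaling are all assumed.

```python
# Illustrative saccade-style avoidance rule: when the optic flow magnitude
# reported on one side exceeds a threshold (an obstacle is close), command
# full yaw away from it. Readings and threshold are hypothetical.

SACCADE_THRESHOLD = 10.0  # rad/s, illustrative

def yaw_command(flow_left: float, flow_right: float) -> float:
    """Return a yaw motor command in [-1, 1]; positive = turn right."""
    if flow_left > SACCADE_THRESHOLD and flow_left >= flow_right:
        return 1.0    # high flow on the left: saccade right
    if flow_right > SACCADE_THRESHOLD:
        return -1.0   # high flow on the right: saccade left
    return 0.0        # no imminent collision: fly straight

print(yaw_command(12.0, 3.0))  # -> 1.0, turn away from the left obstacle
```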

3.2 Computer Vision

To perform more sophisticated vision techniques such as line following, a wireless image acquisition system is required. RC Toys' Eyecam (http://www.rctoys.com/eyecam.php) provides a reliable wireless video feed when utilized indoors. It is about as small as a US quarter coin, weighs just 15 grams and transmits color video on the 2.4 GHz frequency. The output from the receiver is composite video, which can be digitized with Hauppauge's USB-Live (http://www.hauppauge.com) in order to plug-and-play into a PC.

To demonstrate line following, the blimp was placed over a black line on a white background. A program was created to process the video feed: the video was thresholded into a simple black and white image, the location of the centroid of the line within the image plane was calculated, and PD control was then implemented to direct the blimp along the line (see Figure 6). Realistically, such ideal environments will not be encountered. However, the same path-following techniques can be applied if the location of the blimp is known.

Figure 6: A wireless camera is coupled with a computer vision algorithm to achieve line following.
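The threshold-centroid-PD pipeline described above can be sketched as follows. This is a minimal illustration assuming a grayscale frame is already available from the digitizer; the threshold value and PD gains are placeholders, not the values used in the experiment.

```python
# Sketch of the line-following pipeline: threshold the frame, find the
# line centroid, apply PD control on its horizontal offset to steer.
import numpy as np

KP, KD = 0.01, 0.05   # PD gains, illustrative
prev_error = 0.0

def steering_command(gray_frame: np.ndarray) -> float:
    """gray_frame: 2-D array of pixel intensities (dark line on white)."""
    global prev_error
    mask = gray_frame < 100                  # threshold: black line pixels
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return 0.0                           # line lost: no correction
    centroid_x = xs.mean()                   # line centroid in the image plane
    error = centroid_x - gray_frame.shape[1] / 2.0   # offset from image center
    d_error = error - prev_error
    prev_error = error
    return KP * error + KD * d_error         # yaw correction toward the line

# Example with a synthetic 120x160 frame containing a dark vertical stripe.
frame = np.full((120, 160), 255, dtype=np.uint8)
frame[:, 100:104] = 0
print(steering_command(frame))  # positive: line is right of center, yaw right
```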
3.3 Wireless Mote Localization

Wireless motes provide a means for localizing the blimp. The term "motes" refers to a general class of technologies aimed at providing small, robust and versatile sensors that are easily deployable over a wide area. Such sensor networks could be distributed in factories to monitor manufacturing conditions, spread over fields to log environmental conditions for agriculture, or mixed into concrete to actively measure building stresses and vibrations.

The smartdust series of motes manufactured by Crossbow Technologies (http://www.xbow.com) consists of small wireless transceivers which can be interfaced with any sensor. Crossbow offers two common packages, the MICA2 and the MICA2DOT. At the core of these motes is an ATmega128L AVR microprocessor, which executes all of the code programmed into the mote.

Code is written for the TinyOS operating system, an event-driven operating system that handles low-level microprocessor and radio networking tasks. This intrinsic networking ability allows for quick development of networks of wireless motes. The motes decide the most efficient network arrangement, resulting in an ad hoc network. The TinyOS architecture also supports multihopping, allowing two motes out of range of each other to pass information through intermediate motes.

The radio module used by the MICA2 and MICA2DOT provides a measurement of the strength of received signals. The signal strength between a mobile mote attached to the blimp and wireless motes on the ground can be used to determine the relative position of the robot. If the locations of the ground-based motes are known, the robot can be localized. Such a strategy could be used to determine the absolute position of an aerial vehicle, the location of a vehicle relative to a target, or the position of an aerial vehicle relative to ground-based robots carrying motes.

To demonstrate this capability, a program was written to cause one of the motes to act as a beacon. Ground-based motes that detected this beacon were programmed to relay the strength of the received signal to a computer base station. These strengths were displayed using a Visual Basic GUI which indicated motes in proximity to the beacon (see Figure 7).

Figure 7: Signal strength is measured between a mobile node attached to the blimp and fixed nodes placed on a table. As the blimp passes by, the graphic corresponding to the nearest node is lit.
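As a sketch of how such signal strengths could be turned into a position estimate, the following code computes a signal-weighted centroid over ground motes with known positions. This is illustrative only: the demonstration described above simply highlighted the nearest node, and all node positions and RSSI values here are hypothetical.

```python
# Sketch of mote-based localization: ground motes with known positions
# report the received signal strength of the blimp's beacon; the blimp
# is estimated to lie at a signal-weighted centroid of the nodes.
# Node positions and RSSI values are hypothetical.

ground_motes = {            # mote id -> (x, y) position in meters
    "A": (0.0, 0.0),
    "B": (5.0, 0.0),
    "C": (2.5, 4.0),
}

def estimate_position(rssi: dict[str, float]) -> tuple[float, float]:
    """Weighted-centroid estimate from RSSI in dBm (higher = closer)."""
    weights = {k: 10 ** (v / 10.0) for k, v in rssi.items()}  # dBm -> linear power
    total = sum(weights.values())
    x = sum(w * ground_motes[k][0] for k, w in weights.items()) / total
    y = sum(w * ground_motes[k][1] for k, w in weights.items()) / total
    return (x, y)

print(estimate_position({"A": -70.0, "B": -55.0, "C": -65.0}))  # nearest node B
```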

4 Aerial Robot Competition

In investigating different sensing methodologies, several questions arose about the use of aerial robots in near-earth environments. Problems such as the use of distributed versus local computing, the effect of environmental obscurants, and the range, resolution and robustness of sensors were common across different sensing technologies and aerial platforms. An annual indoor aerial robot competition was conceived to help identify these issues and encourage innovative solutions. The challenges addressed would increase in complexity, with the goal of achieving full autonomy by the year 2015.

In May 2005, Drexel University organized the first indoor aerial robot competition. The inaugural competition, featuring undergraduate teams from Drexel University and Swarthmore College (advised by Professor Bruce Maxwell), focused on both autonomous navigation and target identification in urban-like areas. (Thanks to Professors Hong Zhang and Rungun Nathan, from Rowan and Villanova Universities respectively, for judging the competition.)

4.1 Autonomous Collision Avoidance

One of the major challenges of autonomous flight in near-earth environments is the limited availability of GPS. This was mimicked by hosting the competition indoors. The autonomous collision avoidance section utilized a 90 x 20 foot space populated with obstacles such as telephone poles and wire, urban structures, trees, etc. (see Figure 8). While these obstacles were symbolic of an outdoor setting, hosting the competition indoors precludes the use of GPS in future competitions. The obstacles were overlaid on a white cloth, and a black line ran through the course to denote a collision-free path. Teams had to implement a line following algorithm in real time that was invariant to changing lighting conditions (a glass roof enabled sunlight to illuminate portions of the course) and to noise from indoor video transmission. Towards the end of the course, robots were met with a low-speed fan to simulate wind disturbances. Points were awarded based on how far through the course robots were able to travel.

Figure 8: Swarthmore College's blimp following the collision-free path.

4.2 Teleoperated Target Identification

The other section of the competition consisted of several mock victims spaced out in a 90 x 50 foot area. These victims were positioned as if unconscious, perhaps as a result of a chemical or biological agent released through the ventilation system of an office building (see Figure 9). Using a wireless camera mounted on the blimp's gondola, teams used teleoperated control to identify survivors and deploy markers (symbolic of radio beacons) pinpointing their locations before hazmat teams arrive. Blimp operators were only permitted to view video images transmitted wirelessly from the blimp's camera and could not directly view the search area. Points in this section were awarded based on marker proximity to survivors.

Figure 9: In the search-and-rescue portion, teams had to locate victims by viewing images transmitted from the robot's wireless camera.

4.3 Results

The difficulty of the line following section was evident after practice runs for each team. To compensate for this, each team was allotted two restarts (i.e. the blimp could be placed back at the position where it last lost the line). With the incorporation of this rule, both teams were able to follow the line until reaching the fan area, a distance of 75 feet. Once confronted with low-speed wind currents, each team's blimp was immediately blown off course, unable to demonstrate gust stabilization. The target identification task also proved to be difficult: teams were only able to locate and mark 1 to 4 victims out of a possible 8. In addition to the scores accumulated in the collision avoidance and target identification sections, each team was also judged on the design of both the flight system and the marker deployment mechanism. The overall winner of the 2005 competition was Drexel University.

The key challenges identified in the inaugural competition were found mostly in the line following section. For example, sunlight shined sporadically on the course, resulting in large gradients which affected the performance of the computer vision algorithms. Also, wireless video transmission indoors is diminished, but still usable at short distances (i.e. 100 feet). Furthermore, stabilizing an aerial robot in the presence of wind gusts remains a prevalent challenge.

In the teleoperated portion of the competition, teams found it difficult to interpret the raw video transmitted from the blimp's wireless camera. A bird's-eye view is oftentimes unfamiliar to the operator and may require some image processing (e.g. object recognition) techniques to identify victims, tables, chairs, etc. During the teleoperated portion of the course, one of the teams lost control of their blimp when it was flown over a portion of the course that had been heated by sunlight. This observation identified thermals as a major concern for aerial robots operating in near-earth environments.

5 Conclusions

The design of a sensor suite for a MAV varies greatly from the sensor suites utilized on traditional UAVs. Flying below treetops or in and around urban structures prevents the use of GPS. Furthermore, devices such as IMUs and gyros often strain the payload capacities of small, lightweight aircraft. Design then focuses on achieving fundamental autonomous tasks, such as altitude control and obstacle avoidance, using the smallest packages possible. However, even the most highly developed control system will fail when presented with unforeseen obstacles. Telephone wires, for example, are extremely thin but could easily be fatal to a MAV. Such near-earth environment impediments demand the use of varied sensing technologies to ensure robustness. Through fusion of optic flow sensing, vision-based guidance and wireless network localization, aerial vehicles are provided with a diverse sensor suite capable of addressing the issues faced. This paper demonstrates the porting of these techniques onto a robotic blimp, which provides a robust, versatile platform whose dynamics are well understood and documented. To begin to characterize these sensor suites, future work must measure the reactions of these sensors to variables introduced in a controlled near-earth environment. To facilitate controller design, experimental results must be duplicated in simulated models.
With well-understood models and corroborating physical data, design can then move towards making MAVs fully autonomous in near-earth environments.

References

[1] Barrows, G., Mixed-Mode VLSI Optic Flow Sensors for Micro Air Vehicles, Ph.D. Dissertation, University of Maryland, College Park, MD, Dec. 1999.

[2] Blitch, J., World Trade Center Search-and-Rescue Robots, Plenary Session, IEEE Int. Conf. on Robotics and Automation, Washington, D.C., May 2002.

[3] Ettinger, S.M., Nechyba, M.C., Ifju, P.G., Waszak, M., Vision-Guided Flight Stability and Control for Micro Air Vehicles, IEEE/RSJ Int. Conf. on Robots and Systems, Lausanne, Switzerland, pp. 2134-2140, October 2002.

[4] Gibson, J.J., The Ecological Approach to Visual Perception, Houghton Mifflin, 1950.

[5] Harrison, R., Koch, C., An Analog VLSI Implementation of a Visual Interneuron: Enhanced Sensory Processing Through Biophysical Modeling, International Journal of Neural Systems, 9:391-395, 1999.

[6] Higgins, C., Sensory Architectures for Biologically-Inspired Autonomous Robotics, The Biological Bulletin, vol. 200, pp. 235-242, April 2001.

[7] Hamel, T., Mahony, R., Chriette, A., Visual Servo Trajectory Tracking for a Four Rotor VTOL Aerial Vehicle, IEEE Int. Conf. on Robotics and Automation (ICRA), Washington, D.C., pp. 2781-2786, 2002.

[8] Murphy, R., et al., Mobility and Sensing Demands in USAR, IEEE Industrial Electronics Conference (IECON), vol. 1, pp. 138-142, 2000.

[9] Nicoud, J.D., Zufferey, J.C., Toward Indoor Flying Robots, IEEE/RSJ Int. Conf. on Robots and Systems, Lausanne, pp. 787-792, October 2002.

[10] Sevcik, K., Oh, P., PC to RC Interface, Servo Magazine, July 2004.

[11] Tammero, L.F., Dickinson, M.H., The Influence of Visual Landscape on the Free Flight Behavior of the Fruit Fly Drosophila melanogaster, Journal of Experimental Biology, v205, pp. 327-343, 2002.

[12] Corke, P., Peterson, R., Rus, D., Networked Robots: Flying Robot Navigation Using a Sensor Net, in ISRR, 2003.

[13] Gomes, S.B.V., Ramos, J.G., Airship Dynamics Modeling for Autonomous Operation, Proceedings of the 1998 IEEE Int. Conf. on Robotics and Automation, Leuven, Belgium, pp. 3462-3467, 1998.

[14] Kim, J., Keller, J., Kumar, V., Design and Verification of Controllers for Airships, Proceedings of the 2003 IEEE Int. Conf. on Intelligent Robots and Systems, Las Vegas, Nevada, pp. 54-60, 2003.

[15] Hygounenc, E., Jung, I-K., Soueres, P., Lacroix, S., The Autonomous Blimp Project of LAAS-CNRS: Achievements in Flight Control and Terrain Mapping, International Journal of Robotics Research, vol. 23, pp. 463-512, 2004.