Autonomous Quadrotor for the 2013 International Aerial Robotics Competition


Hengyu "Robbie" Hu, Mechanical Engineering, minor Computer Engineering, 2014; John Rafael Aleman Pericon, Mechanical Engineering, 2014; Han-Wei "Bill" Chen, Computer Engineering, 2014; Andy Mo, Computer Engineering, 2014; Lu Quan Tan, Mechanical Engineering, 2014; Hsiang-Wei "Kevin" Ma, Biomedical Engineering, 2015

ABSTRACT: The Boston University Unmanned Aerial Vehicles Team (BU UAV TEAM) will compete in the 2013 International Aerial Robotics Competition (IARC) with a custom-built quadrotor capable of traversing the narrow corridors of an unknown building using Simultaneous Localization and Mapping (SLAM) algorithms. While exploring, the vehicle uses an image recognition program to identify the assistive Arabic signs and the flash drive. Finally, a passive retrieval mechanism consisting of adhesive and a magnet secures the flash drive and releases a decoy through mechanical levers. Including the return flight, the entire mission must be completed within a ten-minute limit.

1. INTRODUCTION: This document presents the system the Boston University Unmanned Aerial Vehicles Team has designed and implemented for the 2013 International Aerial Robotics Competition, which will be held in Grand Forks, North Dakota from August 5 to August 8.

1.1 Statement of the Problem: The goal of the 2013 IARC is to create a small aerial robot capable of fully autonomous flight through a confined environment. The mission is composed of multiple parts that test the ability of a UAV to avoid obstacles, locate key objectives, and retrieve said objectives (in this case, a flash drive). Overall, the mission can be divided into four stages. Stage 1 tests the UAV's ability to locate and travel through a window while ascertaining when it is safe to do so. Stage 2 tests the UAV's obstacle avoidance ability as well as its mapping and search algorithms for finding the flash drive.
Stage 3 of the mission tests the UAV's ability to retrieve the flash drive and leave a fake in its place. The final stage tests the UAV's ability to locate the exit as quickly as possible and land at a predetermined location.

1.2 Conceptual Solution to the Problem: The Boston University UAV TEAM has developed an autonomous aerial vehicle system that can explore an indoor environment without GPS. The system uses a quadrotor platform and incorporates a vertically facing RGB camera, a laser range finder (LIDAR), an RGB-D camera (Microsoft Kinect), and an ultrasonic sensor. The vertically facing camera and the Kinect's front-facing RGB camera feed video streams to object recognition software that recognizes the blue LED, the posted Arabic signs, and the flash drive to assist navigation and the flash drive retrieval maneuver. The LIDAR feeds distance and angle measurements of obstacles around the quadrotor to perform Simultaneous Localization and Mapping (SLAM). Path planning software uses the SLAM data to guide the vehicle in exploring frontiers effectively while searching for the Arabic signs or the flash drive. The Kinect's depth data is used to assist entry through the window, to avoid frontal obstacles, and, together with the ultrasonic sensor, to determine height for altitude stabilization. The retrieval mechanism collects the flash drive with a magnet and adhesive while releasing the decoy via mechanical levers.

(Figures: BU UAV hardware architecture; system engineering diagram)
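As a rough illustration of the data flow just described, the routing of sensor streams to their consumers could be sketched as below. All module and topic names here are hypothetical, not the team's actual software.

```python
# Illustrative sketch only: route named sensor streams to the processing
# modules that consume them, mirroring the architecture described above.
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class SensorBus:
    """Routes named sensor messages to every module subscribed to them."""
    subscribers: Dict[str, List[Callable]] = field(default_factory=dict)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self.subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, msg) -> list:
        # Deliver the message to each subscriber in registration order.
        return [handler(msg) for handler in self.subscribers.get(topic, [])]

bus = SensorBus()
# LIDAR scans feed SLAM; camera frames feed object recognition;
# Kinect depth feeds both obstacle avoidance and altitude estimation.
bus.subscribe("lidar_scan", lambda scan: f"slam({len(scan)} beams)")
bus.subscribe("rgb_frame", lambda img: f"opentld({img})")
bus.subscribe("kinect_depth", lambda d: f"obstacle_check({d})")
bus.subscribe("kinect_depth", lambda d: f"altitude_est({d})")

print(bus.publish("kinect_depth", "frame0"))
```

The point of the sketch is only that one depth stream serves two consumers, as the Kinect does in the actual system.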

1.3 Yearly Milestones: This is our first year participating in the IARC. The BU UAV TEAM aims to develop a robust indoor UAV system that can adapt to various indoor missions and complete all of IARC's required mission elements.

2. AIR VEHICLE: We decided to build a quadrotor because of its flexibility, stability, maneuverability, and affordability. In addition, a multitude of well-developed, open-source, stable control systems exist for this platform. The BU UAV TEAM quadrotor weighs approximately 1.49 kg and spans 22.2 inches in width. The carbon fiber plates and hexagonal carbon fiber frames were machined on a CNC mill and a laser cutter, and a 3D printer was used to manufacture the flash drive retrieval and dropping mechanisms. (Figures: CAD model overview; partial assembly)

2.1 Propulsion and Lift System: Four 10x4.7 propellers lift the quadrotor; the Hyperion ZS2213-18 brushless motors are each driven by a 30A Turnigy ESC. These motors were chosen for their high lift capacity. Although this parameter is not crucial in the actual competition, it allows the vehicle to carry the additional weight of redundant safety equipment and a larger battery during the research and development stage.

2.2 Guidance, Navigation, and Control: When hovering, the Openpilot CC3D alone can accurately maintain the vehicle's roll and pitch; however, it cannot accurately maintain the vehicle's x, y, z position and has trouble with yaw. We continue to rely on the inertial measurement unit on the CC3D to stabilize roll and pitch. Along with the Maxbotix ultrasonic sensor, we use the Kinect's depth data to calculate altitude by detecting the floor, stabilizing the vehicle's z position. The LIDAR assists maneuvering in the x and y directions and in yaw through the SLAM software. The SLAM software generates a two-dimensional occupancy grid map and locates the vehicle's relative position within it.
Using this map, the vehicle maintains a minimum distance from obstacles to avoid collision. The Kinect's depth data is also used to reinforce the LIDAR for frontal obstacle avoidance.

2.3 Stability Augmentation System: The inner loops of the quadrotor's stability control are handled by Openpilot; however, the IMU is not accurate enough to keep the vehicle from drifting in the x, y, and z directions. The Maxbotix MB1330 ultrasonic sensor was picked specifically for its beam pattern; this

sensor provides both high noise tolerance and sufficient sensitivity. The Starmac-ros-package is used to implement floor detection and height measurement. Together, the Kinect and the ultrasonic sensor give a more accurate altitude measurement, offsetting drift in the z direction.

2.4 Navigation: The vehicle follows the SLAM path by default but switches to tracking the flash drive or a sign when one is recognized. The vehicle's current state and the occupancy grid map created by SLAM determine its navigational behavior. In the occupancy grid map, areas near walls and obstacles receive low values while areas near unexplored territory receive high values. This way, the vehicle is drawn to the frontiers and will eventually map the entire compound. Should one of the camera sensors detect a mission element such as the sign, the vehicle will focus on exploring beyond the general area of the sign until it spots the flash drive.

2.5 Flight Termination System: The flight termination system is built from the kill switch kit supplied by the IARC organizers. The switch is composed of three NMOS FETs. The flight termination system is powered by the onboard batteries and, when closed, allows uninterrupted flow of electricity from the batteries to the motors. An XBee module serves as the independent receiver that opens the switch when a kill signal is received.

3. PAYLOAD

3.1 Sensor Suite:

3.1.1 GNC Sensors: Sensors responsible for guidance, navigation, and control include the Hokuyo LIDAR, the Kinect, and the Openpilot board. The Openpilot board keeps the UAV stable while it traverses the environment within safety parameters set by SLAM. The Kinect detects additional obstacles that the LIDAR cannot, and the ultrasonic sensor helps the vehicle maintain a set height.

3.1.2 Mission Sensors: Sensors responsible for detecting target objects include the Kinect's RGB camera and the vertically facing camera.
Using a dual-band mini PCI Wi-Fi adapter (450 Mbps), the Atom board streams live video from these two cameras back to the ground control station to determine whether target objects are in sight.

3.1.3 Target Identification: The Kinect's RGB camera and the vertically facing camera constantly scan for mission elements such as the sign and the flash drive. One powerful image recognition algorithm that can achieve such detection and tracking is OpenTLD (Tracking, Learning, and Detection), developed by Zdenek Kalal, a Czech researcher at the University of Surrey. The method simultaneously tracks a selected object, learns its appearance, and detects its position. OpenTLD is a very flexible object recognition system, which may allow the UAV to quickly adapt to new target objects if

the mission demands it. Since the OpenTLD source code is published under the terms of the GNU General Public License, and given the non-profit nature of the competition, a C++ implementation of OpenTLD by Georg Nebehay was modified and used to improve the chance of mission success. The image below shows a test run of the OpenTLD algorithm. Despite receiving video from a low-resolution camera, the TLD algorithm was still able to track the sign's position and size with high confidence (averaging 70%) while maintaining an average of 30 fps, suggesting it may be more suitable than optical character recognition (OCR) or the Speeded-Up Robust Features (SURF) algorithm. The vehicle will be exposed to the target objects preflight so that OpenTLD can learn what they look like from all possible orientations; each learned model is then saved to a configuration file. During the mission, the ground control station runs three OpenTLD instances simultaneously, each loaded with a previously saved configuration file. The first and second instances monitor the Kinect's RGB camera feed: one looks for the doorplate while the other looks for the flash drive. The third instance monitors the vertically facing camera feed for the flash drive. Once a target has been identified, the UAV will focus on it and attempt to get closer while maintaining target lock. Should target lock fail, the UAV will return to its last known state where it still held the lock and attempt a different avenue of approach. Once the third OpenTLD instance has helped maneuver the UAV directly above the flash drive, the vehicle will proceed to retrieve it.

3.1.4 Threat Avoidance: Walls are the primary threat during the exploration of an unknown indoor compound. Using the two-dimensional occupancy grid map that SLAM provides, the vehicle travels along an optimal path that keeps a minimum distance from walls and obstacles.
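The occupancy-grid scoring described in Sections 2.4 and 3.1.4 can be sketched as below: cells near walls score low, cells bordering unexplored space score high, so greedy selection pulls the vehicle toward frontiers. The grid encoding and weights here are illustrative assumptions, not the team's SLAM output format.

```python
# Illustrative sketch of frontier-biased navigation over an occupancy grid.
# Assumed encoding (not the team's): 0 = free, 1 = occupied, -1 = unexplored.

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def neighbors(grid, r, c):
    """Yield the 4-connected neighbor values of cell (r, c)."""
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < len(grid) and 0 <= cc < len(grid[0]):
            yield grid[rr][cc]

def score(grid, r, c):
    """Score a free cell: penalize wall adjacency, reward frontier adjacency."""
    if grid[r][c] != FREE:
        return float("-inf")
    s = 0.0
    for val in neighbors(grid, r, c):
        if val == OCCUPIED:
            s -= 10.0   # keep a minimum distance from walls and obstacles
        elif val == UNKNOWN:
            s += 5.0    # frontier: unexplored territory next door
    return s

def best_frontier_cell(grid):
    cells = [(r, c) for r in range(len(grid)) for c in range(len(grid[0]))]
    return max(cells, key=lambda rc: score(grid, *rc))

grid = [
    [1, 1, 1,  1],
    [1, 0, 0, -1],   # cells in column 2 border unexplored space
    [1, 0, 0, -1],
    [1, 1, 1,  1],
]
print(best_frontier_cell(grid))   # a free cell on the frontier edge
```

Repeating this selection as the map grows is what eventually maps the entire compound, since scored frontiers disappear only once everything is explored.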
The UAV will keep exploring the environment until it finds a target object or no frontiers remain. Because the occupancy grid map is only two-dimensional, any obstacle shorter than a certain height will not be recorded in the SLAM process.

Closely passing over such obstacles can severely threaten the vehicle's stability, either by tricking the ultrasonic sensor into believing the UAV is much lower than it really is or through turbulence caused by sudden ground effects. To avoid this scenario, a simplified point cloud map built from the Kinect's depth sensor is used to detect obstacles in front of the aircraft. The Starmac-ros-package created by the UC Berkeley Hybrid Systems Lab can be used to implement this idea. A screenshot of the Starmac-ros-package in action is included below.

3.2 Communications: Since the vehicle has very limited onboard processing power, all computationally expensive operations are offloaded to the ground control station over 5 GHz Wi-Fi. Visual data is streamed to the GCS using the Lightweight Communications and Marshalling (LCM) protocol, which allows low-latency multi-process communication between the UAV and the GCS.

3.3 Power Management System: The vehicle is powered by a 5000 mAh lithium polymer (LiPo) battery. Power from the battery runs through the kill switch to the power distribution hub, to which one mini DC-DC voltage stabilizer and four ESCs are connected. The voltage stabilizer delivers stable power to the Intel Atom board, the Kinect, and the LIDAR.

4. OPERATIONS

4.1 Flight Preparations: Due to the complexity of the UAV, additional steps must be taken to ensure that it is ready to fly. First, confirm that all components are undamaged and connected properly, so the UAV will not fail from a wire coming loose during flight. Next, check the battery to ensure that the UAV can operate for at least 12 minutes. Last, test the kill switch to ensure that it is operational. If the vehicle passes these three tests, it is safe to fly. A pre-run test flight will also be executed to check the mission readiness of the UAV.
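One way to sanity-check the battery endurance requirement before flight is a back-of-the-envelope calculation like the one below. The average current draw and usable-capacity fraction are illustrative assumptions, not values measured by the team.

```python
# Rough endurance check for the pre-flight battery step. The 5000 mAh
# capacity comes from Section 3.3; the 18 A average hover draw and the
# 0.8 usable fraction are hypothetical placeholders.

def endurance_minutes(capacity_mah: float, avg_current_a: float,
                      usable_fraction: float = 0.8) -> float:
    """Estimated flight time if the pack averages avg_current_a amps.

    usable_fraction reserves capacity so the LiPo is never fully drained.
    """
    usable_ah = capacity_mah / 1000.0 * usable_fraction
    return usable_ah / avg_current_a * 60.0

est = endurance_minutes(5000, 18)
print(round(est, 1))
assert est >= 12.0   # must clear the 12-minute operating requirement
```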
By testing the UAV's functions on a smaller scale, we obtain a better idea of what needs improvement or fixing. Below is an example of a checklist the team may use for flight preparations:

1. Inventory of parts: are they all undamaged and connected securely?
2. Is the battery fully charged?
3. Is the kill switch operational?
4. Was the pre-run test flight successful?
   a. Were all the TLD software models successfully loaded?
   b. Can the UAV detect images of the sign and flash drive?
   c. Is the vehicle communicating with the command center?
      i. Is the command center receiving data from the vehicle's two cameras, LIDAR, Kinect, and ultrasonic sensor?
      ii. Is the UAV responding correctly to data returned from the command center?
   d. Are the retrieval and drop mechanisms working correctly?

4.2 Man/Machine Interface: Once the vehicle is connected to the wireless network, we use SSH to remotely control the onboard system. The vehicle can be configured to fly autonomously or manually; an Xbox game controller is used for manual flight. Video streams from the two cameras, a 2-D map, and a point cloud stream from the Kinect are displayed on screen for flight monitoring.

5. RISK REDUCTION

5.1 Shock/Vibration Isolation: The carbon fiber frame greatly reduces concerns about vibration. Sensitive components such as the Openpilot CC3D board (IMU) are attached to the main frame with rubber washers. Memory foam joins the flexible carbon fiber landing gear to the main frame, serving as overdamped dampers that help prevent unwanted bouncing on landing. The laser-cut acrylic motor mounting brackets also act as fail-safe points during a collision: the brittle acrylic mounts snap and disperse the energy to prevent serious damage to the motor or the frame.

5.2 Electromagnetic Interference (EMI)/Radio Frequency Interference (RFI) Solutions: We investigated using analog transmission to stream video and found that EMI from the vehicle and the environment can greatly degrade video quality. Switching to digital wireless transmission eliminates that problem.
The Linksys E4200v2 dual-band router and the mini PCI dual-band Wi-Fi adapter (450 Mbps) on the Atom board are used to establish the 5 GHz Wi-Fi connection, which satisfies our needs for both bandwidth and coverage. Radio frequency interference can seriously degrade the stability of the wireless connection and ultimately jeopardize the whole mission. We expect multiple 5 GHz Wi-Fi routers in the competition arena, so before flight we will scan the 5 GHz band to find the channel that overlaps least with any other. Additionally, depending on the local performance of the router in the arena, we may deploy additional signal repeaters to amplify the signal.
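The pre-flight channel survey just described amounts to counting competing networks per channel and picking the least crowded one. A minimal sketch, assuming the scan results have already been collected by some external tool (the channel list and scan data below are illustrative):

```python
# Hedged sketch: given the 5 GHz channels observed in a venue scan (one
# entry per competing network, however the scan was obtained), choose the
# candidate channel with the fewest overlapping networks.
from collections import Counter

# Common non-DFS 5 GHz channels; the venue's usable set may differ.
CANDIDATE_CHANNELS = [36, 40, 44, 48, 149, 153, 157, 161]

def least_crowded(scan_results, candidates=CANDIDATE_CHANNELS):
    """scan_results: iterable of channel numbers seen in the scan."""
    usage = Counter(scan_results)
    # Fewest competing networks wins; ties break to the lowest channel.
    return min(candidates, key=lambda ch: (usage.get(ch, 0), ch))

# Hypothetical venue scan: three networks on channel 36, one on 149.
print(least_crowded([36, 36, 36, 149]))
```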

5.3 Safety: Safety is always our utmost priority. The following safety measures are implemented during testing and development. Although some of the protection mechanisms may add too much weight for the competition itself, by then the control system should be improved enough to avoid obstacles effectively.
1. All personnel must remain behind the safety net during any flight.
2. A lightweight fence guard made of tightened heavy-duty fishing line encloses the UAV to prevent the propellers from colliding with the environment.
3. Before every flight, Openpilot requires a deliberate arming procedure to prevent accidental propeller spin-up.
4. An independently controlled kill switch can cut off power to the UAV instantly.
5. Foam padding on top of the Hokuyo LIDAR protects the sensor in case of accidental flipping.

5.4 Modeling and Simulation: 3D models of the UAV were produced using Solidworks, and part files were exported for fabrication on a CNC mill and laser cutter. Minor simulations were done in COMSOL to understand the effects of stress on the shape of the rods used in the cross frame. To simulate the data link, we streamed between virtual machines on a virtual network. The 3D model below shows the passive retrieval mechanism: the bottom plate carries adhesive and a magnet to secure the flash drive. The simulation below illustrates the movement of the mechanical lever releasing the decoy when the bottom plate is pressed. (Figures: standing by, loaded with decoy; releasing the decoy when triggered)

5.5 Testing: During data link testing, we streamed between computers on a local network. While our connections were successful, they suffered from high delays; we hope to reduce the lag in the data feed to real time by the time of the competition. While testing OpenTLD, we ran the program under various operating systems. We found that in order to let multiple OpenTLD instances access one video stream on Linux, Webcam Studio must first be used to create virtual cameras, which allows multiple OpenTLD instances to share the same webcam. With this feature, the three in-flight OpenTLD instances can search for three objects with only the two cameras installed on the vehicle.

6. CONCLUSION: Through careful research, study, development, and integration of existing mature algorithms, libraries, and equipment, the overall system the BU UAV TEAM designed to meet IARC's challenges is theoretically sound. Although we are still in the final stage of completing the full implementation, we expect the vehicle to prove capable of flying stably, tracking correctly, navigating effectively, and completing the mission successfully.

REFERENCES:

Michelson, R., "Rules for the International Aerial Robotics Competition 6th Mission," http://iarc.angel-strike.com/iarc_6th_mission_rules.pdf

Kalal, Z.; Matas, J.; Mikolajczyk, K., "P-N Learning: Bootstrapping Binary Classifiers by Structural Constraints," IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 49-56, 13-18 June 2010.

Kalal, Z.; Mikolajczyk, K.; Matas, J., "Forward-Backward Error: Automatic Detection of Tracking Failures," 20th International Conference on Pattern Recognition (ICPR), pp. 2756-2759, 23-26 Aug. 2010.

Kalal, Z.; Matas, J.; Mikolajczyk, K., "Online Learning of Robust Object Detectors During Unstable Tracking," IEEE 12th International Conference on Computer Vision Workshops (ICCV Workshops), pp. 1417-1424, Sept. 27-Oct. 4, 2009.

Bouffard, P.; Gillula, J.; Huang, H.; Vitus, M.; Tomlin, C., "Quadrotor Altitude Control and Object Avoidance," ROS Wiki, 2011, accessed 29 May 2013. <http://www.ros.org/wiki/openni/contests/ros 3D/Quadrotor Altitude Control and Obstacle Avoidance>

Chair of Automation Technology, Chemnitz University of Technology, "Autonomous Corridor Flight of a UAV," ROS Wiki, accessed 29 May 2013. <http://www.ros.org/wiki/openni/contests/ros 3D/Autonomous corridor flight of a UAV using the Kinect sensor.>

Huang, A.; Olson, E.; Moore, D., "LCM: Lightweight Communications and Marshalling," <https://code.google.com/p/lcm/>