3D Vision Based Landing Control of a Small Scale Autonomous Helicopter


Zhenyu Yu, Kenzo Nonami, Jinok Shin and Demian Celestino
Unmanned Aerial Vehicle Lab., Electronics and Mechanical Engineering, Chiba University
1-33 Yayoi-cho, Inage-ku, Chiba 263-8522, Japan
Corresponding author E-mail: yzy@graduate.chiba-u.jp; nonami@faculty.chiba-u.jp

Abstract: Autonomous landing is a challenging but important task for Unmanned Aerial Vehicles (UAVs) to achieve a high level of autonomy. The fundamental requirements for landing are knowledge of the height above the ground and a properly designed controller to govern the process. This paper presents our research results in the study of landing an autonomous helicopter. The above-the-ground height sensing is based on a 3D vision system. We have designed a simple plane-fitting method for estimating the height over the ground. The method enables vibration-free measurement with the camera rigidly attached to the helicopter, without using a complicated gimbal or active vision mechanism. The estimated height is used by the landing control loop. Considering the ground effect during landing, we have proposed a two-stage landing procedure. Two controllers are designed for the two landing stages respectively. The sensing approach and control strategy have been verified in field flight tests and have demonstrated satisfactory performance.

Keywords: Landing Control, 3D Vision, Autonomous Helicopter, Height Over Ground Estimation, Two-Stage Landing

1. Introduction

Unmanned Aerial Vehicles (UAVs) are ideal platforms for missions that are dangerous, expensive, or impossible for humans to carry out. With the capabilities of hovering and Vertical Take-off and Landing (VTOL), autonomous unmanned small-scale helicopters can be applied in such applications as surveillance, infrastructure inspection, mine detection, search and rescue, and so on. The unique hovering and VTOL flying characteristics make autonomous task scenarios such as go-search-find-take-return possible. Recent study on autonomous helicopters ranges from system modeling and identification (Mettler, B., Tischler, M.
B., & Kanade, T., 2002) and controller design (Lai, G., Fregene, K. & Wang, D., 2000; Enns, R. & Si, J., 2003; Shin, J., Nonami, K., Fujiwara, D. & Hazawa, K., 2005) to sensor integration and hardware implementation. The hovering capability has been the most studied; the VTOL capability, another important factor towards fully autonomous missions, is less addressed. In order to land a helicopter autonomously, we need to know the relative height between the helicopter and the ground. We also need a well-performing controller to smooth the landing process. In the literature, different landing strategies have been reported. The sensing schemes for landing can mainly be summarized into two groups: (1) vision only, and (2) a combination of multiple sensors. Group one measures the height and vertical velocity for landing directly by using vision only, while group two makes use of multiple sensors for the measurements; typically the sensors may include GPS, sonar, a laser range meter or a vision system. The sensing scheme in the work (Shakernia, O., Ma, Y., Koo, T.J. & Sastry, S.S., 1999) belongs to the first group. In that paper, a single camera is used to track features in a landing pad and estimate the ego-motion. The pose of the helicopter with respect to the landing pad is derived and used for landing control. The performance of this approach is presented with vision-in-the-loop simulation. The work (Kanade, T., Amidi, O. & Ke, Q., 2004) shows another vision-based motion estimation for Micro Air Vehicle (MAV) control. The papers (Saripalli, S., Montgomery, J.F. & Sukhatme, G.S., 2003) and (Nakamura, S., Kataoka, K. & Sugeno, M., 2001) present studies of landing control with multiple sensors, which fall into the group two sensing scheme. The first paper uses GPS and a sonar sensor for height measurement, and a camera is used for searching for the landing pad. The vision system is not used directly for height measurement but instead for target recognition and localization. The second paper presents a similar approach where an active vision system is used for finding the landing pad.
The height measurement is by using differential GPS. Our sensing strategy belongs to group one. The difference between our approach and the above-mentioned approaches lies in the fact that:

International Journal of Advanced Robotic Systems, Vol. 4, No. 1 (2007), ISSN 1729-8806, pp. 51-56

(1) we use a binocular 3D vision system, which measures range without relying on any predefined landmark; (2) the vibration-free height measurement is derived from the range map by the proposed plane-fitting algorithm; (3) the vertical velocity is estimated by fusing the height measurement with the acceleration measurement from an Inertial Measurement Unit (IMU). With this sensing strategy, we propose a two-stage landing control procedure to address the dynamics variation and the difference in performance requirements between high altitude and low altitude. The 3D vision based landing control approach has been verified in our field experimental study.

The paper is organized as follows. Section 2 describes the helicopter testbed and system architecture. Section 3 shows the application of 3D vision to determine the height over a flat ground. Section 4 describes the two-stage controller design for landing. Section 5 presents our field experimental results, and Section 6 gives our conclusion and future work.

2. Testbed and system architecture

In this research we use an SF40 model RC helicopter as our flying platform. It is powered by a two-stroke gasoline engine with a maximum payload of 8.5 kg. A picture of the helicopter is shown in Fig. 1 and the detailed specifications are listed in Table 1.

Fig. 1. Helicopter platform: SF40 model helicopter

Engine: 2-stroke, gasoline, 40 cc
Fuselage dimension: 1467 mm x 245 mm
Main rotor diameter: 1790 mm
Tail rotor diameter: 273 mm
Max. payload: about 8.5 kg
Table 1. Specification of SF40 model helicopter

The system consists of a ground processing part and an onboard sensing part. The ground part is responsible for executing the control law and sending control commands to the helicopter; it also provides the user interface to the operator. The onboard part is responsible for sensing the helicopter states necessary for control purposes. The sensors are interfaced via an onboard SBC (Single Board Computer). Sensor readings are forwarded to the ground computer for calculating the actuating command according to the control law. Then the command is sent to the helicopter to control the movement of the servomotors. Fig. 2 shows an architectural view of the system.

Fig. 2. System architecture

The onboard sensors we used are all commercial off-the-shelf (COTS) products. The sensors equipped are a Bumblebee binocular 3D vision camera and an Attitude Heading Reference System (AHRS). An onboard Pentium III class SBC is used to run the vision processing algorithm and to interface between the sensors and the host computer. The Bumblebee camera and the AHRS are used to measure the vertical position and velocity during the landing control. The camera is mounted on the helicopter with its lens facing directly down to the ground. Table 2 and Table 3 list the primary parameters of the Bumblebee camera and the AHRS respectively.

Baseline: 12 cm
Resolution: 1024 x 768
Max. frame rate: 15 fps
Field of view: 50°
Interface: IEEE 1394
Weight: 375 g
Table 2. Vision system specification

Yaw range: 360°
Pitch range: 180°
Roll range: 360°
Acceleration range: 10 G
Update rate: >60 Hz
Interface: RS232
Weight: 770 g
Table 3. AHRS specification

3. Height Estimation with 3D Vision

Vision systems have become popular in the field of robot control for their passive nature and their ability to perceive the environment. Application of a vision system for such purposes generally requires multidimensional signal

processing techniques to extract useful information from the redundant vision data. In addition, a small-scale helicopter manifests unstable dynamics and requires constant attention to keep it stable. The control interaction, together with the engine vibration, makes it necessary to handle the vibration-induced noise when applying a vision system on a small-scale helicopter. In our application, the 3D vision camera is fixed on the helicopter with the lens looking downward. The camera vibrates with the helicopter, and the vibration-induced measurement error gets bigger as the helicopter flies higher. We could minimize the vibration effect by using a gimbal mechanism to keep the camera from vibrating with the helicopter, or by compensating the measurement with the attitude information from an attitude sensor. But either way would complicate the system design and burden the limited payload/space budget. We solve the problem in a soft way by developing an algorithm to get a vibration-free height estimate without adding any hardware support.

The height estimation is to decide the height of the helicopter above the ground. The height measurement comes from the Bumblebee 3D camera. The vision system performs Sum of Absolute Differences (SAD) correlation between the left and right images to derive the depth information and outputs the depth image as the result. The size of the depth image is configurable; in our case, we configure the dimension of the depth image to be 120 pixels (height) by 160 pixels (width). A typical depth image generated by the vision system is shown in Fig. 3.

Fig. 3. Depth image from Bumblebee camera

At each sample instance, we receive a depth image from the Bumblebee camera which contains all measurements in the area that the camera can see. This kind of high-dimensional data cannot be directly used by the controller. In this study, we propose to use a plane-fitting method to map the high-dimensional data into a one-dimensional measurement. In developing the algorithm, we assume that the ground surface is relatively flat, so the plane approximation will not cause significant error. The method is stated as follows. First we apply the Least Mean Square (LMS) technique to fit the depth image data with a plane. Then we compute the perpendicular distance from the center of gravity of the helicopter to the plane. This value is used as the height estimate. We perform the calculation in the camera's frame. Since the Euclidean distance is invariant to rotation, the estimate is immune to the attitude change of the helicopter.

The definition of the camera's body coordinate frame is shown in Fig. 4. The z-axis is towards the ground. The y-axis is perpendicular to the z-axis and towards the front. The x-axis points into the paper, which forms the right-hand-rule body coordinate frame.

Fig. 4. The camera mounting configuration and the camera body frame definition

In the camera coordinate system we can express the plane by Eq. (1),

n^T p = 0    (1)

where n = [a b c d]^T is the plane parameter vector and p = [x y z 1]^T is the homogeneous coordinate of a point on the plane. The plane parameter n can be solved by minimizing Eq. (2),

f(n) = ||P n||^2,   P = [x_1 y_1 z_1 1; x_2 y_2 z_2 1; ...; x_n y_n z_n 1]    (2)

where P contains the point data in view used for plane fitting. The solution for n is the eigenvector of P^T P associated with the least eigenvalue. Once the plane equation (1) is determined, we can calculate the perpendicular distance from the C.G. (Center of Gravity) point of the helicopter to the plane:

h = |a x_c + b y_c + c z_c + d| / sqrt(a^2 + b^2 + c^2)    (3)
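As a concrete illustration of the plane-fitting estimate in Eqs. (1)-(3), the following NumPy sketch fits a plane to synthetic depth points and computes the perpendicular distance from a C.G. point. It is a minimal illustration under assumed data, not the onboard implementation; the 5x5 grid of points and the C.G. coordinates are made up.

```python
import numpy as np

def fit_plane(points):
    """Fit plane parameters n = [a, b, c, d] to Nx3 points, as in Eq. (2).

    P stacks homogeneous rows [x, y, z, 1]; minimizing ||P n||^2 subject to
    ||n|| = 1 gives the eigenvector of P^T P with the least eigenvalue.
    """
    P = np.hstack([points, np.ones((len(points), 1))])
    eigvals, eigvecs = np.linalg.eigh(P.T @ P)  # eigenvalues in ascending order
    return eigvecs[:, 0]                        # eigenvector of least eigenvalue

def height_above_plane(n, cg):
    """Perpendicular distance from the C.G. point to the plane, Eq. (3)."""
    a, b, c, d = n
    x_c, y_c, z_c = cg
    return abs(a * x_c + b * y_c + c * z_c + d) / np.sqrt(a**2 + b**2 + c**2)

# Synthetic depth data: a 5x5 grid of points on the plane z = 2 in the camera
# frame (z towards the ground), with the helicopter C.G. taken at the origin.
pts = np.array([[x, y, 2.0] for x in range(-2, 3) for y in range(-2, 3)])
n = fit_plane(pts)
h = height_above_plane(n, (0.0, 0.0, 0.0))  # recovers the height of 2.0
```

Because Eq. (3) divides by sqrt(a^2 + b^2 + c^2), the scale ambiguity of the eigenvector drops out; and since Euclidean distance is rotation invariant, the estimate is unaffected by the helicopter's attitude, as noted above.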

where (x_c, y_c, z_c) is the coordinate of the helicopter C.G. point.

The accuracy of the measurement is evaluated by comparing with a NovAtel RTK-2 (Real Time Kinematic) GPS which has a resolution of 1 cm. The comparison result is shown in Fig. 5.

Fig. 5. Height measurement comparison: Vision and GPS

The data is gathered in real flight with the GPS collocated with the 3D vision system. Excepting the time from about 88 seconds, the measurement accuracy of the vision system is comparable to that of the GPS. The mismatch is because the GPS lost differential compensation and degraded to standalone working mode; during that time, we can see the vision still provides the right height measurement. The vertical velocity is estimated by using a steady-state Kalman filter. The filter design is based on the kinematic relationship between acceleration and position, which is shown in Eq. (4),

z''(t) = a_z(t)    (4)

where z(t) represents the height and a_z(t) is the vertical acceleration. The acceleration can be measured by the AHRS sensor.

4. Two-Stage Landing

Landing is to descend from some arbitrary height till contacting the ground. This process can be divided into two phases: the descending phase and the landing phase. We define the descending phase as the process before the helicopter's altitude gets lower than a specified height. The landing phase refers to the process in which the helicopter descends from the specified height till touching the ground. The definition is illustrated in Fig. 6. In this study, the specified height is chosen to be 2 meters; at this height the helicopter is just free from the ground effect.

Fig. 6. Definition of two landing phases

The reason for identifying the landing process as two phases lies in the fact that the working conditions and control requirements of the two phases are different. In the descending phase, the helicopter flies at a relatively higher altitude where the helicopter is away from the ground effect but may experience stronger wind. For the controller design, we expect that the controlled system can respond fast and can tolerate wind disturbance robustly. In the landing phase, the helicopter flies near the ground where the ground effect is significant. The dynamics in this phase may differ from that of the descending phase. Undershoot, overshoot or rebounding are undesirable. For safe landing, we also expect some management support in the touch-down phase; for example, we want the engine to be stopped or kept in idle state as soon as the helicopter touches the ground. This kind of function is not required in the descending phase. The two-phase treatment gives us flexibility in designing controllers to satisfy the performance requirements in the different phases without compromise. We have designed two controllers that are responsible for the two phases respectively.

4.1. Control architecture

Our landing control is planned in a 3-layered framework as shown in Fig. 7. The top layer is the landing planner. The middle layer is the landing coordinator. The low layer is the two-stage controller which controls the landing dynamics directly. The landing planner makes the landing decision and instructs the landing coordinator to execute the landing task. Currently this function layer is not implemented yet; we are working to enable the planner functionality by using vision data to find a safe landing site. The landing coordinator coordinates the two low-layer landing controllers by sending reference commands and transferring control authority from the stage 1 controller to the stage 2 controller at the proper time. The current implementation of the coordinator is based on the response time of the stage 1 controller and the difference between the actual system response and the commanded reference. To be specific, if the time passed since landing start is longer than the response time of the stage 1 controller, and the error between the actual states and the commanded target has been less than some threshold for a specified time, the control authority is transferred to the stage 2 controller. The two

stage landing controllers directly handle the landing dynamics. The dynamic model and controller design are described in the next two subsections.

Fig. 7. Three-layer control architecture

4.2. Dynamics of vertical motion

When ignoring the cross couplings, the vertical motion of the helicopter can be simplified as a rigid body with the gravitational force and thrust acting on it. The thrust generated by the main rotor can be expressed by Eq. (5),

T = (b/4) ρ a Ω^2 R^3 c (θ + φ)    (5)

where the symbol meanings are listed in Table 4.

b: Blade number
ρ: Air density
a: Lift coefficient
Ω: Main rotor speed
θ: Collective pitch angle
φ: Air inflow angle
c: Blade chord length
R: Main rotor radius
Table 4. Parameter definitions of Eq. (5)

In the experiments, the rotor speed Ω is kept constant by a governor. The thrust generated by the main rotor is thus proportional to the sum of the collective pitch angle and the inflow angle. The inflow angle is not controllable. By ignoring this term, we arrive at a simple linear relationship between the thrust and the collective pitch angle. The effect of the ignored term φ is treated as a disturbance and is to be compensated by the controller. Linearizing around the hovering state and considering the transportation delay, we get the transfer function from collective pitch to vertical displacement (Hazawa, K., Shin, J., Fujiwara, D., Igarashi, K., Fernando, D. & Nonami, K., 2003),

G(s) = (K / s^2) e^{-t_d s}    (6)

where K is a constant identified from experimental data, and t_d is the transportation delay.

4.3. Controller design

The controllers are designed based on the linearized model in Eq. (6); however, a different K is used to design the stage 1 and stage 2 controllers. We apply Linear Quadratic optimal control theory for the design. Two LQI (Linear Quadratic with Integral) controllers are designed for landing phase 1 and phase 2 respectively. The integral action is for rejecting constant disturbances and achieving zero steady-state tracking error. Linear Quadratic control offers a systematic procedure to find the feedback gain by minimizing the integral of the weighted plant states and plant input,

J = ∫_0^∞ ( x^T(t) Q x(t) + u^T(t) R u(t) ) dt    (7)

where Q > 0 and R > 0 are weighting matrices, x(t) is the plant state and u(t) is the plant input. A key step is to select the state weighting matrix Q and the controller output weighting matrix R; the performance specification is encoded in the weighting matrices. In our design, we initially choose the weighting matrices based on the maximum allowed variation ranges of the states and the controller output. Then the controllers are refined through experiments.

5. Experimental Results

The experimental results of the landing control are shown in Fig. 8 to Fig. 10. Fig. 8 shows the height trajectory during landing, Fig. 9 shows the vertical velocity and Fig. 10 shows the controller output.

Fig. 8. Height trajectory during landing

Fig. 9. Vertical velocity during landing
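The LQI synthesis of Section 4.3 can be sketched as below. This is an illustration only: the transportation delay of Eq. (6) is neglected for gain computation, and the plant gain K_p and the weights Q, R are assumed placeholder values, not the parameters identified in the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Delay-free double integrator from Eq. (6): z_dot = v, v_dot = K_p * u,
# augmented with the integral of the height error to obtain the LQI structure.
K_p = 1.0  # assumed plant gain (identified from flight data in the paper)
A = np.array([[0.0, 1.0, 0.0],    # height z
              [0.0, 0.0, 0.0],    # vertical velocity v
              [1.0, 0.0, 0.0]])   # integral-of-height-error state
B = np.array([[0.0], [K_p], [0.0]])

# Weights encode the performance specification of Eq. (7): larger Q entries
# penalize state deviation, larger R penalizes collective-pitch activity.
Q = np.diag([10.0, 1.0, 0.5])
R = np.array([[1.0]])

P = solve_continuous_are(A, B, Q, R)  # solve the algebraic Riccati equation
F = np.linalg.solve(R, B.T @ P)       # optimal state-feedback gain, u = -F x
poles = np.linalg.eigvals(A - B @ F)  # closed-loop poles (all in left half plane)
```

In practice, the two stage controllers would reuse this procedure with their respective identified K and their own Q, R choices, then be refined in flight tests as described above.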

Fig. 10. Controller output during landing

From Fig. 8 and Fig. 9 we can see the smooth landing process. The stage 1 controller is commanded by the landing coordinator at time 0 to descend till 2 meters above the ground. At about 66 seconds, the control authority is transferred to the stage 2 controller. At about 76 seconds, the helicopter touched the ground. Around that time, we can see the vision measurement is quite noisy. This is because of the limited nearest measurable range of the 3D vision system: in our case, the 3D vision system cannot measure distances nearer than 30 cm. The noisy height data cannot be used for control; instead we integrate the vertical acceleration twice for the height measurement when the vision system cannot measure. Fig. 10 shows the controller output, which is the pulse command sent to the servomotor responsible for changing the collective pitch. The ramp-like control data starting from 76 seconds is to decrease the collective pitch and move the helicopter away from neutral balance.

6. Conclusion and Future Work

In this study, we have presented a 3D vision based approach for landing control of an unmanned helicopter. We have designed a plane-fitting method for height estimation; the method is insensitive to the attitude change of the helicopter. For a smooth and stable landing, we proposed a two-stage landing strategy, which constitutes the low control layer in our 3-layered landing control framework. The two-stage landing separates the landing process into a descending phase and a landing phase, and two controllers are deployed to address the different requirements in the two phases. The effectiveness of the proposed 3D vision based over-ground height estimation method and the two-stage landing strategy has been verified in field flight tests. We have successfully landed the helicopter in experiments. Our future work will focus on enabling the helicopter to land in an unknown environment autonomously. We will develop a safe-area detection algorithm to find safe landing sites. This self-landing-site locating capability will become the core of the landing planner in the 3-layered landing control framework.

7. References

Enns, R. & Si, J. (2003). Helicopter Trimming and Tracking Control Using Direct Neural Dynamic Programming, IEEE Transactions on Neural Networks, Vol. 14, No. 4, pp. 929-939.

Hazawa, K., Shin, J., Fujiwara, D., Igarashi, K., Fernando, D. & Nonami, K. (2003). Autonomous Flight Control of Unmanned Small Hobby-Class Helicopter. Journal of Robotics and Mechatronics, Japan, Vol. 15, No. 5, pp. 546-554.

Lai, G., Fregene, K. & Wang, D. (2000). A Control Structure for Autonomous Model Helicopter Navigation. In Proc. IEEE Canadian Conf. Electrical and Computer Engineering, pp. 103-107, Halifax, NS, 2000.

Mettler, B., Tischler, M. B., & Kanade, T. (2002). System Identification Modeling of a Small-Scale Unmanned Rotorcraft for Flight Control Design, Journal of the American Helicopter Society, January, pp. 50-63.

Nakamura, S., Kataoka, K. & Sugeno, M. (2001). A Study on Autonomous Landing of an Unmanned Helicopter Using Active Vision and GPS. The Journal of the Robotics Society of Japan, Vol. 18, No. 2, pp. 252-260.

Saripalli, S., Montgomery, J.F. & Sukhatme, G.S. (2003). Visually guided landing of an unmanned aerial vehicle. IEEE Transactions on Robotics and Automation, Vol. 19, No. 3, pp. 371-381.

Shakernia, O., Ma, Y., Koo, T.J. & Sastry, S.S. (1999). Landing an Unmanned Aerial Vehicle: Vision Based Motion Estimation and Nonlinear Control. Asian Journal of Control, Vol. 1, No. 3, pp. 128-145.

Shin, J., Nonami, K., Fujiwara, D. & Hazawa, K. (2005). Model-based optimal attitude and positioning control of small-scale unmanned helicopter. Robotica, Vol. 23, pp. 51-63.