PLEASE NOTE! THIS IS A SELF-ARCHIVED VERSION OF THE ORIGINAL ARTICLE

To cite this Article: Rajamäki, J. (2016) Kinetic Controlled Flying of Micro Air Vehicles (MAV) for Public Protection and Disaster Relief (PPDR). In Prof. Valeri Mladenov (Ed.) Proceedings of the 10th International Conference on Circuits, Systems, Signal and Telecommunications (CSST '16). United States: WSEAS Press, 47-52.
URL: http://www.wseas.us/e-library/conferences/2016/barcelona/csst/csst-05.pdf

Laurea University of Applied Sciences, Ratatie 22, 01300 Vantaa, Finland
Phone +358 (0)9 8868 7150, Fax +358 (0)9 8868 7200
firstname.surname@laurea.fi, www.laurea.fi
Business ID 1046216-1, Domicile Vantaa

Kinetic Controlled Flying of Micro Air Vehicles (MAV) for Public Protection and Disaster Relief (PPDR)

JYRI RAJAMÄKI
Research, Design and Innovations
Laurea University of Applied Sciences
Vanha maantie 9, FI-02650 Espoo
FINLAND
http://www.laurea.fi

Abstract: We present a hands-free kinetic control method for flying micro air vehicles (MAVs) or small unmanned aerial vehicles (UAVs) for public protection and disaster relief (PPDR). The system combines a 3D depth-sensing camera with a low-cost drone and is based on Delicode Ltd's NI mate™ software toolkit. In a study with 10 participants, the kinetic control method was tested and compared with the drone's normal control mode: the time to fly a given path inside a building and the path accuracy at checkpoints were compared between the two modes. The learning rate was found to be better with the kinetic control method than with normal control. After five attempts, overall flying speed and path accuracy were still poorer with the kinetic control method than with the conventional hands-on method. However, hands-free control offers several significant benefits for PPDR applications.

Keywords: Kinetic, control, input, robot, mobile, drone, micro air vehicle (MAV), unmanned aerial vehicle (UAV), public protection and disaster relief (PPDR).

1 Introduction

Public protection and disaster relief (PPDR) responders, such as law enforcement officers and search and rescue personnel, have diverse needs to acquire information in order to build accurate real-time situational awareness. PPDR responders often find themselves in situations where they would benefit from having "eyes in the sky". Solutions could be provided by small unmanned aerial vehicles (UAVs) or micro air vehicles (MAVs), which are relatively inexpensive to purchase and operable without long and costly specialized training.
These traits would make such services more readily available to a wide range of local civil authorities, such as police, fire and rescue, customs, and border control [8].

1.1 Micro Air Vehicles

The aerodynamics of MAVs is very different from that of conventional, larger aircraft. The Reynolds number, a key parameter of aircraft design [2], of MAVs is low compared to that of conventional aircraft (see Figure 1). Because the vast majority of aviation research has been conducted in the flight regime of high Reynolds numbers, designing and operating MAVs presents a significant aerospace engineering challenge, and the unique aerodynamics of MAVs requires further investigation [2]. With different aerodynamics, the control of MAVs also differs from that of conventional aircraft.

ISBN: 978-1-61804-366-5

2 Original Prototype and Earlier Studies

Natural control of MAVs and UAVs is likely to emerge from research into fusing several human input and output modalities. The fusion may occur between sensor data retrieved from, e.g., eye, hand, head and facial muscle movements [1].

2.1 Parrot AR Drone

The AR Drone (see Figure 2) is a radio-controlled flying quadrotor helicopter built by the French company Parrot. It is designed to be controlled with iOS devices, such as the iPhone, iPad, or iPod Touch, as shown in Figure 3. Today, official apps are also available for Android devices, and unofficial apps for Samsung Bada and Symbian devices [9]. The Parrot AR Drone was introduced at the International Consumer Electronics Show (CES) Las Vegas in 2010, and the AR Drone 2.0 was unveiled at CES Las Vegas 2012. This small UAV weighs 420 grams. It features an HD 720p front camera and further sensors, such as a pressure sensor working as an altimeter. Communication with the base station is via WiFi. Commands are transmitted to the robot every 100 milliseconds, thus continuously updating the navigation instructions [3].

Figure 1. Classes of flying vehicles arranged by mass and characteristic Reynolds number, adapted from Mueller [5].

Figure 3. AR.FreeFlight control application for iPhone [3]
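The 100-millisecond command cycle described above can be illustrated with a short sketch. The AR Drone accepts plain-text AT commands over UDP (port 5556), with floating-point arguments encoded as the signed 32-bit integer sharing their IEEE-754 bit pattern; this follows Parrot's publicly documented protocol rather than anything in the paper, and the host address and tilt magnitude below are illustrative.

```python
import socket
import struct
import time

def f2i(value: float) -> int:
    """Encode a float as the signed 32-bit integer sharing its IEEE-754 bit
    pattern, as the AR Drone AT command protocol expects."""
    return struct.unpack("<i", struct.pack("<f", value))[0]

def pcmd(seq: int, roll: float, pitch: float, gaz: float, yaw: float) -> bytes:
    """Build an AT*PCMD progressive-movement command (flag 1 enables the tilt
    arguments); seq is a monotonically increasing sequence number."""
    return (f"AT*PCMD={seq},1,"
            f"{f2i(roll)},{f2i(pitch)},{f2i(gaz)},{f2i(yaw)}\r").encode()

def fly_forward(host: str = "192.168.1.1", port: int = 5556,
                duration_s: float = 1.0) -> None:
    """Send a gentle forward-pitch command every 100 ms, matching the paper's
    stated update rate."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    seq = 1
    end = time.time() + duration_s
    while time.time() < end:
        # Negative pitch tilts the nose down, moving the drone forward.
        sock.sendto(pcmd(seq, 0.0, -0.1, 0.0, 0.0), (host, port))
        seq += 1
        time.sleep(0.1)  # the drone expects a fresh command roughly every 100 ms
```

If commands stop arriving, the drone hovers in place; this watchdog behaviour is why the 100 ms refresh loop matters.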

2.2 Demo of Gaze-Controlled Flying

Alapetite, Hansen and MacKenzie [1] developed a demo based on the Parrot AR Drone. The intent of the demo was to present a proof of concept for an "eye in the sky" that can be controlled and manipulated intuitively by gaze. A display in the control room allows the user to perceive the visual information acquired by the camera embedded on the drone, and below the display monitor there is a gaze-tracking unit. With this setup, a fly-where-you-look control principle is investigated. Their approach relies on a direct feedback loop with no visible interface components displayed: the point of regard on the screen is used directly as the user, situated in the control room, observes the streaming video, continuously adjusting the locomotion of the UAV.

Figure 2. AR Drone 2.0

3 The Design Process

3.1 3D Depth-Sensing Camera together with a Small UAV

Today, cameras can sense depth and track movements; motion-sensing games are the best-known application of this technology. 3D depth-sensing cameras are getting smaller and smaller, and many embedded and mobile designs have lately been released. An additional advantage of 3D depth sensing is that such a camera, being infrared-based, works accurately in darkness. Our demo was first introduced at SNUC 2013 (the Secure communications Network operators and Users Conference) in France on February 26th and 27th, 2013. It is based on Delicode Ltd's NI mate™ software toolkit. Delicode is a start-up company based in Helsinki that specializes in designing and developing software and user interfaces for these novel sensor technologies. Working together with Cassidian Finland Ltd and Laurea University of Applied Sciences, Delicode designed applications combining a 3D depth-sensing camera with MAVs.

3.2 Kinect for Windows

Kinect for Windows is a motion-sensing input device by Microsoft. It was first developed for the Xbox 360 video game console.
Based around a webcam-style add-on peripheral for the Xbox 360 console, it enables users to control and interact with the Xbox 360 without the need to touch a game controller, through a natural user interface using gestures and spoken commands. Today, Kinect technology is open to several operating systems, and the Kinect for Windows sensor and software development kit (SDK) are available. Kinect sensors and the SDK offer a development platform for several end-user experiences, with the potential to transform how people interact with computers in multiple industries, including education, healthcare, retail, transportation, and beyond. [4]

3.3 Processing

According to the processing.org web pages [7]: "Processing is an open source programming language and environment for people who want to create images, animations, and interactions. Initially developed to serve as a software sketchbook and to teach fundamentals of computer programming within a visual context, Processing also has evolved into a tool for generating finished professional work. Today, there are tens of thousands of students, artists, designers, researchers, and hobbyists who use Processing for learning, prototyping, and production."

3.4 NI Mate

NI (Natural Interaction) mate is the flagship product of Delicode Ltd. It is small but powerful software that takes real-time motion-capture data from an OpenNI-compliant device, such as the Kinect for Windows, Asus Xtion or PrimeSense Carmine, and turns it into two industry-standard protocols: OSC

(Open Sound Control) and MIDI (Musical Instrument Digital Interface). Because NI mate is available for Windows, Mac OS X and Ubuntu Linux, it offers easy installation and a user-friendly configuration interface, and the standard protocols for its output make NI mate a flexible piece of software. NI mate is available for Windows machines through the OpenNI Arena, and a new version of NI mate operates, e.g., with Kinect for Windows devices. To make the device work with NI mate, we installed the Kinect for Windows runtime drivers and SDK from Microsoft. The minimum recommended specifications for a computer running NI mate are a 32-bit (x86) or 64-bit (x64) dual-core 2.66 GHz or faster processor, a dedicated USB 2.0 bus and at least 2 GB of RAM. The minimum space required for capturing the full armature in NI mate is about 2x4 metres, while the maximum space a sensor device such as the Microsoft Kinect is able to track is about 3x6 metres. [8]

4 Evaluating the Interaction

Given the above control mechanisms, an evaluation of our new control method was carried out. Our goal was to investigate whether the change of control method results in improved interaction.

4.1 Participants

Ten unpaid volunteer participants (4 female) were recruited from the local university campus. Participants ranged from 21 to 51 years of age (mean = 29, SD = 8.8). None were daily users of MAVs or UAVs, and participants had no prior experience with the system.

4.2 Apparatus

The equipment flown was the AR Drone 2.0 together with a 3D depth-sensing camera. For the hands-on control mode, we used an Apple iPhone 5 64GB multimedia phone with the AR.FreeFlight control application. For the kinetic control mode, we used an HP ProBook 4545s 15.6" HD/A4-4300M/4 GB/500 GB/Windows 8 Pro 64-bit personal computer with the Kinect for Windows tools and the NI mate™ software toolkit.
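The OSC output of the NI mate toolkit used here can be consumed by any OSC-capable client listening on UDP. As a minimal, standard-library-only sketch of what such a client has to do, the following parses a simple OSC message (address plus float32 arguments); the joint address `/Right_Hand` and the three-coordinate payload are assumptions for illustration, not NI mate's documented message layout.

```python
import struct

def _read_padded_string(data: bytes, offset: int):
    """Read a null-terminated OSC string padded to a 4-byte boundary."""
    end = data.index(b"\x00", offset)
    text = data[offset:end].decode("ascii")
    offset = end + 1
    offset += (-offset) % 4  # skip padding to the next 4-byte boundary
    return text, offset

def parse_osc_message(data: bytes):
    """Parse an OSC message carrying float32 arguments into (address, [floats])."""
    address, offset = _read_padded_string(data, 0)
    typetags, offset = _read_padded_string(data, offset)
    args = []
    for tag in typetags.lstrip(","):
        if tag == "f":  # 32-bit big-endian float
            (value,) = struct.unpack_from(">f", data, offset)
            args.append(value)
            offset += 4
        else:
            raise ValueError(f"unhandled OSC type tag: {tag}")
    return address, args

def build_osc_message(address: str, floats):
    """Build an OSC message (useful for testing the parser round-trip)."""
    def pad(b: bytes) -> bytes:
        # OSC strings always end in 1-4 null bytes to reach a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4 if len(b) % 4 else 4)
    out = pad(address.encode("ascii")) + pad(("," + "f" * len(floats)).encode("ascii"))
    for v in floats:
        out += struct.pack(">f", v)
    return out
```

For example, a hypothetical skeleton message `build_osc_message("/Right_Hand", [0.5, 1.0, 2.0])` round-trips through `parse_osc_message` back to the same address and coordinates.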
The kinetic control method was tested against the original hands-on interaction method, shown in Figure 3, which served as the baseline for comparison.

4.3 Procedure

The experiment was performed in a hall of a university building. Prior to data collection, participants completed a pre-test questionnaire soliciting demographic data. The experiment began with a training session, which involved flying the drone twice from a starting point about 10 metres straight away and then back, using both control modes. The goal was to familiarize participants with flying the drone. Training was followed by flying a path, marked beforehand, five times with each control method. Half of the participants used the kinetic control method first; the other half started with the conventional hands-on method. Participants were asked to fly the path "as quickly and accurately as possible"; it was marked by tape on the floor and by poles with flags. The 15-metre path (one direction) started from a marked spot on the floor. After take-off, the drone was first flown along the path in one direction, landing at another marked spot on the floor; it was then flown back along the same path and landed at the starting spot, so the total distance flown was 30 metres. Accuracy was measured at both landings as the distance from the middle of the spot to the nearest point of the drone.

Figure 4. The path to be flown

Figure 5. Alternative paths

4.4 Design

Figure 4 illustrates the path to be flown, and Figure 5 shows the alternative acceptable ways to fly it. The independent variable is the control mode. The dependent variables are flying speed and landing inaccuracy.

5 Results and Discussion

5.1 Flying Speed

The results for flying speed are shown in Figures 6 and 7. The overall mean flying time was 46.9 s with kinetic control and 20.1 s with traditional hands-on control, equivalent to flying speeds of 0.64 m/s and 1.49 m/s, respectively. As expected, flying speed increased significantly across trials, as shown in Figure 7. The learning effect was larger with the kinetic control method, and it appears that the kinetic control method would catch up with the flying speed of the hands-on mode after about 10 trials, as illustrated in Figure 8. Because of the learning effect and the low number of trials, the statistical significance of the flying speed difference between the control methods could not be tested.

Figure 6. Mean rate of flying speed by control mode

Figure 7. Flying speed (m/s) by control mode and trial
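The reported speeds follow directly from the 30-metre course length and the mean flying times; a one-line check:

```python
PATH_LENGTH_M = 30.0  # 15 m out and 15 m back, as described in the procedure

def flying_speed(total_time_s: float) -> float:
    """Mean flying speed (m/s) over the 30-metre course."""
    return PATH_LENGTH_M / total_time_s

# Reproduce the paper's overall figures from the reported mean times.
kinetic_speed = flying_speed(46.9)   # ≈ 0.64 m/s
hands_on_speed = flying_speed(20.1)  # ≈ 1.49 m/s
```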
5.2 Landing Inaccuracy

Every trial had two landings (one in the middle of the trial, another at the end). The results for landing inaccuracy are shown in Figures 9 and 10. The overall inaccuracy was 1.65 metres: the mean landing inaccuracy was 2.52 metres with kinetic control and 0.77 metres with traditional hands-on control. This means that the kinetic control method was 2.9 times less accurate than the traditional hands-on method. The difference was statistically significant (F(1,9) = 222, p < .0001). However, landing accuracy also showed a large learning effect, as Figure 10 shows.

6 Conclusions

We presented a new kinetic method for controlling and manipulating small unmanned aerial vehicles and micro air vehicles that allows hands-free control. In a user study, overall flying speed and landing accuracy were not as good as with the conventional hands-on method. On the other hand, the user study contained only five trials; because of the significant learning effect, a longer-lasting user study should be arranged.

Figure 8. Learning effects

Figure 9. Mean rate of landing accuracy by control mode

Figure 10. Landing accuracy by control mode and trial

References

[1] Alapetite, A., Hansen, J. P., and MacKenzie, I. S. (2012). Demo of gaze controlled flying. Proceedings of the Eighth Nordic Conference on Human-Computer Interaction, NordiCHI 2012. New York: ACM, 773-774.
[2] Benson, T., Reynolds Number, Glenn Research Center, Cleveland, http://www.grc.nasa.gov/www/k-12/airplane/reynolds.html, retrieved 4 Jan 2009.
[3] AR Drone [online] http://www.ardrone.com
[4] Kinect for Windows [online] http://www.microsoft.com/en-us/kinectforwindows/discover/features.aspx
[5] Mueller, T. J. (2009). On the Birth of Micro Air Vehicles, International Journal of Micro Air Vehicles, Multi-Science Publishing Co Ltd, Essex, March (2009), 1-12.
[6] NI Mate [online] http://www.ni-mate.com
[7] Processing.org [online] http://www.processing.org/
[8] Ruoslahti, H., Guinness, R., Viitanen, J., and Knuuttila, J., Airborne Security Acquisition Using Micro Air Vehicles: Helping Public Safety Professionals Build Real-Time Situational Awareness, In Proc. Hawaii International Conference on System Sciences (2010).
[9] Webster, A., AR.Drone coming to Android, gets new multiplayer games, June 3 (2011), Ars Technica [online] http://arstechnica.com/gaming/2011/06/ardrone-coming-to-android-gets-new-multiplayer-games/