AVATAR: Autonomous Operations of Ground-Based Robots Performed from the ISS

Eric Martin, Régent L'Archevêque, Sébastien Gemme, Tony Pellerin, Jean-François Cusson, Erick Dupuis
Canadian Space Agency, Space Technologies
6767 route de l'aéroport, Saint-Hubert, Quebec, Canada J3Y 8Y9
E-mails: FirstName.LastName@space.gc.ca

Abstract

The international exploration initiatives currently being defined by most space agencies place a strong emphasis on robotic missions, either in support of human missions or as precursors to astronaut flights. One scenario being considered is to have a system on a planetary body, such as Mars or the Moon, operated by a human in a spacecraft orbiting around it. Such a robotic system should offer various levels of autonomy for better efficiency. In that context, the Canadian Space Agency (CSA) has initiated the Avatar project, a series of missions used to validate and increase the level of maturity of its autonomous robotic and software technologies. This paper presents the first planned mission, Avatar RESCUE, in which a robotic testbed located at the CSA headquarters will be operated from the International Space Station using a low-bandwidth amateur radio link.

1. Introduction

The international exploration initiatives currently being defined by most space agencies place a strong emphasis on robotic missions, either in support of human missions or as precursors to astronaut flights. One of the concepts being envisaged is to have astronauts orbiting around a planetary body (such as Mars or the Moon) and controlling landed assets on the surface. Another key element of all human exploration efforts is the necessity to perform robotic operations in space for assembling and maintaining spacecraft. For better efficiency, these robotic systems shall be operated with various levels of autonomy.
During the past few years, the Canadian Space Agency has been developing the Autonomous Robotics and Ground Operations (ARGO) software suite. ARGO provides a framework for the integration of the space operations process from planning to post-flight analysis. The objective of ARGO is to reduce operational costs and increase efficiency by providing operator aids and permitting the implementation of a level of autonomy appropriate to the application. The target applications of the ARGO framework cover the full spectrum of autonomy, from supervisory control such as might be expected for ISS robotics to more autonomous operations such as would be encountered in planetary exploration missions. It also covers the full range of space robotic applications, from orbital manipulators to planetary exploration rovers. One of the important features of the ARGO framework is that it does not provide a universal architecture for ground control and autonomy. Instead, ARGO provides a set of toolboxes that can be assembled in a variety of manners depending on the application and its requirements. To facilitate the re-use of software, the design is modular and portable to the maximum extent possible.

Within the ARGO framework, the Robotics Group of the Space Technologies division of the Canadian Space Agency (CSA) demonstrated many times in 2006/2007 the autonomous capture of a tumbling satellite using the CSA's Automation and Robotics Testbed (CART), commanded from remote locations over low-bandwidth internet links. As shown in Figure 1, CART is a dual-manipulator testbed where each arm has seven degrees of freedom (DOF). One arm emulates the free-flyer dynamics and the second is the chaser robot equipped with a SARAH hand. The Laser Camera System (LCS) from Neptec is used to guide the chaser robot. A hierarchical finite state machine engine based on Cortex, one of the components of ARGO, is used to coordinate the autonomous capture task and to provide feedback to the remote operator.
Copyright Canadian Space Agency 2008. All rights reserved.

The role of the operator is very simple: initiate the capture by submitting the high-level command, and monitor the chaser robot while it performs the task. The sequencing of events (approach, fly-together, and capture) is fully autonomous.

Figure 1: A dual-manipulator system that simulates the tracking and capture scenario; the manipulator on the left is equipped with a hand, and the manipulator on the right carries a mock-up satellite.

Avatar is a series of space missions used to validate and increase the level of maturity of the autonomous robotic and software technologies developed at the Canadian Space Agency within the ARGO framework. In the first planned mission, Avatar RESCUE, it is proposed to take the project described in the previous paragraph one step further by initiating the commands to operate the system from the ISS. An amateur radio already available on the Russian segment of the ISS will be used to communicate with a ground radio counterpart located at the CSA facility in St-Hubert. The concept of the Avatar RESCUE flight demonstration is summarized in Figure 2. The mission is planned for the spring of 2008. The various components of the Avatar RESCUE mission are described in detail in the following sections, and an overview of the second Avatar mission is presented in Section 7.

2. ISS Operator Station

The operator station on the International Space Station is composed of an IBM ThinkPad A31 laptop connected to a Kenwood TM-D700A amateur radio through an RS-232 link. The radio is connected to a dual-band antenna and can thus transmit in the VHF and UHF bands (144-146 MHz and 430-440 MHz). One band is used for uplink and the other for downlink. The built-in Terminal Node Controller (TNC) of the radio is used to transmit data using AX.25 frames. The details of the transmission protocol are presented in Section 3.
Figure 2: Concept of the flight demonstration.

The Operator Graphical User Interface (GUI) is presented in Figure 3. It follows the IDAGS (Integrated Display and Graphics Standard) standard, as do the various GUIs used on the ISS, and it is available in both English and Russian. It is composed of four sections: the Operator panel, the Viewer 3D, the Error Log panel and the Chat panel. Using the Operator panel, the astronaut can select among the three Operation Modes (described in Section 6), namely the Semi-Autonomous, Auto-Pilot and Autonomous modes. The operator can Unsafe and Safe the arm at the beginning and end of the mission. The middle part of the Operator panel is used to send the high-level commands Capture and Release; in both cases, the operator must press the Execute button to confirm the execution of the command. The Confirm button is used only in the Semi-Autonomous mode and allows the operator to move to the following phase of the capture sequence. The Abort button can be selected at any time in all operation modes. Finally, the bottom part of the Operator panel presents feedback to the operator about pre-defined failures that could occur during the experiment.

Since the bandwidth of the amateur radio link is too limited to transfer a video stream or images in real time, it was decided to render a 3D model of the chasing arm and the target satellite for the operator. In that case, only 13 numbers must be transferred: the 7 joint angles of the manipulator and the 6 numbers required to represent the pose of the satellite. The 3D model was calibrated with respect to the real testbed.
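A 13-number state vector is compact enough for such a low-rate link. As an illustration only (the actual Avatar frame layout and byte order are not specified in the paper), one such telemetry update could be packed into 52 bytes:

```python
import struct

# Hypothetical layout: 7 joint angles [rad], then satellite pose as
# x, y, z [m] and roll, pitch, yaw [rad]; big-endian 32-bit floats.
FRAME_FMT = ">13f"          # struct.calcsize(FRAME_FMT) == 52 bytes

def pack_state(joints, pose):
    """Serialize one display update for the 3D viewer."""
    assert len(joints) == 7 and len(pose) == 6
    return struct.pack(FRAME_FMT, *joints, *pose)

def unpack_state(frame):
    """Recover (joints, pose) on the receiving side."""
    values = struct.unpack(FRAME_FMT, frame)
    return list(values[:7]), list(values[7:])
```

At roughly 52 bytes per update plus link overhead, even a slow packet-radio channel can refresh the rendered model a few times per second.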

The Error Log panel displays to the operator all phases of the autonomous scenario; it can be used to understand why a specific decision has been taken by the autonomy engine. The Chat panel can be used to exchange information between the operator on the ISS and the controllers on the ground. Note that the Controller GUI on the ground also has the panels presented to the operator, plus an additional Controller panel; the controller can also send commands to the system if needed.

Figure 3: GUI of the Avatar Operator Station

3. Data Communication

This section documents the data handling between the ISS and the CSA ground segment through the amateur radio link, and the communication infrastructure between the various required computers at CSA. This communication infrastructure is summarized in Figure 4. All the communication data handling is based on Remote, a subsystem of the ARGO framework. Remote is first described below, followed by the specifics of the amateur radio link.

3.1. Remote Description

The Remote (Remote Execution and Monitoring of Objects for Teleoperation Environments) Toolbox is the glue used to integrate the toolboxes available in the ARGO framework. It provides a set of standard Application Programming Interfaces (APIs) unifying the methods used to send commands and receive telemetry between sub-systems. The Remote Toolbox can be used to connect a ground station to a remote robot, to integrate interacting sub-systems on the remote robot, or even to link ground stations together. It provides generic, reliable and manageable tools for remote-system control. Any system that can be locally controlled through a Java class layer can be remotely controlled through the Remote API. Its modular construction allows easy interfacing with any communication protocol or medium, and it lets the software developer abstract away the fact that the software is distributed over several processes and platforms.
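Remote itself is a Java toolbox; purely as an illustration of the pattern it implements (all names below are hypothetical, not the Remote API), a command proxy with a pluggable transport might be sketched as:

```python
from abc import ABC, abstractmethod

class Transport(ABC):
    """Pluggable media layer: TCP/IP, serial/AX.25, in-process, ..."""
    @abstractmethod
    def send(self, data: bytes) -> None: ...
    @abstractmethod
    def receive(self) -> bytes: ...

class LoopbackTransport(Transport):
    """In-process stand-in for a real link, handy for testing."""
    def __init__(self, handler):
        self._handler = handler      # callable emulating the remote side
        self._inbox = []
    def send(self, data):
        self._inbox.append(self._handler(data))
    def receive(self):
        return self._inbox.pop(0)

class RemoteProxy:
    """Client-side stub: callers submit commands as if the system were
    local; the transport hides the fact that it is distributed."""
    def __init__(self, transport: Transport):
        self._transport = transport
    def submit(self, command: str) -> str:
        self._transport.send(command.encode())
        return self._transport.receive().decode()

def robot(data: bytes) -> bytes:
    """Trivial stand-in remote system: acknowledge every command."""
    return b"ACK " + data

proxy = RemoteProxy(LoopbackTransport(robot))
```

Swapping `LoopbackTransport` for a TCP or AX.25 implementation would leave the calling code unchanged, which is the point of the abstraction.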
Each interface is coded with an abstraction layer that is handled by the Remote API through a Java layer. The Remote API has been developed as a multi-user, multi-target system and provides mechanisms for conflict resolution among users. It offers monitoring and administration tools, which allow some users to observe, and if necessary interfere with, other users' actions without compromising system stability. Extensive error-tracking and execution-feedback features are also part of this toolbox, as well as abort mechanisms.

The Remote toolbox provides a set of meta-commands to manage and monitor the submission/execution queue of commands to a remote device. In particular, it allows the user to cancel previously submitted commands that are queued for execution, to monitor the progress of commands from acknowledgement to completion, and to be alerted to any error or abnormal situation that occurs during the execution process. It allows safe multi-user operation, together with priority mechanisms that prevent conflicts. It also provides both blocking and non-blocking operation, as well as batch-mode command submission. Finally, Remote also provides advanced scheduling and prioritizing mechanisms. The toolbox provides default TCP/IP and local communication protocols, and also offers the developer the option of providing his own protocol (e.g. UDP, AX.25).

Figure 4: CART Communication Architecture

3.2. Amateur Radio Link Through AX.25

As mentioned above, an amateur radio link is used to transfer the data required to remotely operate CART wirelessly. The communication link uses the AX.25 protocol to transfer data. AX.25 is a layer 2+3 protocol in the OSI networking model; it is the equivalent of the IP protocol in the TCP/IP paradigm. This protocol does not perform any check to make sure packets are received on the other side: a packet is simply sent and no acknowledgment is expected. Therefore, an extra layer, the Latency Tolerant Protocol (LTP), was developed over the AX.25 protocol to obtain packet receive acknowledgment, as in the TCP/IP protocol. In order to communicate, a TNC is connected to the computer using a serial link. The protocol between the computer and the TNC is KISS, which is very similar to the HDLC protocol. Once the TNC receives the frame, it converts it into AX.25 before sending it to the receiving TNC. The receiving TNC then converts it into KISS and sends it to the receiving host. This setup is presented in Figure 5.

Figure 5: Setup of the Remote ISS-CART Network

4. CSA Amateur Radio Ground Station

The Canadian Space Agency owns and operates an amateur radio station at its St-Hubert headquarters. The radio used for the Avatar project is a Yaesu FT-847 connected to a Spirit-2 TNC, which is connected to a computer through a serial port. The radio is also directly connected to the same computer so that it can be commanded during passes to adjust its frequency and compensate in real time for the Doppler shift. The antenna system is composed of a tri-band omnidirectional monopole and two high-gain circularly polarized Yagi antennae. A two-axis (EL/AZ) motor controller is installed with a separate module to interface with the computer via a serial port; the motor controller includes switches so that the antenna can also be rotated manually. The omnidirectional antenna can accommodate UHF, VHF and HF communication, while one Yagi is for UHF and the other for VHF.

On this computer, software developed by CSA personnel is installed to take control of the radio and the antenna rotator. The software has the following capabilities:

- At the press of a button, it automatically connects to a specialized site on the Internet to download the latest Keplerian elements calculated by NORAD. The Keplerian elements are the set of parameters required to predict the orbit of the space station (or any orbiting satellite). Since the space station is in a rather low orbit and is affected by atmospheric residues, it is impossible to predict its orbit for

more than a couple of days in advance. It is therefore extremely important to update the Keplerian elements approximately every week.

- At the press of a button, it automatically connects to the National Research Council Canada, Institute for National Measurement Standards, via the Internet and synchronizes the software clock with the atomic clock in Ottawa. The precision of this synchronization is no worse than one second for now. Clock precision is extremely important, since orbit prediction depends on the Keplerian elements and on knowledge of the exact current time: an error of more than 15 seconds can seriously affect the communication link during a pass, and in most instances much better precision than that is wanted.

- It allows the user to select the space station as the satellite to track, and provides the exact countdown to the next acquisition of signal (AOS). It also provides a map of the world with the current position of the space station and the antenna footprint, updated in real time.

- When the space station is in range, it automatically steers the antenna to point in its direction, to get the best possible signal.

- When the space station is in range, it automatically changes the receiving frequency of the radio to compensate in real time for the Doppler shift.

- When the signal is acquired, it displays a countdown to loss of signal (LOS).

Finally, as shown in Figure 4, this computer is connected to the CART Gateway through an AX.25/TCP bridge.

5. CART Testbed

The CART testbed is composed of four main hardware components, namely the Target Satellite held by one robotic arm, the Chaser Manipulator, the SARAH hand attached to this manipulator, and the vision system composed of a laser sensor and its pose-determination software. A short description of each element is presented below.

5.1. Target Satellite

Many satellites use momentum wheels to stabilize and to control their attitude.
When the attitude-control system fails, the angular momentum stored in the wheels is transferred, through friction, to the satellite body over time. This momentum transfer causes the satellite to tumble. At the same time, the satellite is under the action of small external, non-conservative moments, which make the tumbling motion occur mostly about its major principal axis [1], i.e., the axis about which the satellite's moment of inertia is maximum. Therefore, to mimic the tumbling motion of a satellite, we assume that the satellite is initially rotating about an axis very close to its major principal axis. Then, to create the trajectory, we solve the Euler equations assuming no external moments. It should be noted that, even though there are dissipative moments acting on the satellite, they would not have any significant effect over the short period of capture, so the angular momentum of the satellite can be taken as conserved. The resulting motion of the target satellite can be characterized as a major rotation about a spin axis with small precession and nutation [2]. This is similar to the motion described in [3]. The satellite trajectory is generated offline and then provided as an input to the robotic arm holding the satellite. In order to avoid joint limits, the force sensor at the wrist of the robot was disconnected, together with the torque sensor at the last joint. In this configuration, the last joint of the robot can be rotated continuously without reaching any joint limit. By making sure the main rotation axis of the satellite is aligned with this joint, the robot can track the satellite trajectory for several minutes without reaching any joint limit.

5.2. Vision System

The Laser Camera System (LCS), shown in Figure 6(a), and the software CAPE (Collision Avoidance and Pose Estimation), both from Neptec [4], are used to generate the pose (position and orientation) of the target satellite with respect to the LCS frame.
The LCS sensor is particularly suited for space applications as it is immune to harsh and/or changing lighting conditions [4]. The LCS is capable of handling solar interference; it was successfully flown on the Space Shuttle Discovery (STS-105) in 2001 and has been part of all Space Shuttle missions since Return to Flight, where it is used to inspect the tiles on the Shuttle's underside. The range data from the LCS sensor are processed using Neptec's proprietary software CAPE to obtain the pose of the satellite. The pose-estimation method is model-based, using a CAD model of the target satellite and a modified version of the Iterative Closest Point (ICP) algorithm. For more information, see [4].
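CAPE itself is proprietary, but the core step of any ICP-style model-based pose estimator, finding the rigid transform that best aligns measured range points with corresponding CAD-model points, can be sketched with the standard SVD-based (Kabsch) solution. This is a generic illustration, not Neptec's algorithm:

```python
import numpy as np

def best_fit_transform(P, Q):
    """Rigid (R, t) minimizing sum ||R @ p_i + t - q_i||^2 for
    corresponding point sets P and Q, each of shape (N, 3)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cq - R @ cp
    return R, t
```

A full ICP loop alternates this solve with re-matching each scan point to its closest model point until the pose converges.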

The pose is calculated at about 2 Hz, with a delay of 0.5 s; i.e., each computed pose describes the object as it was 0.5 s earlier. The location of the LCS sensor with respect to the manipulator inertial frame was calculated using the kinematic model of the robot arm holding the target satellite. Consequently, the actual pose of the target satellite in the inertial frame can readily be calculated.

5.3. Chaser Manipulator

This section describes how the trajectory of the chaser manipulator is generated. An extended Kalman filter is used to filter the raw LCS measurements and to provide a smoothed pose of the target satellite every millisecond. The Kalman filter is fully adaptive and does not need any a priori knowledge of the inertia properties of the satellite or the noise properties of the vision sensor [5]. After a short observation period, the filter converges to the actual motion of the satellite and can be used to predict the pose of the satellite when the vision system becomes occluded. It is even possible to capture the satellite when the view of the vision system is fully occluded, based only on the prediction of the Kalman filter. More information is available in [6].

Based on the filtered poses of the capture frame attached to the capture handle and the tool frame attached to the grasping device, a Cartesian velocity command is generated in the tool frame of the manipulator to progressively reduce the distance between them. Generating the command in the tool frame provides the opportunity to independently activate and assign each degree of freedom to a different task. In the experiments, the activation of the tasks is based on the distance between the grasping device and the target satellite. A detailed description of the algorithms used is provided in [7].

5.4. SARAH Hand

SARAH is an underactuated dexterous robotic hand developed by Université Laval [8]. It has three reconfigurable fingers mounted on a common structure.
The fingers can envelop various shapes, including cylindrical and spherical geometries. SARAH has 10 degrees of freedom but is actuated with only two drive systems: one drive controls the opening and closing of the fingers, while the other controls the orientation of the fingers to reconfigure the grasp. Each finger of SARAH has three phalanges. The self-adaptability of the hand is obtained through underactuation. Note that, although the hand passively adapts to any geometrical shape, it is not back-drivable and therefore provides a firm grip. The hand is shown in Figure 6(b).

Figure 6: (a) Laser Camera System (LCS); (b) underactuated SARAH hand

6. Autonomy

This section presents the three autonomy scenarios considered for the Avatar RESCUE mission. Since the autonomy is based on the Cortex software developed at CSA, this software is first briefly described in Section 6.1. The overall scenarios are presented in Section 6.2.

6.1. Cortex Description

Cortex provides a set of tools to implement on-board autonomy software based on the concept of hierarchical finite state machines. Cortex allows an operator to graphically generate the behaviors to be implemented on the remote system. It automatically generates the code to be uploaded, and it can be used to debug and monitor the execution of the autonomy software on-line and off-line. Cortex was developed in light of the fact that the development of such behavior sets rapidly becomes labor-intensive, even for relatively simple systems, when using low-level programming languages, thus making reusability very difficult if not impossible. Cortex is based on the Finite State Machine (FSM) formalism, which provides a higher-level way of creating, modifying, debugging and monitoring such reactive autonomy engines. Some advantages of this representation are its intuitiveness and the ease with which it can be graphically constructed and monitored by human operators. The concept of hierarchical FSMs allows a high-level FSM to invoke a lower-level FSM. This provides the capability to implement hierarchical task decomposition from a high-level task into a sequence of lower-level tasks. If the FSMs are implemented in a modular fashion, the concept of libraries can be implemented, allowing the operator to re-use FSMs from one application to another.
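Cortex generates such machines graphically; the following toy sketch (state and event names invented for illustration, not taken from Cortex) shows the hierarchical idea of a high-level machine delegating one of its states to a lower-level machine:

```python
class StateMachine:
    """Minimal event-driven FSM: a transition table maps
    (current state, event) pairs to the next state."""
    def __init__(self, name, transitions, initial):
        self.name = name
        self.transitions = transitions
        self.state = initial
        self.trace = []                  # for off-line monitoring/debugging
    def fire(self, event):
        nxt = self.transitions.get((self.state, event))
        if nxt is None:
            raise ValueError(f"{self.name}: no transition for "
                             f"{event!r} in state {self.state!r}")
        self.trace.append((self.state, event, nxt))
        self.state = nxt

# Low-level machine sequencing one phase of the capture
approach = StateMachine(
    "approach",
    {("idle", "start"): "tracking",
     ("tracking", "aligned"): "done",
     ("tracking", "vision_lost"): "safing"},
    "idle",
)

# High-level machine whose "approach" state delegates to the nested FSM
mission = StateMachine(
    "mission",
    {("safe", "capture"): "approach",
     ("approach", "approach_done"): "grasp",
     ("grasp", "grasped"): "safe"},
    "safe",
)

mission.fire("capture")          # high-level command from the operator
approach.fire("start")           # work delegated to the nested machine
approach.fire("aligned")
mission.fire("approach_done")    # nested machine reports completion
```

Keeping the lower-level machines self-contained is what makes them reusable as a library from one application to another, as the paragraph above describes.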

6.2. Autonomy Scenarios

As mentioned in Section 2, there are three Operation Modes that can be selected in the Operator panel. The three modes were implemented in order to test scenarios with different levels of autonomy.

The Autonomous mode is, as one can guess, a fully autonomous mode. The autonomy engine, based on Cortex, runs on the ground. Once a high-level command such as Capture is sent by the operator and received on the ground, it is decomposed into lower-level commands that can be interpreted by the chaser manipulator. The sequencing of commands is determined by the autonomy engine, which will take the necessary actions to capture the satellite or abort the mission in case of malfunction. The only intervention possible by the operator is an Abort command to abort the mission. In the extreme case where the communication link is broken after the command is received on the ground, the operation will continue; once communication is reacquired, the operator will get feedback on the state of the system and see whether the operation was successful.

The Semi-Autonomous mode requires more intervention from the operator. After sending the Capture command, the operator must press the Confirm button at a few critical phases of the operation. In this case, if the communication link is broken, the mission will abort and the chaser manipulator will return to a safe state.

The Auto-Pilot mode is also a fully autonomous mode. However, in this case, an autonomy engine is used to replace the operator on board the ISS. Basically, on the ground the system is in a state equivalent to the Semi-Autonomous mode; the Confirm commands are sent by this autonomy engine replacing the operator, based on the available information. Therefore, as for the Semi-Autonomous mode, the mission will be aborted if the communication link is broken. The objective of this mode is to demonstrate that an autonomy engine could be used to automate part of an existing mission without having to completely change the software and the information exchanged through the communication link: high-level commands could be decomposed locally into the regular commands usually sent through the communication link.

In all three operation modes, five different malfunctions have been considered, and the autonomy scenario has been coded accordingly. These malfunctions can be generated artificially by the controller of the mission on the ground to check the correctness of the autonomy scenario. They can also be detected automatically during the mission if an anomaly actually occurs. In both cases, the feedback to the operator is the same and is presented in the Operator panel. As one can observe in Figure 3, the possible malfunctions are:

- Loss of communication link
- Hardware problem
- Loss of vision system
- Target unreachable
- Risk of collision

Some anomalies may have a different effect depending on the mission stage. For example, if the vision system fails during the initial approach, where the chaser manipulator is a few meters away from the satellite, the chaser arm continues to approach the satellite for X seconds while the autonomy engine tries to restart the vision system; if the restart commands are not successful, the arm returns to a safe state. If the failure occurs during the final approach, the approach can still continue, but only for a shorter period of Y seconds, while trying to restart the vision system; if the vision signal is not re-acquired, the autonomy engine brings the chaser arm back to the end of the initial approach phase for a period of X-Y seconds, again trying to solve the problem with the vision system. Finally, if the failure occurs during the capture phase, the mission is aborted and the arm returns to the safe position.
Another possibility would be to perform the capture of the satellite using the prediction of the satellite pose generated by the extended Kalman filter. However, it is not clear at this point whether such a risk can be taken; additional tests are required to assess the robustness of this method.

7. Next Avatar Mission

As mentioned in Section 1, one of the concepts envisaged for space operations is to have astronauts orbiting a planetary body while controlling landed assets on its surface. With the Avatar RESCUE mission described in this paper, CSA will develop and test some of the critical elements required to perform such a mission, using an existing testbed and demonstration scenario. The new critical element in Avatar RESCUE is the transmission of data at low bandwidth over an amateur radio link. For the second mission, with that element already tested, a more complex scenario is envisaged: a mobile robotic testbed operating in the Mars emulation terrain located at the CSA headquarters in St-Hubert will be autonomously operated from the ISS. This Mars terrain, shown in Figure 7, simulates the topography found in typical Mars landscapes.

Figure 7: A view of the Mars terrain located at CSA Headquarters in St-Hubert, Quebec, Canada

In this scenario, the rover will first take a 3D scan of its environment and transmit it to the operator on the ISS. The operator will then select an area of interest for science investigation and send a high-level command directing the rover to move to that location. After localizing itself, the rover will autonomously plan its trajectory to the desired location and navigate towards the destination while avoiding obstacles. Since communication with the ISS is only possible during direct line of sight, which lasts at most 10 minutes, this operation will be conducted fully autonomously, without any communication link with the operator. Once communication is re-acquired during a subsequent orbital pass, a new scan of the environment will be sent to the operator together with the appropriate feedback, so that the operator can assess the state of the system and send further instructions to the rover. This second Avatar mission is planned for 2009.

8. Conclusion

In this paper, an overview of the first Avatar mission, Avatar RESCUE, was presented. In this mission, a robotic testbed located at the CSA headquarters will be operated from the ISS over a low-bandwidth amateur radio link in a satellite-servicing scenario. The various components of this mission, planned for the spring of 2008, were discussed. Finally, an overview of the second Avatar mission, in which a rover operating in the CSA Mars terrain will also be operated from the ISS, was presented. This second Avatar mission is planned for 2009.
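The orbital-pass operation cycle described above for the rover scenario can be sketched as a single exchange-then-drive loop. This is an illustration only: the `rover` and `operator` interfaces (`scan`, `pick_goal`, `goto`, `status`) are assumptions, not part of the mission software described in the paper.

```python
def orbital_pass_cycle(rover, operator):
    """One communication cycle between the ISS operator and the rover.

    Assumed interfaces: rover.scan() returns a 3D scan of the terrain,
    rover.goto(goal) runs localization, path planning and obstacle
    avoidance fully autonomously, rover.status() summarizes the state,
    and operator.pick_goal(scan) returns a science target chosen on
    the ISS. The <= 10-minute line-of-sight window only covers the
    data exchange; the drive itself happens with no link at all.
    """
    # During the pass: exchange data while line of sight lasts.
    scan = rover.scan()
    goal = operator.pick_goal(scan)   # high-level command only

    # Between passes: the rover is on its own, no operator in the loop.
    rover.goto(goal)                  # localize, plan, avoid obstacles

    # Next pass: feedback so the operator can assess the system state.
    return rover.status()
```

The design mirrors the chaser-arm scenario: only high-level commands and summarized feedback cross the low-bandwidth link, while all time-critical decisions are taken locally.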