CONTACT SENSING APPROACH IN HUMANOID ROBOT NAVIGATION

Journal Advanced Manufacturing Technology

Hanafiah, Y. 1, Ohka, M. 2, Yamano, M. 3, and Nasu, Y. 4
1, 2 Graduate School of Information Science, Nagoya University, Furo-cho, Chikusa-ku, Nagoya 464-8601, Japan
3, 4 Faculty of Engineering, Yamagata University, Jonan 4-3-16, Yonezawa, Yamagata 992-8510, Japan
1 hanafiah@nuem.nagoya-u.ac.jp

ABSTRACT

This paper presents the development of a basic contact interaction-based navigation system for a biped humanoid robot. We used the 21-dof humanoid robot Bonten-Maru II in our study and experiments. The robot's arms are equipped with force sensors to detect physical contact with objects. We propose a motion algorithm consisting of searching, self-localization, correction of locomotion direction, obstacle avoidance, and object manipulation tasks. In our contact interaction method, the humanoid robot grasps an object's surface to determine its self-localization, and then autonomously corrects its locomotion direction and avoids obstacles. For object manipulation, a newly developed optical three-axis tactile sensor is mounted on a robotic finger for evaluation purposes. We conducted experiments in which Bonten-Maru II performed self-localization and obstacle avoidance tasks, and the robotic finger mounted with the tactile sensor system performed object manipulation. The experimental results show good robot performance in recognizing objects and avoiding collisions during navigation tasks, and good performance in recognizing and manipulating objects in object manipulation tasks.

Keywords: Humanoid robot, contact interaction-based navigation, self-localization, obstacle avoidance, object manipulation, optical three-axis tactile sensor.
1.0 INTRODUCTION

Research on humanoid robots in areas related to human-robot interaction has been increasing rapidly, especially for applications in human living and working environments (Salter, T., Dautenhahn, K., and de Boekhorst, R., 2006; Nehmzow, U., and Walker, K., 2005). Humanoid robots are the type of robot practically suitable for coexisting with

humans in built-for-human environments because of their anthropomorphism, human-friendly design, and locomotion ability (Hirai, K., Hirose, M., Haikawa, Y., and Takenaka, T., 1998; Vukobratovic, M., Borovac, B., and Babkovic, K., 2005). They are created to imitate some of the physical and mental tasks that humans perform daily. The goal is that one day humanoid robots will be able to understand human intelligence and to reason and act like humans. If they can do so, they could eventually coexist and work alongside humans and could act as proxies for humans in dangerous or dirty work that humans would avoid given the choice, hence providing humans with more safety, freedom, and time. Obviously, environments shared with humanoid robots are normally designed for humans. Hence, robots must incorporate a reliable navigation strategy to effectively recognize the environment in which they operate and avoid collisions. The working coexistence of humans and humanoid robots sharing common workspaces will impose on the robots, with their mechanical-control structure, at least two classes of tasks: motion in a specific environment with obstacles, and manipulation of various objects from the human environment (Vukobratovic, M., Borovac, B., and Babkovic, K., 2005). As far as this working coexistence is concerned, a suitable navigation system combining design, sensing elements, planning, and control in a single integrated system is necessary so that robots can further adapt to environments previously dedicated only to humans. In this research, we develop a contact-based navigation system for a biped humanoid robot. Our goal is to develop an autonomous navigation system for humanoid robots and to improve current visual-based navigation.
Fundamentally, autonomous navigation in walking robots requires that three main tasks be solved: self-localization, obstacle avoidance, and object manipulation (Clerentin, A., Delahoche, L., Brassart, E., and Drocourt, C., 2005). In this research, we develop a basic contact interaction-based navigation system for a humanoid robot. The robot's arms touch and grasp an object's surface to acquire the object's position and orientation so that the robot can perform self-localization, and then continuously generate suitable trajectories to correct its locomotion direction, avoid obstacles, and conduct object manipulation when it reaches the target working area. Six-axis force sensors were attached to both robotic arms as end-effectors for force control so that the humanoid robot can recognize its surroundings. For object manipulation, we use a newly developed optical three-axis tactile sensor mounted on a robotic finger, which is currently under evaluation. We focus on contact interaction because this method is suitable for improving current visual-based navigation, and practically suitable for humanoid robots to accurately structure and recognize their surrounding conditions (Lim, M., Oh, S., Son, J., You, B., and Kim, K., 2000). In this research, we used a research prototype humanoid robot called Bonten-Maru II, shown in Figure 1, as our experimental platform. Furthermore, we are currently developing an optical three-axis tactile sensor with a robotic finger system for object handling tasks. This tactile sensor is based on an optical waveguide

transduction method and is capable of acquiring normal and shearing forces. In the future, this system will be mounted on the humanoid robot's arms to replace the six-axis force sensors for navigation and object manipulation tasks.

Figure 1 Humanoid robot Bonten-Maru II and its configuration of dofs.

2.0 MOTIVATION OF CONTACT INTERACTION IN ROBOT NAVIGATION

Application of humanoid robots in the same workspace as humans inevitably results in contact interaction. Some studies in robotics have proposed methods of interaction with environments using non-contact sensing, such as cameras, ultrasonic wave sensors, vision image processing, etc. (Cheng, G., Nagakubo, A., and Kuniyoshi, Y., 2001; Ogata, T., Matsuyama, Y., Komiya, T., Ida, M., Noda, K., and Sugano, S., 2000). However, despite the rapid growth in visual sensor technology and image processing technology, identification accuracy problems caused by the approximate data obtained by visual sensors, and interference from environmental factors such as darkness, smoke, and dust, tend to reduce robot performance in real environments.

Figure 2 Prototype humanoid robot arm with fingers and the finger mounted with the optical three-axis tactile sensor.

Meanwhile, some work has also reported the use of arm-equipped mobile robots and artificial humanoid robot arms to analyze object surfaces by grasping and to obtain information for performing certain locomotion (Lim, M., Oh, S., Son, J., You, B., and Kim, K., 2000; Kanda, T., Ishiguro, H., Ono, Y., Imai, M., and Nakatsu, R., 2002). Overall, very little work has been reported on the application of contact interaction in humanoid robot navigation (Konno, A., 1999). Indeed, contact interaction offers better options for robots to accurately recognize and structure their environment (Coelho, J., Piater, J., and Grupen, R., 2005; Kim, J., Park, J., Hwang, Y.K., and Lee, M., 2004), making it easier and safer to perform tasks and improving the efficiency of operation in real environments. We believe that contact interaction is a relevant topic in the research and development of humanoid robot navigation. In fact, contact interaction is a fundamental feature of any physical manipulation system and underlies the philosophy of establishing working coexistence between humans and robots. Most research on the navigation of walking robots is related to perception-guided navigation, dealing particularly with visual-based navigation. In the current research, we propose a basic contact interaction-based navigation system for a humanoid robot, capable of performing self-localization and obstacle avoidance, to support visual-based navigation. This system is based on contact interaction, with the aim of improving visual-based navigation so that humanoid robots can operate effectively in real environments. In order to make the humanoid robot recognize its surroundings, six-axis force sensors were attached to both robotic arms as end-effectors for force control. In this report, we present self-localization tasks and obstacle avoidance tasks using the contact interaction approach. We present experimental results of the proposed navigation system using the 21-dof humanoid robot Bonten-Maru II.
Meanwhile, for object handling tasks, we developed a robotic finger system mounted with an optical three-axis tactile sensor capable of acquiring normal and shearing forces, with the aim of installing it on the humanoid robot's arm. Although current robot hands are equipped with six-axis force sensors to detect contact force, they do not use tactile sensors capable of detecting an object's hardness or softness, nor can they recognize the shape that they grip. For a robot hand to grip an object without damaging the object, or without damaging the sensor itself, it is important to employ sensors that can adjust the gripping power. The tactile sensor developed in this research is capable of acquiring normal and shearing forces and is therefore suitable for object manipulation. In this report, we explain the tactile sensor system's structure and principle. Since the tactile sensor system is still under evaluation, we conducted the object manipulation experiments using a 3-dof robotic finger mounted on a prototype humanoid robot arm, as shown in Figure 2.

3.0 THE ROBOTS

In this research, we used the 1.25-m-tall, 32.5-kg research prototype humanoid robot Bonten-Maru II (see Figure 1). Bonten-Maru II was designed to mimic human characteristics as closely as possible, especially in its basic physical structure, through the design and configuration of its joints and links. The robot has a total of 21 dofs: six for each leg, three for each arm, one for the waist, and two for the head. The high number of dofs and a configuration of joints that closely resembles that of humans give Bonten-Maru II the possibility of realizing complex trajectories and attaining human-like motion. Each joint is driven by a DC servomotor with a rotary encoder and a harmonic drive-reduction system, and is controlled by a PC running the Linux OS. The motor drivers, PC, and power supply are placed outside the robot. Bonten-Maru II is equipped with a force sensor in each arm. As for the legs, there are four pressure sensors under each foot: two under the toe area and two under the heel. These provide a good indication of whether both legs are in contact with the ground. The robot's control system consists of three modules: a motion instructor, a robot controller, and shared memory. The robot system can operate in simulation mode and in real-time mode. To evaluate performance and to ensure the safety of the humanoid robot and the tactile sensor itself, we conducted evaluation experiments using a newly developed prototype multi-fingered 3-dof humanoid robot arm. The arm was designed to replicate Bonten-Maru II's arm; however, its control system has been refined to comply with the finger and tactile sensor system. Figure 2 shows the humanoid robot arm with fingers and the finger mounted with the optical three-axis tactile sensor.
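The double-support check from the four pressure sensors under each foot (two toe, two heel) can be sketched as below. The 0.5 N contact threshold and the sensor-reading API are illustrative assumptions; the paper specifies only the sensor layout.

```python
CONTACT_THRESHOLD_N = 0.5  # assumed minimum force indicating ground contact

def foot_in_contact(toe_front, toe_rear, heel_front, heel_rear):
    """Return True if the foot is judged to be on the ground.

    A foot is considered planted when both the toe area and the heel
    area register force above the threshold.
    """
    toe_ok = max(toe_front, toe_rear) > CONTACT_THRESHOLD_N
    heel_ok = max(heel_front, heel_rear) > CONTACT_THRESHOLD_N
    return toe_ok and heel_ok

def both_legs_grounded(left_readings, right_readings):
    """Check the double-support condition from two 4-tuples of readings."""
    return foot_in_contact(*left_readings) and foot_in_contact(*right_readings)
```

Reading all four cells per foot, rather than a single switch, lets the controller distinguish toe-only or heel-only contact during a step.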
4.0 NAVIGATION SYSTEM ALGORITHM

In the contact interaction-based navigation system, to perform self-localization and obstacle avoidance tasks, we created an algorithm in the humanoid robot's control system that controls the motions of the robot's arms and legs based on information obtained from the grasping process. The algorithm comprises formulations to generate a trajectory for each robotic joint (Hanafiah, Y., Yamano, M., Nasu, Y., and Ohka, M., 2005). The formulations involve solutions to the forward and inverse kinematics problems, interpolation of the manipulator's end-effector, and force-position control. Figure 3 shows a flowchart of the contact interaction-based navigation algorithm. The algorithm consists of four important processes: searching and detecting, touching and grasping the object surface, correction of locomotion direction, and obstacle avoidance. In the future, object handling tasks will be incorporated into this algorithm. The proposed algorithm is applied within the humanoid robot control system. Figure 4 displays the control system structure, which consists of two main

processes to control the humanoid robot's motion: the robot controller and the motion instructor. Shared memory is used to connect the two processes, sending and receiving commands.

Figure 3 Contact interaction-based navigation algorithm combining searching, self-localization, correction, and obstacle avoidance tasks.

Figure 4 Control system structure of humanoid robot Bonten-Maru II.
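The flow in Figure 3 can be sketched as a simple state machine. The state names and the boolean sensing inputs are assumptions; the paper specifies the stages (search, grasp, correct, check/avoid obstacle) but not this interface.

```python
def navigate_step(state, front_contact, side_contact, obstacle):
    """Advance one step of the contact-based navigation flow.

    Returns the next state given boolean sensing results:
    front_contact / side_contact report whether the searching arm
    touched an object; obstacle reports the obstacle check.
    """
    if state == "SEARCH_FRONT":
        return "GRASP_FRONT" if front_contact else "SEARCH_SIDE"
    if state == "SEARCH_SIDE":
        return "GRASP_SIDE" if side_contact else "WALK_SIDESTEP"
    if state in ("GRASP_FRONT", "GRASP_SIDE"):
        return "CORRECT_POSITION"      # self-localization done; correct pose
    if state == "CORRECT_POSITION":
        return "CHECK_OBSTACLE"        # check correction area before turning
    if state == "CHECK_OBSTACLE":
        return "AVOID_OBSTACLE" if obstacle else "WALK_FORWARD"
    if state == "AVOID_OBSTACLE":
        return "WALK_SIDESTEP"         # side-step until obstacle clears
    return state                       # terminal walking states
```

Driving this function in a loop with the robot's contact readings reproduces the branch structure of the flowchart.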

Figure 5 Motion planning in the contact interaction-based humanoid robot navigation system.

5.0 MOTION PLANNING IN HUMANOID ROBOT NAVIGATION

Figure 5 shows the motion planning of the proposed navigation system. Referring to this figure, the humanoid robot performs self-localization by grasping a wall surface, and then responds by correcting its orientation and locomotion direction. During the correction process, however, the presence of obstacles in the correction area creates the possibility of collisions. Therefore, the humanoid robot detects the presence of an obstacle in the correction area and performs obstacle avoidance. When the robot reaches the target working area, it will be able to manipulate objects. For instance, during operation in a dark area, the robot's arm can search for a switch on the wall and turn the lights on. This operation requires a finger system with tactile sensors, which is currently under development. In this research, the searching and detecting tasks are classified into two situations: grasping a front object and grasping a side object. These two tasks differ in the formulation of force and position control, and in the humanoid robot's correction trajectory after the grasping process.

6.0 FORMULATION OF FORCE AND POSITION CONTROL

During grasping of an object surface, the force and position of the end-effector are controlled based on calculations applying, as parameter values, the maximum force F_max, the minimum force F_min, and the arm end-effector's shifted distance in one sampling time, l. The reference position, or targeted point, of the end-effector is defined as P_ref, while the current position is defined as P_cur. Based on the detected force F, which is controlled by the fixed parameter values F_max, F_min, and l, the end-effector's reference position P_ref is defined from the end-effector's current position P_cur in one sampling time.
The force-position control formulations for grasping front and side object surfaces are defined in

the following equations:

i. For grasping an object at the front position:

P_ref = P_cur - l·x,  when F > F_max    (1)
P_ref = P_cur - l·y,  when F_min < F < F_max    (2)
P_ref = P_cur - l·x, P_ref = P_cur - l·y,  when F < F_min    (3)

ii. For grasping an object at the right-side position:

P_ref = P_cur - l·x, P_ref = P_cur + l·y,  when F > F_max    (4)
P_ref = P_cur - l·x,  when F_min < F < F_max    (5)
P_ref = P_cur - l·y,  when F < F_min    (6)

From the above equations, computations to decide the end-effector positions were performed. Meanwhile, the joint angles for each joint were calculated using forward kinematics to define the coordinates of the end-effector. Repeating the above position and force control over the specified motion time results in a series of end-effector position data, as shown in Figure 6. Based on these data, the robot can recognize the object's orientation and its own position with respect to the object.

7.0 SELF-LOCALIZATION AND CORRECTION TASKS

The end-effector data obtained during the grasping process are processed with the least-squares method to yield a linear equation, as shown in (7). Here, the distance and grasping angle between the robot and the object, denoted L and ϕ, respectively, are defined by the following formulations. First, a straight line from the reference coordinate origin, perpendicular to (7), which describes the shortest distance from the robot to the object, is defined in (8). The intersection coordinates in the X-Y plane are given in (9). The grasping angle ϕ is the angle from the X-axis of the robot's reference coordinates to the perpendicular line of (8).

(7)
(8)
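The case split of equations (1)-(6) can be sketched as follows. The threshold values F_MAX, F_MIN and the shift distance l are the paper's parameters, but their numeric values here are illustrative assumptions; the signs follow the equations as printed.

```python
import numpy as np

F_MAX = 1.5      # N, assumed upper force threshold
F_MIN = 0.5      # N, assumed lower force threshold
L_SHIFT = 0.005  # m, assumed end-effector shift per sampling time

def update_reference(p_cur, force, side="front"):
    """Return the end-effector reference position P_ref from P_cur.

    Implements the case split of (1)-(6): the x shift moves the hand
    relative to the surface, the y shift slides it along the surface
    so that repeated updates trace the object contour.
    """
    x = np.array([L_SHIFT, 0.0, 0.0])
    y = np.array([0.0, L_SHIFT, 0.0])
    p_ref = np.asarray(p_cur, dtype=float).copy()
    if side == "front":
        if force > F_MAX:        # (1) excessive force
            p_ref -= x
        elif force > F_MIN:      # (2) force in range: slide along -y
            p_ref -= y
        else:                    # (3) contact lost: shift in both axes
            p_ref -= x + y
    else:  # right-side object
        if force > F_MAX:        # (4)
            p_ref += -x + y
        elif force > F_MIN:      # (5)
            p_ref -= x
        else:                    # (6)
            p_ref -= y
    return p_ref
```

Calling this once per sampling time, with the measured contact force, produces the sequence of reference positions whose recorded trace appears in Figure 6.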

Figure 6 End-effector position data obtained from the grasping process.

(9)

Here, the distance L and grasping angle ϕ are given in (10) and (11), respectively. In this research, correction of the robot's position and orientation refers to the values of L and ϕ.

(10)
(11)

When grasping a front object, correction of the robot's distance is performed simply by generating a trajectory for the legs to walk backwards, where the number of steps is defined in (12). Here, q is the step quantity, and L is the measured (shortest) distance from the intersection point of the arm's shoulder joints to the wall, obtained from the grasping result. Referring to Figure 7, L1 is the length from the shoulder joints to the elbow joint, Lt is the total length of the arm from the shoulder joints to the end-effector, and L3 is the step size of the robot's leg. The term Lm appearing in (12) is defined in (13).

(12)
(13)
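The self-localization step described above can be sketched as below: fit a line to the grasped contact points by least squares (as in (7)), then take the foot of the perpendicular from the robot's origin to obtain the shortest distance L and the grasping angle ϕ. The exact published forms of (7)-(11) are not reproduced in the text, so this follows the standard construction the prose describes.

```python
import numpy as np

def localize(points_xy):
    """Return (L, phi_deg) from an (N, 2) array of contact points.

    L is the shortest robot-to-wall distance; phi_deg is the angle,
    from the robot's X-axis, of the perpendicular to the fitted wall line.
    """
    pts = np.asarray(points_xy, dtype=float)
    a, b = np.polyfit(pts[:, 0], pts[:, 1], 1)  # fitted line y = a*x + b
    # Foot of the perpendicular from the origin onto y = a*x + b:
    x0 = -a * b / (a**2 + 1.0)
    y0 = b / (a**2 + 1.0)
    L = np.hypot(x0, y0)                  # shortest distance to the wall
    phi = np.degrees(np.arctan2(y0, x0))  # grasping angle from the X-axis
    return L, phi
```

The orientation correction then rotates the robot by 90° − ϕ so its heading becomes parallel to the fitted wall line.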

When grasping a right-side object, correction of the distance involves generating a trajectory for the legs to walk side-steps away from the wall. However, if the grasping angle ϕ satisfies 0 < ϕ < 45°, it is still possible for the robot to collide with the object. In this case, the robot walks one step backwards before proceeding to walk side-steps. If the grasping angle ϕ satisfies 45° < ϕ < 90°, the robot continues to correct its position by walking side-steps away from the object. Here, the side-step size S is defined in (14), where Lb is a parameter value representing a safety distance between the robot and the object during walking. From (14), boundary conditions are fixed as (15) and (16), where α and β are parameter values reflecting the side-step limits of the humanoid robot's legs: α is fixed at the minimum side-step size, while β is fixed at the maximum side-step size. Correction of the robot's orientation is performed by changing the robot's locomotion direction by 90° − ϕ, so that the robot's final orientation is parallel to the orientation of the wall surface.

Figure 7 Structural dimensions, searching and detecting area of the humanoid robot's arm in the contact interaction-based navigation system.

Figure 8 Humanoid robot orientation after grasping front and right-side objects: (a) grasping front wall; (b) grasping right-side wall.

(14)
(15)
(16)

Experiments were conducted to evaluate the performance of the self-localization tasks and the correction of the robot's position and orientation. Figures 8(a) and (b) show the geometrical analysis of the robot's position and orientation in the X-Y plane before and after correction of distance and angle when grasping the front wall and the right-side wall, respectively, based on the grasping results. Axes X-Y indicate the orientation before correction, while axes X'-Y' indicate the orientation after correction is finished. Figures 9 and 10 show photographs taken during the experiments of grasping the front and right-side walls. The experimental results show smooth and controlled motion of the humanoid robot in recognizing its orientation and correcting its position and orientation.

Figure 9 Sequential photographs of the grasping-front-wall experiment.

Figure 10 Sequential photographs of the grasping-right-side-wall experiment.

Figure 11 Sequential photographs of the obstacle avoidance experiment.
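The side-step correction described above can be sketched as follows. Since the exact form of (14) is not reproduced in the text, S = Lb − L is an assumption (step far enough to restore the safety margin Lb), and the α, β clamp stands in for the boundary conditions (15)-(16); all numeric values are illustrative.

```python
ALPHA = 0.05  # m, assumed minimum side-step size (alpha)
BETA = 0.20   # m, assumed maximum side-step size (beta)

def sidestep_plan(L, phi_deg, Lb=0.35):
    """Return (backstep_needed, S, turn_deg) for right-side correction.

    L       : measured shortest distance to the wall
    phi_deg : grasping angle from the X-axis
    Lb      : assumed safety distance during walking
    """
    backstep = 0.0 < phi_deg < 45.0    # collision risk: one step back first
    S = min(max(Lb - L, ALPHA), BETA)  # clamp side-step size per (15)-(16)
    turn = 90.0 - phi_deg              # rotate to become parallel to the wall
    return backstep, S, turn
```

For example, a grasp at L = 0.3 m and ϕ = 60° needs no back-step, a minimum-size side-step, and a 30° heading correction.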

8.0 OBSTACLE AVOIDANCE TASKS

Basically, obstacle avoidance is performed after correcting the robot's distance to the wall and before correcting its angle. It consists of three important tasks: checking for an obstacle using the arm, rotating the robot's orientation, and confirming the obstacle using the arm. If no obstacle is detected, the robot simply continues correcting its orientation. Figure 11 shows sequential photographs of the actual locomotion during obstacle avoidance experiments using Bonten-Maru II. In these experiments, the robot performed a smooth and controlled trajectory to recognize and avoid the obstacle. Referring to this figure, while checking for an obstacle, if the arm's end-effector touches an object, the force sensor detects the force applied to it and sends the force data to the robot's control system. By solving the direct kinematics of the joint angles, the end-effector's position is obtained. The left arm's range of motion while checking for obstacles is equal to the correction angle, 90° − ϕ, where ϕ is the grasping angle. Any object detected within this range is considered an obstacle. Once an obstacle has been detected during the checking process, the robot rotates its orientation to face the obstacle in order to confirm the obstacle's position at a wider, more favorable angle, and finally to avoid it. After the obstacle is detected and the robot's orientation has changed to face it, it is necessary to confirm whether the obstacle still exists within the locomotion area. This confirmation is performed by the robot's arm, which searches for any obstacle in front of the robot within its reach. If the obstacle is detected within the search area, the arm stops moving, and the robot walks in the side-step direction. The robot's arm repeats the confirmation process until the obstacle is no longer detected.
Once this happens, the robot walks forward in a straight trajectory and completes the obstacle avoidance process.

9.0 OBJECT MANIPULATION

Object manipulation is one of the important factors in the proposed humanoid robot navigation system. Recent research has reported that a tactile sensor system is essential as a sensory device to support the robot control system (Omata, S., Murayama, Y., and Constantinou, C.E., 2004; Kerpa, O., Weiss, K., and Worn, H., 2003). In this research, with the aim of establishing object manipulation ability in the proposed navigation system, we have developed a novel optical three-axis tactile sensor capable of acquiring normal and shearing forces, mounted on the fingertips of a 3-dof robotic finger (Ohka, M., Kobayashi, H., and Mitsuya, Y., 2006). Figure 12 displays the integrated control system structure of the finger and the optical three-axis tactile sensor used in this research. The tactile sensor uses an optical waveguide

transduction method, applying image processing techniques. The sensor is designed as a hemispherical dome consisting of an array of sensing elements. The system was designed to replace the six-axis force sensor on the humanoid robot's arm for navigation and object manipulation tasks, because of its capability of acquiring normal and shearing forces, which are especially required in object manipulation. The structure of the optical three-axis tactile sensor consists of an acrylic hemispherical dome, an array of silicone rubber sensing elements, a light source, an optical fiberscope, and a CCD camera. Figure 13 shows a diagram of the tactile sensor's structure. Each silicone rubber sensing element comprises one columnar feeler and eight conical feelers. The eight conical feelers remain in contact with the acrylic surface while the tip of the columnar feeler touches an object.

Figure 12 Control system structure of the robotic finger and optical three-axis tactile sensor.

Figure 13 Structure of the hemispherical optical three-axis tactile sensor (left); structure of a sensing element (right).

Figure 14 Arrangement of sensing elements on the fingertip.

Journal Advanced Manufacturing Technology

Figure 15 Principle of optical three-axis tactile sensor system.
Figure 16 CCD camera-captured image of contact phenomenon of the optical three-axis tactile sensor on fingertip.

The sensing elements are arranged on the hemispherical acrylic dome in a concentric configuration with 41 sub-regions, as shown in Figure 14. Figure 15 shows the sensing principle of the optical three-axis tactile sensor system. Light emitted from the light source is directed to the edge of the hemispherical acrylic dome through optical fibers. When an object contacts the columnar feelers, the resulting contact pressure makes the feelers collapse. At the points where the conical feelers collapse, light is diffusely reflected out of the reverse surface of the acrylic dome because the rubber has a higher refractive index than the acrylic. The contact phenomena, observed as bright spots caused by the feelers' collapse, are captured as image data, as shown in Figure 16. The normal force Fz is calculated from the integrated gray-scale value G, while the shearing forces Fx and Fy are based on the horizontal displacement of the centroid of the gray-scale distribution. This displacement u is defined in (17), where i and j are the orthogonal base vectors of the x- and y-axes of a Cartesian coordinate system, respectively, and each force component is defined in (18):

u = u_x i + u_y j    (17)

Fx = f(u_x),  Fy = f(u_y),  Fz = g(G)    (18)

where f and g are calibration functions obtained for the sensing elements.

In real-time applications, robots may have to comply with objects of various shapes and properties. It is therefore necessary for them to recognize the object and generate a suitable trajectory to grip it. In this experiment, we used a tennis ball to evaluate the performance of the proposed tactile sensor and finger system at recognizing a spherical object. In addition, we conducted an experiment using a real egg to evaluate the feedback control of normal and shearing forces with the finger control system. The minimum force parameter was specified as 0.3~0.5 N (threshold 1), the maximum as 1.4~1.5 N (threshold 2), and 2 N was fixed as the critical limit. To simplify the evaluation, sensing data were taken only from sensing element number 0 (see Figure 14). In the experiment with the tennis ball, the initial touch point was shifted slightly from the peak of the ball so that the movement characteristic of the robot finger against the spherical surface could be evaluated. Referring to Figure 17, when the touch force increased, a rotation moment acted on the tennis ball, causing the ball to start to rotate. The shearing force increased simultaneously, and when the applied normal force exceeded threshold 2, the ball started to slip. The finger then corrected its position, moving nearer to the object's surface, and adjusted the gripping force. This process was repeated, finally changing the finger orientation to comply with the spherical surface and stopping when the finger reached a stable gripping position. Next, an experiment manipulating a real egg was conducted, as shown in Figure 18, using the same concept as above.
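The force reconstruction described by (17) and (18) can be sketched in code: integrate the gray-scale values of the bright-spot image to obtain G, locate the centroid of the distribution, and convert its displacement from the no-load position into shearing forces. The calibration gains `CAL_SHEAR` and `CAL_NORMAL` below are hypothetical linear stand-ins for the sensor's calibration functions f and g, which the paper does not specify in closed form.

```python
import numpy as np

# Hypothetical linear calibration gains (the real f and g come from
# a calibration experiment on the sensing elements).
CAL_SHEAR = 0.05   # N per pixel of centroid displacement (assumed)
CAL_NORMAL = 1e-4  # N per unit of integrated gray-scale value (assumed)

def element_forces(img, rest_centroid):
    """Return (Fx, Fy, Fz) for one sensing element.

    img           : 2-D array, bright spots from the collapsed feelers
    rest_centroid : (x, y) centroid of the bright spots with no load
    """
    g = np.asarray(img, dtype=float)
    G = g.sum()                      # integrated gray-scale value
    if G == 0.0:
        return 0.0, 0.0, 0.0         # no contact detected
    ys, xs = np.indices(g.shape)
    cx = (xs * g).sum() / G          # horizontal centroid, x
    cy = (ys * g).sum() / G          # horizontal centroid, y
    ux = cx - rest_centroid[0]       # displacement u = ux*i + uy*j   (17)
    uy = cy - rest_centroid[1]
    Fx = CAL_SHEAR * ux              # Fx = f(ux)                     (18)
    Fy = CAL_SHEAR * uy              # Fy = f(uy)
    Fz = CAL_NORMAL * G              # Fz = g(G)
    return Fx, Fy, Fz
```

A brighter image (larger G) yields a larger normal force, while a sideways shift of the bright spots yields shearing force, matching the qualitative behavior described in the text.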
Feedback control of the finger system and the optical three-axis tactile sensor produced optimum gripping force and rolling motion on the egg without breaking it or damaging the sensors.
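The threshold logic used in the tennis-ball and egg experiments can be sketched as a per-cycle decision on the measured normal force. The function and command names below are hypothetical; the threshold values follow the 0.3~0.5 N, 1.4~1.5 N, and 2 N limits stated in the text.

```python
# Minimal sketch of threshold-based grip control, assuming the finger
# controller consumes one of four symbolic commands each control cycle.
THRESHOLD_1 = 0.3  # N, lower bound of the 0.3~0.5 N minimum-force band
THRESHOLD_2 = 1.5  # N, upper bound of the 1.4~1.5 N maximum-force band
CRITICAL = 2.0     # N, absolute limit fixed in the experiment

def grip_step(normal_force):
    """Return a fingertip command for one control cycle."""
    if normal_force >= CRITICAL:
        return "release"    # protect the object and the sensor
    if normal_force > THRESHOLD_2:
        return "back_off"   # object may slip or rotate: re-position
                            # the fingertip nearer the surface
    if normal_force < THRESHOLD_1:
        return "press"      # grip not yet secure: increase force
    return "hold"           # stable grip between the thresholds
```

Repeating this cycle while the contact point shifts is what gradually reorients the finger to the spherical surface until a stable grip is reached.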

Figure 17 Object manipulation of the robot finger mounted with optical three-axis tactile sensor on tennis ball.
Figure 18 Gripping and rolling movement of the robot finger mounted with optical three-axis tactile sensor on real egg.

10.0 CONCLUSIONS

In this research, we proposed the tasks of self-localization, obstacle avoidance, and object manipulation in a contact sensing-based navigation system for a humanoid robot. The proposed motion algorithm consists of four important processes: searching and detecting, grasping the object surface, correcting the locomotion direction, and avoiding the obstacle. The proposed tasks were evaluated in experiments utilizing the humanoid robot Bonten-Maru II. The experimental results indicated that the humanoid robot is able to perform self-localization by grasping a wall surface, then correct its orientation and locomotion direction, and avoid the obstacle. Meanwhile, for object manipulation, the experiment utilizing the robotic finger mounted with the optical three-axis tactile sensor revealed good performance in recognizing objects and controlling gripping force using the normal and shearing forces acquired by the tactile sensor. This shows good potential for future application of the tactile sensor and robotic finger system to the humanoid robot's arm to perform effective object manipulation in the proposed navigation system. The proposed idea of contact interaction-based humanoid robot navigation should contribute to a better understanding of human-robot interaction. Furthermore, future refinement of the tactile sensor system and navigation strategy is expected to provide better performance, enabling any anthropomorphic robot fitted with the system to operate effectively in real environments. It is anticipated that this navigation system will help advance the evolution of humans and humanoid robots working together in real life.

ACKNOWLEDGEMENT

Part of this study was supported by a fiscal 2006 grant from the Japan Ministry of Education, Culture, Sports, Science and Technology (Grant-in-Aid for Scientific Research in Exploratory Research, No. 18656079).

REFERENCES

Salter, T., Dautenhahn, K., and de Boekhorst, R., 2006, Learning about natural human-robot interaction styles, Journal of Robotics and Autonomous Systems, vol. 54, issue 2, pp. 127-134.

Nehmzow, U., and Walker, K., 2005, Quantitative description of robot-environment interaction using chaos theory, Journal of Robotics and Autonomous Systems, vol. 53, pp. 177-193.

Hirai, K., Hirose, M., Haikawa, Y., and Takenaka, T., 1998, The Development of Honda Humanoid Robot, in Proc. of IEEE Int. Conference on Robotics and Automation, pp. 1321-1326.

Vukobratovic, M., Borovac, B., and Babkovic, K., 2005, Contribution to the Study of Anthropomorphism of Humanoid Robots, International Journal of Humanoid Robotics, vol. 2, no. 3, pp. 361-387.
Clerentin, A., Delahoche, L., Brassart, E., and Drocourt, C., 2005, Self localization: a new uncertainty propagation architecture, Journal of Robotics and Autonomous Systems, vol. 51, pp. 151-166.

Lim, M., Oh, S., Son, J., You, B., and Kim, K., 2000, A human-like real-time grasp synthesis method for humanoid robot hands, Journal of Robotics and Autonomous Systems, vol. 30, pp. 261-271.

Cheng, G., Nagakubo, A., and Kuniyoshi, Y., 2001, Continuous Humanoid Interaction: An Integrated Perspective - Gaining Adaptivity, Redundancy, Flexibility - in One, Journal of Robotics and Autonomous Systems, vol. 37, issues 2-3, pp. 161-183.

Ogata, T., Matsuyama, Y., Komiya, T., Ida, M., Noda, K., and Sugano, S., 2000, Development of Emotional Communication Robot: WAMOEBA-2R - Experimental Evaluation of the Emotional Communication between Robots and Humans, in Proc. of IEEE/RSJ Int. Conference on Intelligent Robots and Systems (IROS 2000), vol. 1, pp. 175-180.

Kanda, T., Ishiguro, H., Ono, Y., Imai, M., and Nakatsu, R., 2002, Development and Evaluation of an Interactive Humanoid Robot Robovie, in Proc. of IEEE Int. Conference on Robotics and Automation (ICRA'02), vol. 2, pp. 1848-1855.

Konno, A., 1999, Development of an Anthropomorphic Multi-Finger Hand and Experiment on Grasping Unknown Object by Groping, Transactions of the JSME, vol. 65, no. 638, pp. 4070-4075.

Coelho, J., Piater, J., and Grupen, R., 2005, Developing Haptic and Visual Perceptual Categories for Reaching and Grasping with a Humanoid Robot, Journal of Robotics and Autonomous Systems, vol. 37, pp. 195-218.

Kim, J., Park, J., Hwang, Y.K., and Lee, M., 2004, Advance grasp planning for handover operation between human and robot: three handover methods in esteem etiquettes using dual arms and hands of home-service robot, in Proc. of 2nd Int. Conference on Autonomous Robots and Agents (ICARA'04), pp. 34-39.

Hanafiah, Y., Yamano, M., Nasu, Y., and Ohka, M., 2005, Trajectory generation in groping locomotion of a 21-DOF humanoid robot, in CD-R Proc. of 9th Int. Conference on Mechatronics Technology (ICMT'05).

Omata, S., Murayama, Y., and Constantinou, C.E., 2004, Real time robotic tactile sensor system for determination of the physical properties of biomaterials, Journal of Sensors and Actuators A, vol. 112, pp. 278-285.

Kerpa, O., Weiss, K., and Worn, H., 2003, Development of a flexible tactile sensor system for a humanoid robot, in CD-R Proc. of IEEE/RSJ Int. Conference on Intelligent Robots and Systems (IROS 2003).

Ohka, M., Kobayashi, H., and Mitsuya, Y., 2006, Sensing precision of an optical three-axis tactile sensor for a robotic finger, in Proc. of 15th IEEE Int. Symposium on Robot and Human Interactive Communication (RO-MAN 2006), pp. 220-225.