HAND-SHAPED INTERFACE FOR INTUITIVE HUMAN-ROBOT COMMUNICATION THROUGH HAPTIC MEDIA


RIKU HIKIJI AND SHUJI HASHIMOTO
Department of Applied Physics, School of Science and Engineering, Waseda University
3-4-1 Okubo, Shinjuku-ku, Tokyo 169-8555 JAPAN
E-mail: {riku, shuji}@shalab.phys.waseda.ac.jp

Aiming at direct and intuitive communication between human and robot, we propose an interface system for an autonomous mobile robot that communicates physically with its user through haptic, or hand-to-hand, force interaction. The hand-shaped device mounted on the robot has 1 DOF at the finger part and grasps the user's hand when actuated. A micro-switch on the bottom side of the hand senses that a human is grasping it. Intentional force information is acquired by strain gages on the flexible rubber arm that physically supports the hand. The robot's motion is determined by the force input and/or the environmental conditions. The robot recognizes obstacles with bumper sensors and ultrasonic sensors around its body. When it touches an obstacle, the robot changes its route regardless of the user's intentional force and thereby informs the user that it is avoiding an obstacle. We design simple algorithms for both human-following and human-leading motions and conduct experiments with human users. Qualitative and quantitative evaluations of the experimental results are also presented.

Key Words: human-machine interface, human-cooperative robot, haptic interface, Kansei, force interaction

1 Introduction

1.1 Background

Arguably the most popular medium of information and communication in the past few decades has been television. In recent years, the computer has emerged as a more interactive multimedia tool and is spreading rapidly. Like these two, most of the multimedia devices used in human communication today operate in the form of sound or images.
To make multi-modal communication more informative, the use of force information appears effective, and many researchers have applied haptic, or hand-to-hand, force interaction to newly designed media devices. In our laboratory, several systems have been built to realize haptic interaction between distant human users: the handshake telephone system transmits grasping force [1], and the haptic and tactile telecommunication system transmits both force and tactile information [2], in order to examine the capability of haptics as another medium for digital human communication.

However, to apply force information to multimedia communication, the media device must have a physical structure that can be actuated: unlike other modes of perception, sensing and exerting force require physical action. It is therefore likely that active apparatus such as robots will become a new medium of human communication in the near future. Indeed, recent robot interface development has clearly shifted toward multi-modal use, which opens various possibilities for robotic systems as a new type of active multimedia terminal.

1.2 Human-Cooperative Robots

A prevailing trend in robotics research today is the application of human-friendly robots to daily tasks in human environments. Robots are increasingly expected to take on work in human environments such as nursing, assistance, and entertainment, and their stance is shifting from "in place of humans" to "with humans". Since fully self-controlled autonomous mobile robots that work without any human support still seem some time away, considerable attention has been drawn to human-cooperative robots. These robots operate with support from humans and are therefore believed to be more practical than self-controlled robots. Several papers report human-robot cooperation achieved efficiently by applying force information to robotic systems [3-6,10]. Kosuge has developed an armed robot that carries an object in cooperation with a human, as well as multiple distributed mobile robots that carry a large or heavy object together [3]. A method of compliant motion control based on the Virtual Internal Model has also been reported [4]. A cooperative carrying task by a human and a manipulator robot with variable impedance control [5] and a human-following experiment with a biped humanoid robot [6] have been achieved as well.
All of the above utilize force information to achieve cooperative tasks or communication, but most of them are hardly designed to serve as human interfaces. An efficient interface for human-robot cooperation must afford interaction, that is, appeal to the Kansei of human users; Kansei is the human ability of non-logical perception [7]. In addition, a study of an interface system utilizing force [8] suggests that handling, safety, and impression are also important factors. Cooperation between humans and robots may require massive exchanges of information: robots must be able to interact deeply with human users, because human characteristics are unknown and the operational environment changes dynamically [9,10]. Multi-modal interface systems can be very helpful for such human-robot communication. Visual, auditory, and linguistic communication systems are often too heavy to implement on autonomous mobile robots, while existing interfaces such as keyboards, touch panels, and remote controllers lack intuitiveness; a force interaction system, by contrast, can provide direct and intuitive physical interaction with a considerably simple implementation.

1.3 Haptic Interaction and Its Application to a Robot Interface

One of the most intuitive communication methods between humans is haptic interaction. Hand-to-hand force interaction provides a direct exchange of physical force information as well as a favorable mental effect of togetherness, amiability, and security. It therefore seems efficient to employ haptic interaction in the interface of a human-cooperative robot. Thus we propose a hand-shaped force interface for an autonomous mobile robot that engages in haptic interaction with a human. In the cooperation we consider, the robot communicates with its user only through haptic force interaction via the hand-shaped interface. The hand-shaped device is actuated at the finger part with 1 DOF to achieve a gentle grasp. Force information is acquired by strain gages attached to the flexible rubber arm that physically supports the hand. The robot's motion is determined by the force input and/or the environmental condition. Fundamental obstacle recognition uses bumper sensors and ultrasonic sensors around the body; the robot informs the user of obstacles he or she is not aware of by changing its route regardless of the direction of the intentional force. We design simple algorithms for both human-following and human-leading tasks and conduct experiments with human users for each task. Qualitative and quantitative evaluations examine the system's efficiency, and we also discuss possible future applications of the system as a multimedia interface.

2 System

This section explains the structure and function of the interface system as applied to the human-cooperative mobile robot. First, we introduce the base mobile robot; then we describe the whole system and the function of the proposed force interface.

2.1 Robot Body

The robot used in this research is a two-wheeled mobile robot that can move forward/backward and rotate clockwise/counter-clockwise (Figure 1). The obstacle sensors on the robot body are bumper sensors and ultrasonic sensors. The bumper sensors are mounted at the front and the tail of the base body and can sense obstacle contact in six different directions (Figure 2). The ultrasonic sensors detect distant obstacles in the robot's forward direction.
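The six-direction contact sensing and the forward ultrasonic range can be combined into a single obstacle check. A minimal sketch of such a reading (the class, names, and the 30 cm proximity threshold are illustrative; the paper does not specify them):

```python
from dataclasses import dataclass
from enum import Enum

class Bumper(Enum):
    """The six contact directions of the bumper sensors (Figure 2)."""
    FRONT = "front"
    LEFT_FRONT = "left-front"
    RIGHT_FRONT = "right-front"
    LEFT = "left"
    RIGHT = "right"
    TAIL = "tail"

@dataclass
class ObstacleReading:
    contacts: frozenset        # Bumper members currently in contact
    forward_range_cm: float    # range reported by the forward ultrasonic sensors

    def obstacle_detected(self, near_cm: float = 30.0) -> bool:
        # Contact anywhere, or something close ahead, counts as an obstacle.
        return bool(self.contacts) or self.forward_range_cm < near_cm
```
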

Figure 1: Outlook of the base robot. Figure 2: Arrangement of the bumper sensors and the directions of obstacle sensing (top view).

2.2 Interface Structure

The appearance of the whole robot is shown in Figure 3. The haptic (force) interface system consists of the hand-shaped device supported by the flexible arm. The hand part is a plastic skeleton covered with a rubber glove and is capable of a gentle grasp with 1 DOF at the fingers. When the hand part is grasped, it is actuated to grasp the human hand back. The arm part is made of two rubber sticks, one fixed vertically on top of the robot body and the other fixed horizontally on top of the vertical one.

Figure 3: Outlook of the robot with the hand-shaped force interface.

2.3 The Arm Part and Force Sensing

The rubber arm that physically supports the hand part bends easily when an intentional force is exerted on the hand. The flexibility of the arm thus gives the system structural compliance: the human user can handle the interface almost freely regardless of the robot's motion and orientation. We adopt the four-active-gage method for measuring the intentional force exerted on the hand part. Each of the two strain-gage bridges (one on the vertical part of the arm, the other on the horizontal part) outputs independent force/torque information corresponding to the bend in one particular direction, that is, either forward/backward or clockwise/counter-clockwise (Figure 4). Both the separation and the linearity of the force sensor outputs have been confirmed experimentally (Figure 5).

Figure 4: Decomposition of the intentional force exerted on the arm part (top view). Figure 5: Bend sensor output when force is exerted forward, backward, to the left, or to the right.
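Because each bridge responds linearly to bend along a single axis, converting the two bridge outputs into a force/torque estimate reduces to a per-axis gain and offset. A minimal sketch of this calibration (the function name, the gain and offset values, and the assignment of bridges to axes are assumptions, not values from the paper):

```python
def decompose_intent(v_fb, v_rot, k_fb=1.0, k_rot=1.0, o_fb=0.0, o_rot=0.0):
    """Map the two strain-gage bridge outputs to intentional force/torque.

    v_fb  : bridge sensing forward(+)/backward(-) bend
    v_rot : bridge sensing clockwise(+)/counter-clockwise(-) bend
    The linear form relies on the separation and linearity of the two
    outputs confirmed in Figure 5.
    """
    force = k_fb * (v_fb - o_fb)       # translational component
    torque = k_rot * (v_rot - o_rot)   # rotational component
    return force, torque
```
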

2.4 The Hand Part and Grasping

On the bottom side of the hand part is a micro-switch that serves as a human-grasp sensor (Figure 6). When the hand part is grasped, the micro-switch turns on; as the robot recognizes that its hand is being grasped, it actuates the finger part to respond by gently grasping the human hand back. The grasping is realized with an electro-thermal actuator (BMF250, Toki Corporation [11]) made of thread-like Shape Memory Alloy (SMA). The wire contracts like a muscle when electric current flows through it and elongates when cooled. The 1-DOF fingers are attached directly to the actuator, which realizes the grasp action (Figure 7).

Figure 6: The micro-switch sensor on the bottom side of the hand part. Figure 7: The structure of the SMA actuator (the wire is fixed at both ends).
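The grasp control described here (and stated to be open-loop in Section 3.1) reduces to a single rule: drive the SMA wire while the micro-switch is pressed, and let it cool otherwise. A sketch with a hypothetical drive-current value (the paper determines the grasping force experimentally but does not publish a number):

```python
def grasp_command(switch_pressed: bool, hold_current_a: float = 0.3) -> float:
    """Return the SMA drive current [A] for the current micro-switch state.

    Open-loop: current on -> the SMA wire contracts and the 1-DOF fingers
    close around the user's hand; current off -> the wire cools and
    elongates, releasing the grasp. The 0.3 A default is a placeholder.
    """
    return hold_current_a if switch_pressed else 0.0
```
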

3 Control

This section describes how the whole robotic system is controlled through the proposed interface.

3.1 Control Structure

The intentional force exerted on the interface gives the set point of the robot's motion control. Figure 8 shows the overall structure of the motion control system; the algorithm is described in the following section. The control of the grasp mechanism is open-loop, and the grasping force is determined experimentally.

Figure 8: Diagram of the control structure (the bend and grasp sensors of the interface feed a micro-controller through an A/D converter; the algorithm sends serial signals to the robot body's micro-controller, which drives the mobilization system and reads the bumper and ultrasonic sensors).

3.2 Algorithm

We have developed two algorithms, one for the human-following task and one for the human-leading task. With the human-following algorithm, the robot moves so as to cancel out the intentional force exerted on the hand-shaped interface (Figure 9). With the human-leading algorithm, the route of the leading task is given in advance, and the robot executes the task unless an excessive counter-directional force is exerted (Figure 10). When the human follower pulls the robot's hand in the direction opposite to the leading motion, the robot suspends the leading task until the intentional force ceases, so the follower can catch up with the motion. In both algorithms, when the robot touches an obstacle, it executes an obstacle avoidance motion (Figure 11) regardless of the intentional force input by the human partner. Since the robot and the partner hold each other's hands, force information is communicated directly, and the robot can convey the obstacle information to the partner. The robot and the human can thus avoid obstacles cooperatively even when the human is unaware of them.

Figure 9: Algorithm flow chart of the human-following task (obstacle check comes first: on contact, execute the avoidance motion; otherwise rotate right/left if the torque exceeds its threshold, move forward/backward if the force exceeds its threshold, and stay still below both thresholds).
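One cycle of the human-following flow chart of Figure 9, combined with the avoidance motions of Figure 11, can be sketched as follows. The threshold values, the command strings, and the exact pairing of contact directions to rotation directions (here: turn away from the side of contact) are assumptions, not values given in the paper:

```python
# Assumed avoidance mapping: Figure 11 lists six contact directions and four
# motions; the pairing chosen here turns the robot away from the contact side.
AVOIDANCE = {
    "front": "backward_1s",
    "left-front": "rotate_cw_30deg",
    "left": "rotate_cw_30deg",
    "right-front": "rotate_ccw_30deg",
    "right": "rotate_ccw_30deg",
    "tail": "forward_1s",
}

def following_step(contacts, force, torque, f_th=0.5, t_th=0.5):
    """One control cycle of the human-following algorithm (Figure 9).

    Obstacle contact pre-empts the user's force input; otherwise torque
    above threshold commands rotation, force above threshold commands
    translation, and the robot stays still below both thresholds.
    """
    for direction in contacts:            # obstacle check comes first
        return AVOIDANCE[direction]
    if abs(torque) > t_th:                # rotate right (cw) or left (ccw)
        return "rotate_cw" if torque > 0 else "rotate_ccw"
    if abs(force) > f_th:                 # move forward or backward
        return "forward" if force > 0 else "backward"
    return "stay_still"
```
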

Figure 10: Algorithm flow chart of the human-leading task (on obstacle contact, execute the avoidance motion; stop the leading task while the counter-directional force/torque exceeds its threshold; otherwise execute the leading task).

Figure 11: Obstacle avoidance motions in the human-following task. Depending on the direction of contact (front, left-front, right-front, left, right, or tail), the robot moves backward for about 1 [sec], rotates counter-clockwise or clockwise by 30 [deg], or moves forward for about 1 [sec].

4 Experiment

To examine the efficiency of the proposed interface, three experiments are devised. First, human-following and human-leading experiments are executed; then the efficiency of the proposed interface system is evaluated in comparison with other existing interfaces.

4.1 Human-Following Experiment

In this experiment, the human user leads the robot from the start point to the goal point in a two-dimensional static environment. Motion capture is used to acquire the trajectories of the human and the robot (Figure 12) and the fluctuation of the distance between them during the task (Figure 13). These results support the achievement of the elementary human-following task.

Figure 12: Trajectories of the human and the robot in the human-following experiment. Figure 13: Fluctuation of the distance between the human and the robot in the human-following experiment.

4.2 Human-Leading Experiment

In this experiment, human volunteers are asked to follow the robot's lead while wearing an eye mask. The robot is programmed to execute the human-leading task in the experimental environment shown in Figure 14. The average goal time of the human-leading tasks of 10 volunteers is comparable to the goal time of the robot moving by itself without a human follower (Figure 15), suggesting that an effective human-leading task is achieved. The results of the questionnaire given after the experiment support our proposition as well (Figure 16).

Figure 14: Environmental map of the human-leading experiment (start point, obstacle, and goal point).

Figure 15: Goal time of the tasks with and without a human follower.
  Average goal time of 10 volunteers led by the robot: 29 [sec]
  Goal time of the robot moving by itself without a human follower: 23 [sec]

Figure 16: Questionnaire answers of the human-leading experiment.
  1) Was able to feel the intentional force from the leading robot: Yes 10 / No 0
  2) Felt securely led to the goal: Yes 8 / No 2

4.3 Comparative Experiment of Interfaces

To evaluate the efficiency of the interface system, a comparative experiment is also held with the help of the same 10 volunteers. Two existing interface devices, a digital joystick and a remote controller, are compared with the hand-shaped force interface (Figure 17). In the same experimental environment as in Figure 14, the volunteers are this time asked to lead the robot from the start to the goal. Each interface device is handed to the user without any instruction, and the leading task begins when the user feels confident that he or she has learned to handle the robot with that interface. The average goal times of all users suggest that the hand-shaped force interface is useful for executing such a task (Figure 18). Afterwards, a qualitative questionnaire is held: in each category, users rank the three interfaces, with scores given as integers from 3 (best) to 1 (worst) and no score repeated. The results support that the newly designed interface excels in all factors of a human interface, especially in affordance (Kansei appeal) and impression; its handling appears as efficient as that of the other two (Figure 19).

Figure 17: Outlook of the robot equipped with each interface used in the comparative experiment: (a) hand-shaped force interface, (b) joystick, (c) remote controller.

Figure 18: Average goal time [sec] with each interface in the comparative experiment.
  (a) Hand-shaped force interface: 39
  (b) Joystick: 30
  (c) Remote controller: 45

Figure 19: Average questionnaire scores in the comparative experiment ((a) hand-shaped force interface, (b) joystick, (c) remote controller).

  Category                                             (a)   (b)   (c)
  (1) Was able to handle intuitively (Kansei appeal)   2.6   1.9   1.5
  (2) Had good handling of the whole robot (Handling)  2.1   2.0   1.9
  (3) Felt affinity or amiability (Impression)         2.9   2.0   1.1

5 Conclusion

In this paper, a hand-shaped force interface for a human-cooperative mobile robot is proposed. By utilizing hand-to-hand force interaction, rich communication of intentional force between a human and a robot is achieved. In the human-following task, the robot not only follows the human user in the direction of the exerted intentional force but also recognizes obstacles and communicates that information to the user. In the human-leading task, the robot moves as pre-programmed and stops when the human follower exerts an intentional force opposite to its motion. To evaluate the proposed robotic system, we experimented on both tasks in real human-robot cooperation, and the efficiency of the interface was also verified in comparison with other interfaces. The experimental results suggest that the proposed system fulfills the important requirements of a human interface, and the developed system can serve as another alternative for a multi-modal robot interface. We now plan to apply velocity/acceleration control to the robot to achieve smoother motion, and we are considering supplementing the system with sound information for more informative communication between human and robot. Possible future applications of the interface include a cooperative carrying task between a human and a robot, and integration with a two-wheeled humanoid robot to achieve interactive performance with a human, such as dancing and cooperative carrying.

References

1. K. Ouchi and S. Hashimoto, "Handshake Telephone System to Communicate with Voice and Force," Proc. of IEEE International Workshop on Robot and Human Communication, pp. 466-471, 1997.
2. Y. Fujita and S. Hashimoto, "Experiments of Haptic and Tactile Display for Human Telecommunication," Proc. of the 8th IEEE International Workshop on Robot and Human Interaction (RO-MAN '99), pp. 334-337, 1999.
3. K. Kosuge, "MR Helper and DR Helper," Proc. of the First Europe-Japan Symposium on Human-Friendly Robotics, pp. 379-380, 1999.
4. K. Kosuge, S. Hashimoto and H. Yoshida, "Human-Robots Collaboration System for Flexible Object Handling," Proc. of the 1998 IEEE International Conference on Robotics and Automation, pp. 1841-1846, 1998.
5. R. Ikeura and H. Inooka, "Variable Impedance Control of a Robot for Cooperation with a Human," Proc. of the 1995 IEEE International Conference on Robotics and Automation, pp. 3097-3102, 1995.
6. J. Yamaguchi, S. Gen, S. A. Setiawan and A. Takanishi, "Interaction between Human and Humanoid through the Hand Contact," Proc. of the 16th Conference of the Robotics Society of Japan, pp. 951-952, 1998 (in Japanese).
7. S. Hashimoto, "KANSEI as the Third Target of Information Processing and Related Topics in Japan," Proc. of the AIMI International Workshop "KANSEI - The Technology of Emotion," pp. 101-104, 1997.
8. J. Yokono and S. Hashimoto, "Center of Gravity Sensing for Motion Interface," Proc. of the IEEE International Conference on Systems, Man and Cybernetics, pp. 1113-1118, 1998.
9. H. Kazerooni, "Human-Robot Interaction via the Transfer of Power and Information Signals," IEEE Transactions on Systems, Man, and Cybernetics, pp. 450-463, 1990.
10. K. Kosuge, H. Yoshida and T. Fukuda, "Dynamic Control for Robot-Human Collaboration," Proc. of the IEEE International Workshop on Robot and Human Communication, pp. 398-401, 1993.
11. Toki Corporation website: http://www.toki.co.jp/biometal/_index.html