Experiments of Vision Guided Walking of Humanoid Robot, KHR-2

Proceedings of 2005 5th IEEE-RAS International Conference on Humanoid Robots

Jung-Yup Kim, Ill-Woo Park, Jungho Lee and Jun-Ho Oh
HUBO Laboratory, Humanoid Robot Research Center, Mechanical Engineering,
Korea Advanced Institute of Science and Technology,
373-1 Guseong-dong, Yuseong-gu, Daejeon 305-701, South Korea
Kirk1@mclab3.kaist.ac.kr, Junhoh@ohzlab.kaist.ac.kr

Abstract - This paper introduces the integration of a vision system into a humanoid robot and vision guided walking as its application. Two CCD cameras are installed in a head with 6 DOF in total; the eyes and the neck each have a pan and tilt mechanism so that the view direction can be moved freely. All joints are driven by DC servo motors, and we developed a motor controller that drives all joint axes of the head. Each CCD camera continuously transmits NTSC-formatted images to a frame grabber installed on the main computer, and the frame grabber captures image frames at 10-15 Hz. As a basic study, we constructed a visual processing algorithm that makes the robot gaze at a red light marker at all times. In addition, we established a strategy for combining non-real-time visual information with the real-time walking pattern. Finally, the vision guided walking algorithm, which enables the robot to follow the red light marker on foot, was tested experimentally using the humanoid robot KHR-2.

Index Terms - Humanoid robot, KHR-2, Vision, Walking guidance

I. INTRODUCTION

Recently, biped humanoid robots have been studied intensively by many researchers in various countries [1, 2]. Since these robots are representative human-friendly intelligent robots, people expect them to perform many functions such as dynamic walking, speaking, visual and image recognition, human-robot interaction, artificial intelligence, etc. Early research on biped humanoid robots focused only on the realization of stable dynamic walking, because biped walking is their most fundamental function. However, humanoid robot research is gradually diverging into various categories. In particular, visual recognition and walking guidance based on image processing are prominent research fields for humanoid applications [3-7]. Machine vision itself has long been investigated in many research areas; the important point is that image processing techniques can greatly enhance the intelligence of a biped humanoid robot.

For the past three years we have primarily studied the realization of stable dynamic walking. Consequently, our humanoid robot KHR-2 (Fig. 1) can walk stably on uneven terrain at a speed of 1.25 km/h [8]. It can walk forward, backward and sideways, turn around, and walk along a curve. These movements were commanded by the user through a wireless LAN. Although the robot had no physical tether, it was not yet autonomous: KHR-2 simply moved according to user commands. Accordingly, we have made an effort to give it the intelligence for autonomous walking, and vision guided walking is one of the essential functions for realizing this.

In this paper, we describe the vision system, the mechanical head design, the image processing and a vision guided walking experiment. As the visual sensor we use color CCD cameras. To move the camera angle freely, we applied a pan and tilt mechanism to both the eyes and the neck. We also developed a motor controller that controls all joints of the head, and realized a target tracking algorithm for the robot head. Finally, we show an autonomous walking experiment based on visual target tracking. Through this experimental research, we confirm the strong potential of vision applications for humanoid robots and enhance the intelligence of KHR-2.

Fig. 1 The humanoid robot, KHR-2

II. VISION SYSTEM OF KHR-2

A. Stereo CCD Camera

KHR-2 has two color CCD cameras. To secure sufficient space for movement inside the head, we chose compact cameras. They have a wide visible range and an auto white balancing function. The specification of the camera is described in Table I.

TABLE I
SPECIFICATION OF CAMERA

Model               GIC-100PC
Weight              100 g
Imaging sensor      1/4 in color CCD x 2
Image size          640(H) x 480(V)
Focal length        3.6 mm
Output signal       NTSC
Dimensions          22 mm (diameter) x 67.5 mm (depth)
Power consumption   12 V @ 0.25 A

B. Frame Grabber and Main Computer

KHR-2 digitizes the images using a frame grabber installed on the main computer. The two CCD cameras are connected to the frame grabber, which transmits the digitized images to the computer continuously. The main computer is a small (3.5 inch form factor) commercial single board computer whose OS (operating system) is Windows XP. Tables II and III show the specifications of the frame grabber and the main computer, respectively.

TABLE II
SPECIFICATION OF FRAME GRABBER

Model                   Matrox Meteor-II Standard
Bus type                PC/104-Plus
Video source            NTSC
Number of video inputs  2
Host OS                 Windows XP
Frame rate              30 Hz
Pixel format            24-bit RGB
Frame buffer memory     4 MB

TABLE III
SPECIFICATION OF MAIN COMPUTER

CPU                EBX Ezra 800 MHz
System memory      512 MB
Chipset            VIA 8606T (Twister T)/82C686
Expansion          PC/104-Plus, PC/104 and PCI slot
I/O                2 x EIDE (Ultra DMA 100), 1 x FDD, 1 x K/B,
                   1 x RS-232/422/485, 3 x RS-232, 1 x LPT,
                   Ethernet (IEEE 802.3u 100BASE-T),
                   Audio (mic in, speaker out), 2 x USB 1.1
Power consumption  Typical 5 V @ 3.8 A, max 5 V @ 4.5 A
Size/Weight        EBX form factor, 203 x 146 mm, 0.27 kg

C. Head Design and Motor Controller

Fig. 2 shows the head of KHR-2. The total degree of freedom is six: each eye and the neck has 2 DOF (pan and tilt mechanism). The actuators are DC motors with planetary gear heads, with pulleys and belts used to modify the final reduction ratio and to improve space efficiency. These actuators are driven by a motor controller that we developed (Fig. 3). The motor controller controls the six DC motors using PWM (Pulse Width Modulation) and encoder feedback. The capacity of the motor amplifier is 48 W/ch, which is sufficient for the small DC motors. The motor controller also has a CAN (Controller Area Network) module, so it can receive commands from the main computer and transmit data back to it.

Fig. 2 Photograph of the head
Fig. 3 Motor controller of head

III. GAZE CONTROL

A. Control System Architecture

The control system of KHR-2 is a distributed control system in a Windows OS environment, chosen to reduce the computational burden on the main computer. It also makes it easy to add peripherals such as the frame grabber, a wireless LAN (Local Area Network) card, a CAN interface card and so on. Fig. 4 shows the system configuration of KHR-2. The main computer communicates with the nineteen sub-controllers via the CAN protocol. CAN supports multi-master and multi-slave operation, so many sub-controllers can easily be connected in a daisy chain, and it is very fast (1 Mbps). We used RTX (Real-Time eXtension, VenturCom Co.), a commercial software package, to realize a real-time control environment under Windows XP.

Fig. 4 Overall system configuration of the KHR-2

B. Image Processing Architecture

We developed the robot operating program using Visual C++, a commercial language product. This program grabs images continuously, but it is not real-time since the Windows scheduler is not deterministic. We also developed the real-time motion control program, which is executed by the RTX scheduler; the RTX scheduler is deterministic and gives this program the highest priority. Consequently, robot operation and vision capture run in the foreground (non-real-time), while motion control of the robot is executed in the background (real-time). In this manner, image capture and processing are performed in the robot operating program, and the results are stored in a shared memory between the robot operating program and the real-time motion control program. Finally, the real-time motion program controls the gaze according to the information in the shared memory. Fig. 5 describes the gaze control architecture.

Fig. 5 Gaze control architecture (the stereo CCD cameras and frame grabber feed the non-real-time robot operating program, which exchanges data through shared memory with the real-time motion control program commanding the head motor controller)
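
The hand-off through the shared memory can be pictured with a short sketch. RTX provides its own shared-memory API, which we do not reproduce here; the structure layout, the names and the simplified sequence counter below are illustrative assumptions that only show the pattern: the vision side writes at 10-15 Hz and the real-time side reads every control tick without ever blocking.

#include <atomic>

// Vision result shared between the two programs (assumed layout).
struct VisionResult {
    double targetX, targetY;  // target center on the image, pixels
    int    area;              // number of detected target pixels
    bool   valid;             // false while the target is lost
};

// Simplified sequence-counter slot: the writer never blocks the reader.
// A production version needs more care with memory ordering; this only
// illustrates the non-blocking hand-off.
struct SharedSlot {
    std::atomic<unsigned> seq{0};   // odd while a write is in progress
    VisionResult data{};

    void write(const VisionResult& r) {      // vision program, 10-15 Hz
        seq.fetch_add(1, std::memory_order_acq_rel);   // mark busy (odd)
        data = r;
        seq.fetch_add(1, std::memory_order_release);   // publish (even)
    }

    bool read(VisionResult& out) const {     // real-time program, every tick
        unsigned s = seq.load(std::memory_order_acquire);
        if (s & 1u) return false;            // writer active: keep last value
        out = data;
        return seq.load(std::memory_order_acquire) == s;
    }
};

Whenever read() fails, the real-time program simply keeps its previous target estimate, which matches the combination of 10-15 Hz vision data with a faster control loop.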

C. Basic Target Searching Algorithm

We developed a basic target searching algorithm to verify the performance of the gaze control architecture. A red light lamp is used as the target. When an image is grabbed, it is binarized to isolate the red light using appropriate RGB thresholds. The center position of the red light is then calculated with the mass center technique, which is very fast and insensitive to noise. To increase the frame rate, the target searching area is variable: if the target disappears from the image, the searching area becomes the full image (640 x 480 pixels), but once the target is detected, a local searching window is generated, taking into account the object size, the center position and a proper margin (Fig. 6). The (n)th searching window is used to search for the (n+1)th target, so it must enclose the (n+1)th target. If the velocity of the target is very fast, however, the (n)th searching window may fail to enclose the (n+1)th target. Therefore, we compensate the center position of the (n)th searching window by calculating the increment between the (n-1)th and (n)th target positions; the (n)th searching window is shifted by this increment scaled by a gain K_p, so that it encloses the target reliably. The target searching flow chart and an experimental example are shown in Figs. 7 and 8, respectively.

Fig. 6 Local searching window (the compensated (n)th window, shifted along the motion from the (n-1)th to the (n)th target, encloses the (n+1)th target)

Fig. 7 Target searching flow chart (after image capture, local searching runs if the target was already detected, otherwise global searching; the image is then binarized, and the center position and area of the target are calculated)

D. Gaze Control

To gaze at the target at all times, the target position in the image must be kept near the image center (320, 240). Fig. 9 shows the control block diagram, where P_ref is the reference position of the target on the image, P_target is the measured target position, and P_err = P_ref - P_target. K_I,n and K_p,n are the integral and proportional gains of the neck position control, and K_I,e and K_p,e are the integral and proportional gains of the eye position control. A low pass filter a/(s+a) is used in the neck path because the response of the neck joint is slow; that is, the mass moment of inertia of the neck is large. We do not use the low pass filter for the eye control, since the eyes have a very small mass moment of inertia.
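
To make the searching procedure of Section III-C concrete, the following is a minimal C++ sketch, not the authors' code: the RGB thresholds, the window half-size and the gain kp (the scaling K_p of Fig. 6) are assumed values for illustration.

#include <cstddef>
#include <cstdint>

struct Window { int x0, y0, x1, y1; };                  // search region, pixels
struct Target { bool found; double cx, cy; int area; };

// Binarize the 24-bit RGB image inside 'win' (strong R, weak G and B) and
// accumulate the mass center of the red pixels.
Target searchTarget(const uint8_t* rgb, int width, Window win)
{
    const uint8_t R_MIN = 200, G_MAX = 80, B_MAX = 80;  // assumed thresholds
    long sumX = 0, sumY = 0; int count = 0;
    for (int y = win.y0; y < win.y1; ++y) {
        const uint8_t* row = rgb + (size_t)y * width * 3;
        for (int x = win.x0; x < win.x1; ++x) {
            const uint8_t* p = row + (size_t)x * 3;
            if (p[0] >= R_MIN && p[1] <= G_MAX && p[2] <= B_MAX) {
                sumX += x; sumY += y; ++count;
            }
        }
    }
    if (count == 0) return {false, 0.0, 0.0, 0};        // caller falls back to global search
    return {true, (double)sumX / count, (double)sumY / count, count};
}

// Predict the next local window: recenter on the current target position plus
// the last position increment scaled by kp, so a fast target stays enclosed.
Window nextWindow(const Target& prev, const Target& cur, int halfSize, double kp)
{
    double cx = cur.cx + kp * (cur.cx - prev.cx);       // compensated center
    double cy = cur.cy + kp * (cur.cy - prev.cy);
    return { (int)cx - halfSize, (int)cy - halfSize,    // clamping to the
             (int)cx + halfSize, (int)cy + halfSize };  // 640 x 480 image omitted
}

Global searching is simply searchTarget over the full 640 x 480 window; local searching reuses nextWindow from frame to frame.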

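The control law of Section III-D (Fig. 9) translates directly into discrete time. In this sketch the gains, the filter pole a and the sample time are placeholders, since the paper does not report numeric values; only the structure (PI control on both paths, low pass filtering on the neck path only) follows the block diagram.

// PI controller K_p + K_I/s, integrated at the real-time motion period.
struct PI {
    double kp, ki;
    double integ = 0.0;
    double step(double err, double dt) {
        integ += err * dt;
        return kp * err + ki * integ;
    }
};

// First-order low pass filter a/(s+a).
struct LowPass {
    double a;
    double y = 0.0;
    double step(double u, double dt) { y += a * (u - y) * dt; return y; }
};

struct GazeCommand { double eyePan, eyeTilt, neckPan, neckTilt; };

struct GazeController {
    PI eyePan, eyeTilt, neckPan, neckTilt;
    LowPass lpfX, lpfY;   // only the neck path is filtered (large inertia)

    // errX, errY: P_err = P_ref - P_target in pixels, P_ref = (320, 240).
    GazeCommand tick(double errX, double errY, double dt) {
        GazeCommand cmd;
        cmd.eyePan   = eyePan.step(errX, dt);                  // eyes: raw error
        cmd.eyeTilt  = eyeTilt.step(errY, dt);
        cmd.neckPan  = neckPan.step(lpfX.step(errX, dt), dt);  // neck: filtered error
        cmd.neckTilt = neckTilt.step(lpfY.step(errY, dt), dt);
        return cmd;
    }
};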

Fig. 8 Example of target searching (detected target, binarized image and local searching window)

Fig. 9 Block diagram of the gaze control (P_err = P_ref - P_target drives the eye PI controllers K_I,e/s + K_p,e directly, and the neck PI controllers K_I,n/s + K_p,n through the low pass filter a/(s+a), producing the eye and neck angles of the head)

IV. VISION GUIDED WALKING EXPERIMENT

A. Experiment of Gaze Control

We tested the performance of the gaze control. Only one CCD camera is used in this experiment; the other camera gazes in the same direction as the first. Fig. 10 shows the tracking error (P_err) and Fig. 11 shows the roll/pitch angles of the neck and eyes while the red light target circles at 30 RPM (revolutions per minute) with a radius of 10 cm at 30 cm in front of the camera. Here, x is the horizontal axis and y is the vertical axis of the vision image. The maximum pixel errors on the 640 x 480 pixel image are about ±70 in the x direction and ±100 in the y direction; both values are well inside the image boundaries (±320 in x, ±240 in y). The error curves are somewhat rough because the visual image processing is not real-time. However, the motions of the head joints are smooth, so good gaze control is accomplished in real time. These graphs show that the gaze control is successful.

Fig. 10 Tracking error diagram (P_err,x and P_err,y in pixels over time)

Fig. 11 Orientation of eyes and neck (pitching and yawing angles of the eyes and neck over time)

B. Experiment of Vision Guided Walking

We scheduled the walking pattern as in Table IV. In this manner, the robot always comes to a standstill at a specific prescribed distance in front of the target. We measured the distance between the robot and the target by the number of detected pixels; hence, before the test, we calibrated the relation between the distance and the number of detected pixels. In the experiment, the turning angle, curve angle, forward step length and backward step length are constant: the forward step length is 150 mm, the backward step length is 100 mm, and the turning angle and the curve angle are each 10 deg/step. Fig. 12 shows sequential photographs of the robot following a man carrying the red light target. The time interval between the photos is 0.75 sec. Fig. 12 (a)-(d) show the robot approaching the target according to the schedule. It is seen that the vision guided walking is successfully done.

TABLE IV
SCHEDULE OF WALKING PATTERN FROM VISUAL INFORMATION

Number of detected        Yawing angle of gaze, θ_y (deg)
pixels (N_p)              θ_y > 15           -15 < θ_y < 15    θ_y < -15
N_p < 2000                Curve (CW)         Forward           Curve (CCW)
2000 < N_p < 6000         Turn around (CW)   Stand still       Turn around (CCW)
N_p > 6000                Backward           Backward          Backward
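
The schedule of Table IV reduces to a selection function executed once per step. The enum and function names below are assumptions; the thresholds (2000 and 6000 detected pixels, ±15 deg of gaze yaw) come from the table, with the pixel count standing in for distance through the calibration described above.

enum class WalkCommand {
    Forward, Backward, StandStill,
    CurveCW, CurveCCW, TurnCW, TurnCCW
};

// thetaY: yawing angle of the gaze in degrees; numPixels: number of detected
// target pixels N_p. Boundary cases are assigned to the nearest band, since
// the paper does not state whether the bounds are inclusive.
WalkCommand schedule(double thetaY, int numPixels)
{
    if (numPixels > 6000)                    // target too close: step back 100 mm
        return WalkCommand::Backward;

    if (numPixels < 2000) {                  // target far: approach
        if (thetaY > 15.0)  return WalkCommand::CurveCW;   // curve 10 deg/step
        if (thetaY < -15.0) return WalkCommand::CurveCCW;
        return WalkCommand::Forward;                       // 150 mm steps
    }

    // 2000 <= N_p <= 6000: the prescribed distance is reached
    if (thetaY > 15.0)  return WalkCommand::TurnCW;        // turn 10 deg/step
    if (thetaY < -15.0) return WalkCommand::TurnCCW;
    return WalkCommand::StandStill;
}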

(a) Vision guided forward walking

(b) Vision guided backward walking

(c) Vision guided turning around

(d) Vision guided curved walking

Fig. 12 Sequential photographs of vision guided walking

V. CONCLUSION

The vision system of the humanoid robot KHR-2 and vision guided walking were presented. Stereo color CCD cameras, a frame grabber and a small main computer were used to capture the images and execute the image processing, and the overall control system of KHR-2 was established. We designed the robot head and a motor control board to carry out the gaze control, and built the image processing architecture for combining the visual data with the motion of the robot. For the experiments, a basic target searching algorithm that distinguishes a specific colored object in the image was developed, and a control block diagram using PI controllers was designed to gaze at the target at all times. The performance of the gaze control was verified experimentally. To realize vision guided walking, the schedule of walking patterns according to the visual information was defined. Finally, we tested the vision guided walking, and it was accomplished successfully.

ACKNOWLEDGMENT

This research is mainly supported by KAIST (Korea Advanced Institute of Science and Technology) and partly supported by the HWRS (Human Welfare Robotic System), IRRC (Intelligent Robot Research Center) and BK-21 (Brain Korea 21) projects.

REFERENCES

[1] K. Hirai, M. Hirose, Y. Haikawa and T. Takenaka, "The Development of Honda Humanoid Robot," in Proc. IEEE Int. Conf. on Robotics and Automation, pp. 1321-1326, 1998.
[2] Y. Sakagami, R. Watanabe, C. Aoyama, S. Matsunaga, N. Higaki and K. Fujimura, "The intelligent ASIMO: System overview and integration," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 2478-2483, 2002.
[3] S. Hashimoto, "Vision System for Humanoid Robot - Toward Emotional Man-Machine-Environment Interaction," in Proc. 2nd Int. Conf. on Knowledge-Based Intelligent Electronic Systems, pp. 269-273, 1998.
[4] O. Lorch, A. Albert, J. Denk, M. Gerecke, R. Cupec, J. F. Seara, W. Gerth and G. Schmidt, "Experiments in Vision-Guided Biped Walking," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 2484-2490, 2002.
[5] S. Kagami, K. Nishiwaki, J. J. Kuffner Jr., Y. Kuniyoshi, M. Inaba and H. Inoue, "Online 3D Vision, Motion Planning and Biped Locomotion Control Coupling System of Humanoid Robot: H7," in Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, pp. 2557-2562, 2002.
[6] Y. Ogura, Y. Sugahara, Y. Kaneshima, N. Hieda, H. Lim and A. Takanishi, "Interactive Biped Locomotion Based on Visual/Auditory Information," in Proc. IEEE Int. Workshop on Robot and Human Interactive Communication, pp. 253-258, 2002.
[7] K. Nishiwaki, S. Kagami, J. J. Kuffner, M. Inaba and H. Inoue, "Online Humanoid Walking Control System and a Moving Goal Tracking Experiment," in Proc. IEEE Int. Conf. on Robotics and Automation, pp. 911-916, 2003.
[8] J. Y. Kim, I. W. Park, J. Lee, M. S. Kim, B. K. Cho and J. H. Oh, "System Design and Dynamic Walking of Humanoid Robot KHR-2," in Proc. IEEE Int. Conf. on Robotics and Automation, pp. 1443-1448, 2005.