A Remote Experiment System on Robot Vehicle Control for Engineering Educations Based on World Wide Web


Akira Yonekawa
Information Technology Research Center, Hosei University, 3-2-3 Kudankita, Chiyoda-ku, Tokyo 102-8160, Japan, yonekawa@st0.k.hosei.ac.jp

Hideki Hirano 1, Daisuke Yoshizawa 1 and Masami Iwatsuki 1
1 Graduate School of Engineering, Hosei University, 3-7-2 Kajino-cho, Koganei-shi, Tokyo 184-8584, Japan, iwatsuki@k.hosei.ac.jp

Abstract - State-of-the-art Internet technologies allow us to provide advanced and interactive distance education services. In parallel, teleoperation systems that drive robot manipulators or vehicles over the Internet have been developed in the field of robotics. By fusing the two techniques, we can build a Web-based remote experiment system for engineering education that allows students to take courses on experiments and exercises through the Internet anytime and anywhere. Although such remote experiment systems have recently been proposed using an inverted pendulum, coupled tanks, a heat transfer system and so on, they are inflexible because users can access only their control parameters. In this paper, we propose an unmanned, full-fledged remote experiment system, comparable to a regular on-site experiment course, that provides flexible and effective practice to students, since users can upload, compile and execute any source code they create via the Internet. The proposed system allows student users to practice remotely the principles of stepping motor control and image processing, and their application to trajectory control of a robot vehicle. A student needs only a PC connected to the Internet with a browser and a text editor installed, because all procedures necessary for the experiments are executed on the server side. Furthermore, this paper reports the results of a questionnaire survey of student users in our department conducted to evaluate the system's effectiveness.

Index Terms - engineering education, distance learning, remote experiment system, robot vehicle control.

INTRODUCTION

State-of-the-art Internet technologies allow us to provide advanced and interactive distance education services. In engineering education, however, we have had no choice but to gather students on site for experiments and exercises, because large-scale equipment and expensive software are required. On the other hand, teleoperation systems that drive robot manipulators or vehicles over the Internet have been developed in the field of robotics [1], [2]. Furthermore, systems that take measurements by controlling instruments from a remote place in a WWW environment have also been proposed [3], [4]. These techniques allow us to control robots and measure various physical quantities using only PCs connected to the Internet. By fusing these techniques, we can develop a Web-based remote experiment system for engineering education that allows students to take courses on experiments and exercises through the Internet anytime and anywhere. Since the remote experiment system can work at all hours of the day and night, we can provide experimental environments without restrictions of time or space. Furthermore, we can expect improved learning effects because the experiment can be performed by each student individually rather than by each group of students.

Morikami et al. have proposed an online learning system using a real inverted pendulum that allows students to check the theory by simulation and to control the real pendulum by sending gain parameter values and observing it with a video camera via the Internet [5]. Recently, Web-based remote experiment systems have been proposed that control actual hardware such as robots and cameras by executing source code written by remote users [6], [7]. By extending these systems, we can provide a full-fledged remote experiment system for the engineering education of our students.

This paper proposes an unmanned remote experiment system that provides flexible and effective practice to student users, since the users can upload, compile and execute any source code they create via the Internet. The proposed system allows students to practice remotely the principles of stepping motor control and image processing, and their application to trajectory control of a robot vehicle. A student needs only a PC connected to the Internet with a browser and a text editor installed, because all procedures necessary for the experiments are executed on the server side. Furthermore, this paper presents the results of a questionnaire survey of undergraduate students in our laboratory conducted to evaluate the system's effectiveness.

SPECIFICATIONS

Our goal is to construct a distance learning system that allows student users to take a full-fledged experiment course remotely. To realize this goal, the proposed system is designed to satisfy the following specifications:

1) A student user can perform all functions required for the experiments at a distance, with only a PC connected to the Internet and equipped with a browser and a text editor.
2) In order to provide flexible and effective exercises, user programs can be executed by uploading a source code as a text file.
3) The system allows the user to concentrate only on the essential procedures in programming for the exercises, by aggregating all interfaces on the server side and concealing unessential procedures such as file I/O, hardware initialization, data transmission, image drawing and so on.
4) The user can not only restart an experiment from any pausing point but also repeat experiments already finished as many times as desired.
5) The system can automatically grade the multiple-choice answers to the exercises (descriptive answers excepted) in order to track each student's comprehension and progress. This function keeps students from stepping forward to the next exercise without comprehending the current experiment.

HARDWARE CONFIGURATIONS

FIGURE 1 CONFIGURATION OF THE PROPOSED SYSTEM.
FIGURE 2 EXPERIMENTAL SETUP.
FIGURE 3 (a) MOBILE ROBOT AND (b) SURVEILLANCE CAMERA.

Figure 1 shows the configuration of the proposed experiment system. The system consists of a Web server that runs the experiments on mobile robot vehicle control and image processing, the robot with two stepping motors and a CCD camera, and a camera server that transmits the scene of the experiment to the user in real time. The student user controls the mobile robot remotely through the experimental setup shown in Figure 2. The mobile robot shown in Figure 3(a) is driven by controlling two stepping motors wirelessly, and video images from the mounted CCD camera are also transmitted wirelessly. The user performs the experiments on stepping motor control and image processing by uploading source code written as C functions. A surveillance camera mounted on the ceiling above the setup, shown in Figure 3(b), sends the scene of the experiment to the user. Furthermore, the mobile robot can return to the starting point and charge its battery automatically, since the position and orientation of the robot can be estimated by detecting the red triangular marker drawn on top of the robot, as shown in Figure 3(a).

The mobile robot vehicle consists of two stepping motors, transmitting and receiving modules, a controller with an H8-3052 microcomputer, a lithium-ion battery, and a CCD camera with a UHF video transmitter. The stepping motors are controlled by receiving serial signals through the receiving module, processing them with the microcomputer and sending them to a drive circuit. The microcomputer also executes control commands and manages the automatic charging function with a battery sensor. The battery can be charged regardless of the orientation of the robot vehicle, and the charging contacts are kept from short-circuiting, as shown in Figure 2.

FIGURE 5 SAMPLE CODE.

FORDLL int User(void){
    // Please write a program below.
    RGB *Buf;
    int xsize = 320;
    int ysize = 240;
    int i;

    Speed(60);       // Set speed to 60 pulses per second
    Forward(900);    // Move forward by 900 pulses
    TurnL(500);      // Rotate counterclockwise by 500 pulses
    OutPort(0);      // Stop the vehicle
    TakePhoto();     // Take a snapshot
    Buf = (RGB*) malloc(xsize * ysize * sizeof(RGB));  // Allocate memory
    LoadBmp(Buf);    // Load the snapshot as a bitmap
    for (i = 0; i < xsize * ysize; i++) {              // Image inversion
        Buf[i].r = 255 - Buf[i].r;
        Buf[i].g = 255 - Buf[i].g;
        Buf[i].b = 255 - Buf[i].b;
    }
    SaveBmp(Buf, xsize, ysize);   // Save the processed image
    free(Buf);       // Release the memory
    return 0;
}

FIGURE 4 FLOW CHART OF COMPILATION.

SOFTWARE CONFIGURATIONS

I. Server Side Programs and Components

As described in the previous section, the system must allow a student user to perform all functions required for the experiments at a distance, anytime and anywhere, with only a PC connected to the Internet and equipped with a browser and a text editor. Therefore, all procedures necessary for the experiments must be executed on the server side, and only the results are sent to the client side. To realize this, we adopt Internet Information Services (IIS) on Microsoft Windows XP as the Web server software, together with a server-side programming technique called Active Server Pages (ASP), which can create dynamic Web content, interpret various scripts, execute commands on the server side, and transmit only data described in plain HTML text to the client side. The functions of uploading files, compiling user programs into a Dynamic Link Library (DLL), executing the DLL and so on are realized by combining ASP with ActiveX object components; ActiveX is a set of technologies from Microsoft that enables interactive content for the World Wide Web and easy implementation of PC software for hardware control. The proposed system uses the following ActiveX components:

1) A file upload component called BASP21, which is provided free of charge [8]
2) A compilation component that produces DLLs executable from ASP
3) An execution component that executes the DLLs

The system uses cl.exe, included in Microsoft Visual C++, as the command-line compiler. The compilation component sends the user a message indicating whether the compilation succeeded; if the compiler emits error messages, the user receives them as well.

The most important function in the proposed system is compiling user programs, submitted as text files, into DLLs, so that the experiment can be performed with nothing more than the text editor that comes with the operating system. This function is processed in the following steps, as shown in Figure 4:

Step 1) A program file, written as a function named User and having the file extension cpp, is uploaded.
Step 2) The file upload component is called by the server-side program and saves the uploaded file in a folder on the server if the file extension is cpp and the file contains the function User. If not, the component sends an error message stating that the file is inappropriate and exits.
Step 3) The compilation component generates a DLL if the compilation succeeds. If not, it sends the compiler's error message to the user and exits.
Step 4) The execution component is called by the server-side program, executes the DLL, and sends a success message.
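The paper describes the compilation and execution components only at this flow-chart level, and their internals are wrapped in ASP/ActiveX. Purely as an illustration, the following C sketch shows one way the ideas behind Steps 3 and 4 could be driven on a Windows server: invoke cl.exe to build the uploaded source into a DLL, then load the DLL and call its exported User function. The file names, compiler options and error messages are assumptions, not the system's actual implementation.

/* Hypothetical sketch of the compile-and-execute idea behind Steps 3 and 4.
   The real system wraps this logic in ASP/ActiveX components. */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

typedef int (*USERFUNC)(void);   /* signature of the uploaded User() function */

int main(void)
{
    HMODULE dll;
    USERFUNC user;
    int result;

    /* Step 3: compile the uploaded source into a DLL with cl.exe;
       the file names and options here are assumptions. */
    if (system("cl /nologo /LD user.cpp /Feuser.dll") != 0) {
        fprintf(stderr, "compilation failed; the compiler output would be sent to the user\n");
        return 1;
    }

    /* Step 4: load the generated DLL and call its exported User() function.
       This assumes User() is exported with an unmangled (C-linkage) name. */
    dll = LoadLibraryA("user.dll");
    if (dll == NULL) {
        fprintf(stderr, "could not load user.dll\n");
        return 1;
    }
    user = (USERFUNC)GetProcAddress(dll, "User");
    if (user == NULL) {
        fprintf(stderr, "User() not found in user.dll\n");
        FreeLibrary(dll);
        return 1;
    }
    result = user();    /* run the student's experiment code */
    printf("User() returned %d; a success message would be sent to the user\n", result);
    FreeLibrary(dll);
    return 0;
}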
II. Description of User Programs

Student users create their own programs for the exercises using a library provided in advance, which consists of functions for managing the serial port device, transmitting data to the mobile robot vehicle, moving the vehicle forward and backward, changing the vehicle's speed, grabbing snapshot images from the mounted camera, and displaying images on the user's browser. The header file for the library is automatically included with the user program for the users' convenience.
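The paper does not list this header. The following is a minimal sketch of what it might declare, inferred from the calls visible in the sample code of Figure 5 (Speed, Forward, TurnL, OutPort, TakePhoto, LoadBmp, SaveBmp) and from the functions the exercises mention (Backward, TurnR); the exact signatures, the RGB layout and the FORDLL macro definition are assumptions.

/* robotlib.h -- hypothetical sketch of the user-program library header,
   reconstructed from the calls visible in Figure 5; not the actual file. */
#ifndef ROBOTLIB_H
#define ROBOTLIB_H

/* Export macro for functions compiled into the user DLL (assumed definition;
   the real macro may also specify C linkage and a calling convention). */
#define FORDLL __declspec(dllexport)

/* One pixel of the snapshot image (assumed 8-bit channels). */
typedef struct {
    unsigned char r, g, b;
} RGB;

/* Motion commands: arguments are pulse counts for the stepping motors. */
void Speed(int pulses_per_second);   /* set the stepping rate */
void Forward(int pulses);            /* drive both motors forward */
void Backward(int pulses);           /* drive both motors backward */
void TurnR(int pulses);              /* rotate clockwise in place */
void TurnL(int pulses);              /* rotate counterclockwise in place */
void OutPort(int value);             /* send a raw control command; 0 stops the vehicle */

/* Imaging commands for the camera mounted on the vehicle. */
void TakePhoto(void);                             /* grab a snapshot on the server */
void LoadBmp(RGB *buffer);                        /* copy the snapshot into a 320x240 RGB buffer */
void SaveBmp(RGB *buffer, int xsize, int ysize);  /* save the image and show it in the browser */

#endif /* ROBOTLIB_H */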

Thanks to these functions, the users can concentrate on writing only the essential parts of the program for each exercise. Figure 5 shows a sample user program that performs the following operations:

1) Set the pulse rate to 60 pulses per second,
2) Move the robot vehicle forward by 900 pulses,
3) Rotate the vehicle counterclockwise by 500 pulses,
4) Stop the vehicle,
5) Grab a snapshot image from the mounted camera,
6) Allocate memory space for loading the image,
7) Load the image into the memory space,
8) Invert the RGB colors of the image,
9) Save the inverted image in a folder on the server and display it on the user's browser.

III. Java Simulator

FIGURE 6 JAVA SIMULATOR.

A Java simulator, shown in Figure 6, can automatically generate source code that moves the robot vehicle along a desired trajectory. A trajectory is specified by clicking the desired coordinates on the simulator's 2D plane with the mouse, in the order of the points the trajectory should pass through (an illustrative sketch of such generated code is given after Step 1 below).

IV. Session Management Function

A session management function implemented on the Web server allows only one user at a time to operate the experimental setup and keeps the following users on standby. The standby users can watch the scene of the experiment operated by the current user through the surveillance camera and can see the current queue size and the names of the users on standby in their browsers.

V. Functions for Keeping User Histories and Grading Exercises Automatically

FIGURE 7 AN EXAMPLE OF THE RESULTING GRADE OF EXERCISES.

With the function for keeping user histories, the users can not only restart an experiment from any pausing point but also repeat experiments already finished as many times as desired. Furthermore, the proposed system can automatically grade the multiple-choice answers to the exercises (descriptive answers excepted) in order to track each student's comprehension and progress. This function keeps the user from stepping forward to the next exercise without comprehending the current experiment. An example of the resulting grades is shown in Figure 7, where each item denotes one of the following states: O: the user has already read the explanation of the subject; A: the user has already answered the subject correctly; B: the user has not yet answered or has answered wrongly.

EXPERIMENT SUBJECTS

The proposed system offers the following four experiment subjects so that the students can understand the fundamental structure and mechanism of vision-based robot vehicle control step by step.

Step 1: Control of stepping motors

FIGURE 8 STEP 1: CONTROL OF STEPPING MOTORS.

In the first step, the student users learn how to control a single stepping motor and then two stepping motors simultaneously. They learn that the rotation angles can be controlled by writing source code according to the formats of the control commands, which specify the number of pulses. Figure 8 shows an example of the browser view at Step 1, where the users can observe the motions of the stepping motors by entering a control command value in the lower right text box. The users learn how to rotate a single motor clockwise and counterclockwise, and then how to control the two motors simultaneously.
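The paper does not show the code the Java simulator emits. The following is a purely illustrative sketch of what generated code for a simple square trajectory might look like, built from the library calls in Figure 5; the pulse counts, and the assumption that 500 pulses of TurnL correspond roughly to a quarter turn, are made up for the example.

/* Hypothetical example of simulator-generated user code: drive the vehicle
   around a square. Pulse counts are illustrative, not calibrated values
   from the actual system. */
#include "robotlib.h"   /* hypothetical header sketched above */

FORDLL int User(void)
{
    int side;

    Speed(60);              /* 60 pulses per second */
    for (side = 0; side < 4; side++) {
        Forward(900);       /* one side of the square */
        TurnL(500);         /* assumed to be roughly a quarter turn */
    }
    OutPort(0);             /* stop the vehicle */
    return 0;
}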

Step 2: Trajectory control of a mobile robot vehicle

FIGURE 9 STEP 2: TRAJECTORY CONTROL OF THE MOBILE ROBOT VEHICLE.

In the second step, the users learn that a mobile robot vehicle with two stepping motors can be controlled as an application of Step 1. In this step, the users create four functions, "Forward," "Backward," "TurnR" and "TurnL," which move the robot vehicle forward and backward and rotate it clockwise and counterclockwise, respectively, by driving the two motors in the same or opposite directions. Furthermore, the users learn how to control the trajectory of the vehicle by writing a user program. By uploading the program, the student users can observe the motions of the robot vehicle in real time in a browser view like the one shown in Figure 9. In this figure, the upper left frame is the Java simulator, the upper right is an image captured from the surveillance camera, and the lower left is an image taken by the camera mounted on the robot vehicle. If an error message is sent from the server, it is displayed in the lower right frame. With the Java simulator, the users can check whether their programs generate the intended trajectories before they actually operate the vehicle; the simulator can also automatically generate source code that moves the robot vehicle along the desired trajectories.

Step 3: Image processing

FIGURE 10 STEP 3: IMAGE PROCESSING.

In the third step, the users learn the fundamentals of image processing programming as background knowledge for vision-based robot vehicle control. The users exercise this subject in a browser view like the one shown in Figure 10. First, by entering a value in the lower right text box, the users study image binarization, inversion and so on. Second, they create a program for calculating the maximum red area and its center of mass, which is applied to vision-based robot vehicle control in the final step.
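The paper does not include a solution to this exercise. The following is a simplified sketch of how a student program might compute a red region's area and center of mass and display a binarized image, using the library calls from Figure 5; the red threshold is an assumption, and all red pixels are treated as a single region for brevity, whereas the actual exercise asks for the maximum red area, which would additionally require connected-component labeling.

/* Hypothetical sketch of the Step 3 exercise: find the red pixels in a
   snapshot, compute their area and center of mass, and save a binarized
   image for display in the browser. */
#include <stdlib.h>
#include "robotlib.h"   /* hypothetical header sketched earlier */

FORDLL int User(void)
{
    const int xsize = 320, ysize = 240;
    RGB *Buf;
    int x, y, area = 0, cx = 0, cy = 0;
    long sum_x = 0, sum_y = 0;

    TakePhoto();                                       /* grab a snapshot */
    Buf = (RGB*) malloc(xsize * ysize * sizeof(RGB));
    if (Buf == NULL) return -1;
    LoadBmp(Buf);                                      /* load it as a bitmap */

    for (y = 0; y < ysize; y++) {
        for (x = 0; x < xsize; x++) {
            RGB *p = &Buf[y * xsize + x];
            /* assumed "red" test: strong red channel, weak green and blue */
            int is_red = (p->r > 150 && p->g < 100 && p->b < 100);
            if (is_red) {
                area++;
                sum_x += x;
                sum_y += y;
            }
            /* binarize for display: red pixels become white, others black */
            p->r = p->g = p->b = (unsigned char)(is_red ? 255 : 0);
        }
    }
    if (area > 0) {                 /* center of mass of the red pixels */
        cx = (int)(sum_x / area);
        cy = (int)(sum_y / area);
    }
    /* In Step 4, (cx, cy) and area would be used to steer the vehicle. */

    SaveBmp(Buf, xsize, ysize);     /* show the binarized image in the browser */
    free(Buf);
    return 0;
}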

Step 4: Vision-based robot vehicle control

FIGURE 11 STEP 4: VISION BASED ROBOT VEHICLE CONTROL.

In the final step, the students learn how to control the trajectory of the robot vehicle based on vision by combining the knowledge of Steps 2 and 3. They exercise this subject in a browser view like the one shown in Figure 11, where the Java simulator, an image from the surveillance camera and an image from the mounted camera are arranged at the left, middle and right of the upper frames, and a snapshot image taken by the mounted camera, an image processed by the user program and the file upload frame are at the left, middle and right of the lower frames, respectively. The figure shows a user controlling the robot vehicle by estimating its position and orientation from the maximum blue area and its center of mass.

QUESTIONNAIRE

We carried out a questionnaire survey to evaluate the validity of the proposed remote experiment system with nine undergraduate students of our laboratory who actually performed the experiments. The questionnaire contains several multiple-choice questions that check the students' comprehension of the experiments and the operability of the system, plus a comment field. The students answered the following questions by selecting one of four grades: 4: Excellent, 3: Good, 2: Fair and 1: Poor.

Q1: How was the operability of the remote experiment system?
Q2: How was the responsive performance of the remote experiment system?
Q3: How was the quality of the images captured by the surveillance camera and the mounted camera?
Q4: How was the ease of the file upload procedures?
Q5: How were the functions for restarting the experiment and grading the exercises automatically?
Q6: Could you follow the instructions easily?
Q7: Did you feel a sense of actually performing the experiment remotely?
Q8: Did you get interested in the experiment?
Q9: Do you want to perform more remote experiments like this in the future?

FIGURE 12 RESULT OF THE QUESTIONNAIRE.

The average grades for the above questions are shown in Figure 12. The results of Q7, Q8 and Q9 show that most of the students had favorable impressions of the remote experiment. Unfortunately, the users had unfavorable impressions of the operability, responsive performance and image quality because of the low bandwidth of the Internet connection. The students' comments about the remote experiment are categorized as favorable or unfavorable; excerpts follow.

Favorable comments
A1: It has the advantage that users can repeat experiments already finished anytime and anywhere.
A2: I could deeply understand the experiment subjects by solving the exercises step by step.
A3: I could understand the mechanism of the stepping motor very well by observing its motions in real time.
A4: I could perform the experiment at home.

Unfavorable comments
A1: When an accident happens during the experiment, the remote user can do nothing to fix it.
A2: It was difficult for me to perform the experiment because knowledge of C programming is required.
A3: The image captured from the surveillance camera could not be observed smoothly due to the low-bandwidth connection.

From the favorable comments, we can conclude that remote accessibility, one of the main advantages of our remote experiment system, is accepted by the students. However, the system has the disadvantages that the students cannot get advice from assistants on the spot nor handle the experimental setup directly when an accident happens.

CONCLUSIONS

In this paper we have proposed an unmanned remote experiment system that provides flexible and effective practice to students, since the users can upload, compile and execute any source code they create via the Internet. The proposed system allows student users to practice remotely the principles of stepping motor control and image processing, and their application to trajectory control of a robot vehicle.
We have also presented the results of a questionnaire survey of undergraduate students in our laboratory conducted to evaluate the system's effectiveness.

REFERENCES

[1] B. Dalton and K. Taylor, "Distributed Robotics over the Internet", IEEE Robotics and Automation Magazine, Vol. 7, No. 2, pp. 22-27, June 2000.
[2] R. Simmons, J. L. Fernandez, R. Goodwin, S. Koenig and J. O'Sullivan, "Lessons Learned from Xavier", IEEE Robotics and Automation Magazine, Vol. 7, No. 2, pp. 33-39, June 2000.
[3] K. Kawahara, M. Miyake and Y. Katsuyama, "Implementation of Remote Measurement System by using Direct Communication between Clients Over WWW", Trans. IEICE, Vol. J82-B, No. 10, pp. 1942-1944, Oct. 1999 (in Japanese).
[4] O. Koyama and Y. Katsuyama, "Remote Measurement/Calculation System Over WWW", Trans. IEICE, Vol. J82-B, No. 2, pp. 303-305, Feb. 2001 (in Japanese).
[5] A. Morikami, J. Yoshizawa, M. Kimura, Y. Ohyama and J. She, "Construction of the On-line Study System Using Inverted Pendulum", Proc. of the 2002 Japan Industry Application Society Conf., Vol. 1, pp. 41-44, Aug. 2002 (in Japanese).
[6] E. Guimaraes, A. Maffeis, J. Pereira, B. Russo, E. Cardozo, M. Bergerman and M. F. Magalhaes, "REAL: A Virtual Laboratory for Mobile Robot Experiments", IEEE Trans. on Education, Vol. 46, No. 1, pp. 37-42, Feb. 2003.
[7] M. Iwatsuki, Y. Kato and A. Yonekawa, "Prototyping of Remote Experiment and Exercise Systems for an Engineering Education based on World Wide Web", IEEJ Trans. IA, Vol. 123, No. 8, pp. 903-910, Aug. 2003.
[8] http://www.hi-ho.ne.jp/babaq/basp21.html (in Japanese)