Development of a telepresence agent


Author: Chung-Chen Tsai, Yeh-Liang Hsu (2001-04-06); recommended: Yeh-Liang Hsu (2001-04-06); last updated: Yeh-Liang Hsu (2004-03-23).
Note: This paper was first presented at. The revised paper was presented at. The final version of this paper is published in Asian Journal of Information Technology, Vol. 3, No. 1, January 2004, pp. 27-34.

Abstract

The concept of telepresence describes the combination of technologies that sends the presence of a human operator to a distant place and provides the operator a feeling of actual presence in that place. It is an interesting field that combines virtual reality implementations with human-system interfaces, communication technologies, and robot control systems. This paper describes an on-going project in the Department of Mechanical Engineering at Yuan Ze University for developing a telepresence agent. The telepresence agent is accessed and controlled through the Internet using the user's own biological signals. It is an extension of the distant user's own body, which enables the user to deal with simple tasks of daily life at the remote site. Sensory feedback such as visual, audio, or multi-functional tactile feedback is also provided to create an immersive experience for the user. We also plan to add human behavioral characteristics, such as facial expression, to the telepresence agent to increase its humanity. The final goal is that the user and the people in the distant place can feel each other's presence through the telepresence agent.

Keywords: telepresence, robot, Internet, virtual reality.

1. Introduction

Telepresence is an interesting field that combines virtual reality implementations with human-system interfaces, communication technologies, and robot control systems. As early as 1983, Akin et al. [1983] defined telepresence as: "At the worksite, the manipulators have the dexterity to allow the operator to perform normal human functions. At the control station, the operator receives sufficient quantity and quality of sensor feedback to provide a feeling of actual presence at the worksite." Sheridan's [1986] definition of telepresence is: "Telepresence means visual, kinesthetic, tactile or other sensor feedback from the teleoperator to the human operator that is sufficient and properly displayed such that the human feels that he is present at the remote site, and that the teleoperator is an extension of his own body." Agah et al. [1998] also defined telepresence as "the ability to be present in another environment through the use and control of a system composed of local interfaces and remote devices such as vision and robots with mobility capabilities."

In summary, the concept of telepresence describes the combination of technologies that sends the presence of a human operator to a distant place and provides the user a feeling of actual presence in that place. Figure 1 diagrams our current model of the telepresence system. The user in the control station obtains a truly immersive experience of the worksite while he/she controls the robot system. According to Sheridan's suggestion [1992], virtual presence is a function of three independent components. The first is the resolution of the sensory channels, including visual, audio, or tactile. The second is the user's ability to move the sensors about in space. The third is the capability to actually modify the relative positions of objects in the environment. All three components will be included in a telepresence agent, typically a robot system that mimics a human. With a sense of being there, the human operator can contact other persons at the remote site, visit anywhere, work, etc., even though he is not really there.

Figure 1. A model of the telepresence system

Methods of controlling telepresence devices are not new. Radio waves were commonly used in early developments of telepresence devices. As the Internet has become the most important communication tool today, research combining elements of the Internet and telepresence has attracted much attention. Goldberg et al. [1995] developed a system in their Mercury Project, which consists of an industrial robot arm fitted with a CCD camera and a pneumatic system. It was the first to allow World Wide Web (WWW) users to manipulate a remote environment. They pointed out that the WWW provides a low-cost and widely available interface that can make teleoperated resources accessible to anyone with a desktop (or laptop) computer and modem.

Paulos and Canny [1997] used a blimp as a telepresence device, called a "tele-embodiment," over a network. They combined lighter-than-air technology for the remote robot with a wide-area network such as the Internet for ubiquitous and inexpensive telepresence. Their goal in this work was to provide a truly immersive experience for users tele-visiting a remote space. They felt that, barring some extremely extenuating circumstances, any system developed today should be accessible to the entire Internet community. Paulos and Canny [1998] further presented a project called PRoP (Personal Roving Presence), whose goal was to enable personal telepresence. They were interested in identifying and distilling a small number of human behavioral traits or skills that are inherent to human communication, understanding, and interaction. The ultimate goal was to provide a reasonable degree of personal telepresence that allows humans to communicate and interact in a useful manner with remote people and places in ways beyond those available with current systems.

This paper describes an on-going project in the Department of Mechanical Engineering at Yuan Ze University for developing a telepresence agent. We consider the telepresence agent a vehicle for the transmission of presence between the control site and the remote site. The telepresence agent may have many different forms and applications, but the final goal is still to develop a robot system that mimics a human. This robot is accessed and controlled through the Internet using the user's own biological signals. It is an extension of the distant user's own body, which enables the user to deal with simple tasks of daily life at the remote site. Sensory feedback such as visual, audio, or multi-functional tactile feedback is provided to create an immersive experience for the user. We also plan to add human behavioral characteristics, such as facial expression, to the robot to increase its humanity.

2. Critical technologies

There are three critical technologies that need to be developed in the telepresence agent project: the human-system interface, Internet communication, and the robot control system.

2.1 Human-system interface

The goal of the human-system interface is to provide an interface that enables the user to send his/her presence to a remote site, and to receive enough sensory feedback from the remote site so that the user obtains an immersive feeling of being there. Specifically, the user should be able to send commands to and receive sensory feedback from the telepresence agent at the remote site through this interface. As shown in Figure 2, there are two ways to send commands:

(1) Indirect control. By indirect control we mean that the user controls the telepresence agent through an extra controllable interface. The mouse, keyboard, and joystick are convenient tools for operating the telepresence agent from a common computer. Programming languages such as Java, Visual C++, or Visual Basic can be used to design software that provides different command options according to the telepresence agent's abilities and functions (a minimal sketch of such a keyboard-to-command mapping is given after these two items).

(2) Direct control. Direct control means that the telepresence agent follows the user's motions directly, so that the robot at the remote site synchronously mimics the user's motion. To achieve this, we use the user's biological signals to control the telepresence agent. The biological signals currently used in this project are head rotations, hand motions, leg steps, and the user's voice. Sensors are used to detect the biological signals, and these signals are then converted into commands and sent to the telepresence agent.
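
As an illustration of the indirect-control option (1) above, the following minimal C sketch maps single keystrokes to command strings for the agent. The key bindings and the ASCII command format ("MOVE", "HAND") are assumptions made only for this example; the actual interface software would use the event handling of its GUI toolkit, and the network link is stubbed out here as a simple print.

/* Illustrative sketch of indirect control: map keystrokes to command
 * strings.  The key bindings and command format are assumptions; the
 * network link to the agent is stubbed out with a print statement. */
#include <stdio.h>

/* Stub: in the real system this would post the command to the agent's
 * server over the Internet. */
static void send_command(const char *cmd) { puts(cmd); }

static const char *command_for_key(int key)
{
    switch (key) {
    case 'w': return "MOVE 50 50";    /* forward        */
    case 's': return "MOVE -50 -50";  /* backward       */
    case 'a': return "MOVE -30 30";   /* turn left      */
    case 'd': return "MOVE 30 -30";   /* turn right     */
    case 'g': return "HAND 100";      /* close the hand */
    case 'r': return "HAND 0";        /* open the hand  */
    default:  return NULL;            /* key not bound  */
    }
}

int main(void)
{
    int key;
    /* Read keys from standard input; a real interface would react to
     * keyboard, mouse, or joystick events instead. */
    while ((key = getchar()) != EOF) {
        const char *cmd = command_for_key(key);
        if (cmd)
            send_command(cmd);
    }
    return 0;
}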

Figure 2. Two ways to send commands

Technically, we use a two-axis tilt sensor to detect the rotation of the user's head. This magnetic sensor provides heading, roll, and pitch data. The roll and pitch data are converted into commands that control the rotational degrees of freedom of the CCD camera mounted on the telepresence agent, so that the camera synchronizes with the motion of the user's head (a minimal sketch of this mapping is given below). In this way the user can rotate his/her head to see what he/she wants to see at the remote site via the Internet.

Data gloves are commonly used in virtual reality research to detect the degrees of freedom of the fingers. We designed a sample robot hand with three talons, which is controlled using the data glove via the Internet. Our telepresence agent has a wirelessly controlled motor system, which expands its range of activity. A stepping machine with optical sensors is used to measure the motion of the user's legs and synchronously control the telepresence agent's mobility. We also plan to use the user's voice to control the facial expression of the telepresence agent to increase its humanity. The facial expression will be made of simple mechanisms or LEDs.
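
Returning to the head-tracking channel described above, the following minimal C sketch shows how the roll and pitch readings could be converted into a pan/tilt command for the remote camera. The sensor driver, the network link, the ASCII command format, and the assumed +/-60 degree travel limits are all placeholders used only for illustration.

/* Minimal sketch: convert head-tracker roll/pitch readings into a
 * pan/tilt command for the remote camera.  The sensor driver and the
 * network link are stubbed out; the travel limits are an assumption. */
#include <stdio.h>

typedef struct { int heading, roll, pitch; } TiltReading;   /* degrees */

/* Stub: pretend the user's head is rolled 12 deg and pitched -30 deg. */
static TiltReading read_tilt_sensor(void) {
    TiltReading r = { 0, 12, -30 };
    return r;
}

/* Stub: print the command instead of sending it over the Internet. */
static void send_command(const char *cmd) { fputs(cmd, stdout); }

/* Clamp an angle to the camera mechanism's mechanical range. */
static int clamp(int v, int lo, int hi) {
    return v < lo ? lo : (v > hi ? hi : v);
}

int main(void) {
    TiltReading r = read_tilt_sensor();
    /* Assumed mapping: camera pan follows head roll, camera tilt
     * follows head pitch. */
    int pan  = clamp(r.roll,  -60, 60);
    int tilt = clamp(r.pitch, -60, 60);

    char cmd[32];
    snprintf(cmd, sizeof cmd, "CAM %d %d\n", pan, tilt);
    send_command(cmd);   /* prints: CAM 12 -30 */
    return 0;
}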

Sensory feedback is provided to give the user a sense of actual presence in the remote environment. As shown in Figure 3, visual and audio feedback are supported in our telepresence agent, while environmental data at the remote site such as temperature, humidity, and wind velocity are also displayed to the user.

Figure 3. Sensory feedback provided in our telepresence agent

Visual and audio feedback are important and necessary. As mentioned earlier, the user sees through the CCD camera mounted on the robot at the remote site, and the field of view is controlled by the rotation of the user's head. A CCD camera with a USB interface, which works on a computer directly like a webcam, is used. As shown in Figure 4, a wireless device is used to bridge the USB CCD camera and the computer. We also plan to use a periscopic CCD camera to provide a wide-angle view similar to that of human eyes. At the same time, a two-way stereo audio system helps the user engage in remote conversations. Head-mounted display (HMD) devices, commonly used in virtual reality applications, give the user a stronger sense of reality than receiving visual feedback on a computer screen. The HMD and the head rotation sensor can be integrated into one device.

Figure 4. The wireless CCD camera

2.2 Internet communication

The Internet has become the most important communication tool today. Our telepresence agent also uses the Internet to send commands to and receive sensory feedback from the remote site, as shown in Figure 5. Each telepresence agent is connected to a server with its own IP address. A web page that integrates the control programs is set up on the server. This web page can be one-way or two-way: the user can log in merely to receive the visual and audio feedback embedded in the page, or, in order to control the telepresence agent at the remote site via the Internet, CGI (Common Gateway Interface) programs complete the two-way bridge between client and server (a minimal sketch of such a CGI program is given at the end of this subsection).

Figure 5. Internet communication service

The user can log in to the web page to visit a real distant place of his/her choice through the telepresence agent, and sensory feedback is sent back to the user via the Internet. The user operates the telepresence agent according to the visual feedback sent back from the remote site. Internet bandwidth is therefore a deciding factor for the quality and speed of the visual feedback, and how to integrate the telepresence agent with broadband technologies is an important issue.
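
The following minimal C sketch gives the flavor of such a server-side CGI program. The query-string format (cmd=...) and the relay to the robot controller are assumptions made only for illustration; the actual CGI programs in the project are not reproduced here.

/* Minimal CGI sketch: accept a command from the web page and relay it
 * to the telepresence agent.  The query-string format (cmd=...) and
 * the relay function are assumptions for illustration only. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Stub: in the real system this would forward the command over a
 * serial or socket link to the robot controller. */
static void forward_to_agent(const char *cmd) {
    fprintf(stderr, "forwarding: %s\n", cmd);
}

int main(void) {
    const char *qs = getenv("QUERY_STRING");   /* set by the web server */
    char cmd[128] = "";

    if (qs && strncmp(qs, "cmd=", 4) == 0) {
        strncpy(cmd, qs + 4, sizeof cmd - 1);
        /* Decode '+' as space; a real program would also handle
         * %xx escapes. */
        for (char *p = cmd; *p; ++p)
            if (*p == '+') *p = ' ';
        forward_to_agent(cmd);
    }

    /* Reply to the browser so the page can confirm the command. */
    printf("Content-Type: text/plain\r\n\r\n");
    printf("OK %s\n", cmd);
    return 0;
}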

2.3 Robot control system

Figure 6 diagrams the conception of the robot. A CCD camera is mounted on a two-axis mechanism whose motion is controlled by the rotation of the user's head. A robot hand with three talons is controlled by the motion of the user's hand. A wirelessly controlled motor system gives the robot mobility so that it can move around to interact with people. 8051 microcontrollers serve as the heart of the robot control system (a sketch of the kind of command dispatch such a controller might perform is given at the end of this subsection).

Figure 6. Conception of the robot

We plan to add human facial expression to the telepresence agent to transmit a richer representation of the remote user. We can use a voice-controlled LED display to show the user's facial expressions, or we can simply mount an LCD screen to display the user's real face.
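
The following sketch, written in plain C for illustration, shows the kind of command dispatch the robot controller might perform. The command set and field ranges are assumptions; the real 8051 firmware would add the serial-port interrupt handling and drive the actual motor outputs.

/* Sketch of the command dispatch the robot controller might perform.
 * Plain C for illustration; the real 8051 firmware would add the UART
 * interrupt service routine and drive the motor ports. */
#include <stdio.h>

typedef struct {
    int cam_pan, cam_tilt;        /* two-axis camera mechanism, degrees */
    int hand_closure;             /* three-talon hand, 0..100 percent   */
    int left_speed, right_speed;  /* wheel motors, -100..100            */
} RobotState;

/* Parse one ASCII command line and update the target state.
 * Assumed commands: "CAM <pan> <tilt>", "HAND <closure>", "MOVE <l> <r>". */
static void dispatch(const char *line, RobotState *s) {
    if (sscanf(line, "CAM %d %d", &s->cam_pan, &s->cam_tilt) == 2) return;
    if (sscanf(line, "HAND %d", &s->hand_closure) == 1) return;
    if (sscanf(line, "MOVE %d %d", &s->left_speed, &s->right_speed) == 2) return;
    /* Unknown commands are ignored. */
}

int main(void) {
    RobotState s = {0};
    dispatch("CAM 12 -30", &s);
    dispatch("MOVE 40 40", &s);
    printf("pan=%d tilt=%d hand=%d wheels=%d/%d\n",
           s.cam_pan, s.cam_tilt, s.hand_closure, s.left_speed, s.right_speed);
    return 0;
}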

3. Application examples

Figure 7 summarizes the technologies discussed in the previous section. The blocks on the outer circle represent the technology modules required in this project. A selected combination of these technology modules produces a unique telepresence application. Three application examples are described below.

Figure 7. Technology modules required in the telepresence agent

An Internet remote monitoring system

As shown in Figure 8, technology modules 1, 4, and 15 can be combined to produce an Internet monitoring system, which is a very common telepresence application. A CCD camera is set up at the site to be monitored. Users can log in to the web page, and technology module 16 provides direct visual feedback.

Figure 8. An Internet remote monitoring system

A telepresence agent with no mobility

Figure 9 shows a telepresence agent with no mobility, which includes technology modules 2, 3, 4, 5, 8, 9, 10, 15, and 16. This telepresence agent has only a two-axis CCD camera (module 9) and a hand with three talons (module 10). After logging in to the server connected to the telepresence agent, the user can use the mouse and keyboard to control the CCD camera and the robot hand (modules 4, 15) and receive visual feedback on the computer screen (modules 1, 16). Alternatively, the user can wear an HMD with the rotation detector to synchronously control the CCD camera (modules 8, 9, 15) and obtain the views he/she is interested in through the HMD (modules 2, 16). The user can also wear the data glove to synchronously control the hand with three talons (modules 5, 10, 15).

Figure 9. A telepresence agent with no mobility

An Internet racing-car game

Figure 10 shows another application, an Internet racing-car game, which combines technology modules 1, 3, 4, 11, 13, 15, and 16. In this application we add mobility to our telepresence agent, in this case a racing car. We designed a racing car with a wirelessly controlled motor system (modules 11, 13) and a wireless CCD camera (modules 9, 13). We also built an exciting racing course as the remote site that our telepresence agent visits. After logging in to the server connected to the telepresence agent, the user can use the mouse and keyboard to drive the telepresence agent, the racing car, around the remote site (module 15) and receive visual feedback on the computer screen (module 16).

Figure 10. An Internet racing-car game

A first-generation telepresence agent

Figure 11 shows our recent achievement, the first-generation telepresence agent, which includes technology modules 1, 3, 4, 9, 11, 12, 13, and 16. It has a two-axis CCD camera (module 9) and free mobility (modules 11 and 13), both controlled by a joystick through the Internet (module 4). With this agent, the user can move freely around the remote site to observe whatever he/she is interested in, from whatever position and angle he/she prefers. More importantly, this agent also incorporates two-way audio and video communication (modules 3 and 12), so it has become an Internet communication tool that enables people at the remote site to better feel the user's presence.

Figure 11. A first-generation telepresence agent

4. Conclusions and future work

Our claim is that the telepresence agent is a useful, functional assistant for supporting human presence and interaction at a distance. It finds applications in security monitoring, health care, elderly care, and even entertainment. We have presented a first-generation telepresence agent. In the future, we will develop all of the technology modules and construct a full-function telepresence agent. In the meantime, we will try to find combinations of technology modules that might produce useful telepresence applications.

References

Akin, D.L., Minsky, M.L., Thiel, E.D., and Kurtzman, C.R., "Space applications of automation and robots and machine intelligence systems (ARAMIS) Phase," NASA Contractor Report 3734, 1983.

Agah, A., Walker, R., and Ziemer, R., "A mobile camera robotic system controlled via a head mounted display for tele-presence," in Proc. IEEE Int. Conf. on Systems, Man, and Cybernetics, Vol. 4, pp. 3526-3531, 1998.

Goldberg, K., Mascha, M., Gentner, S., and Rothenberg, N., "Desktop teleoperation via the World Wide Web," in Proc. IEEE Int. Conf. on Robotics and Automation, Vol. 1, pp. 654-659, 1995.

Paulos, E., and Canny, J., "Ubiquitous tele-embodiment: applications and implications," International Journal of Human-Computer Studies/Knowledge Acquisition, Special Issue on Innovative Applications of the World Wide Web, Vol. 46, pp. 861-877, 1997.

Paulos, E., and Canny, J., "Designing personal tele-embodiment," in Proc. IEEE Int. Conf. on Robotics and Automation, Vol. 4, pp. 3173-3178, 1998.

Sheridan, T.B., "Human supervisory control of robot systems," in Proc. IEEE Int. Conf. on Robotics and Automation, San Francisco, CA, pp. 808-812, 1986.

Sheridan, T.B., "Musings on telepresence and virtual presence," Presence, Vol. 1, No. 1, pp. 120-126, 1992.