Spatial Interfaces
Editors: Frank Steinicke and Wolfgang Stuerzlinger

Telexistence: Enabling Humans to Be Virtually Ubiquitous
Susumu Tachi, The University of Tokyo

Telecommunication and remote-controlled operations are becoming increasingly common in our daily lives. While performing these operations, users would ideally feel as if they were actually present at the remote sites. However, present commercially available telecommunication and telepresence systems do not provide the sensation of self-presence or self-existence; hence, users do not feel spatially present or that they are directly performing spatial tasks rather than simply controlling them remotely. Furthermore, these systems do not provide haptic sensation, which is necessary for the direct manipulation of remote operations. The lack of such sensations reduces the sense of realism and makes it more difficult to perform tasks.

In 1980, I proposed the concept of telexistence, which, together with the concept of third-generation robotics, was the fundamental principle behind the eight-year, large-scale national Japanese Advanced Robot Technology in Hazardous Environment Project. The project, which began in 1983, established the theoretical considerations and the systematic design procedure for telexistence systems. Since then, experimental hardware systems for telexistence have been developed, and the feasibility of the concept has been demonstrated.[1-5] Two important problems remain to be solved: mutual telexistence, a telexistence system that provides both the sensation of being present (self-presence) and of being perceived as present (their presence) and is used mainly for communication purposes; and haptic telexistence, which adds haptic sensation to the visual and auditory sensations of self-presence and is used mainly for remote operations on real tasks.
This article presents our work on telexistence, in which our aim is to enable human users to have the sensation of being spatially present on site and performing tasks directly. Specifically, we developed a telexistence master-slave system that enables a human user to feel present in a remote environment. This system is a virtual exoskeleton human amplifier, through which a human user can operate a remote avatar robot as if it were his or her own body. That is, the user has the feeling of being inside the robot or wearing it as a garment. Here, I introduce and explain recent advancements in telexistence that have partly solved the mutual and haptic telexistence problems. TELESAR II (telexistence surrogate anthropomorphic robot, version II) was the first system to provide the sensations of both self-presence and their presence for communication purposes using retroreflective projection technology (RPT). For remote operation purposes, we have developed the TELESAR V telexistence master-slave system, which can transmit not only visual and auditory sensations but also haptic sensation. The haptic sensations are conveyed using our proposed principle of haptic primary colors.

Mutual Telexistence
Several commercial products claim to support different forms of telepresence, such as the Teliris telepresence videoconferencing system, Cisco telepresence, Polycom telepresence, Anybots QB telepresence robot, Texai remote presence system, Double telepresence robot, Suitable Beam remote presence system, and VGo robotic telepresence. Yet, although current commercial telepresence robots that are controlled from laptops or intelligent pads can provide a certain sense of their presence on the side of the robot, the remote user has a poor sense of self-presence. As for the sense of their presence, commercial products have problems: the image presented on a display is only a 2D face, which is far from real, and multi-viewpoint images are not provided.
Published by the IEEE Computer Society, January/February 2016. 0272-1716/16/$33.00 © 2016 IEEE

Thus, the same front face is seen even when viewed from the side. An ideal system should provide mutual telexistence, giving

the user a sense of being present in the remote environment where his or her surrogate robot exists and, at the same time, creating a sense that the remote user, represented by the surrogate robot, is also present at the remote location. This means the remote user should be seen naturally and simultaneously by several people standing around the surrogate robot. However, almost none of the previous systems provide both the sense of self-presence and the sense of their presence.

Figure 1. Proposed mutual telexistence system using retroreflective projection technology (RPT). User A can observe remote environment B using an omnistereo camera mounted on surrogate robot A. The retroreflective material covering robot A makes it possible to project images from both cameras B and C onto it, so users B and C can view the image on robot A separately.

Figure 1 shows a conceptual sketch of an ideal mutual telexistence system using a telexistence cockpit and an avatar robot. User A can observe remote environment B using an omnistereo camera mounted on surrogate robot A. This provides user A with a panoramic stereo view of the remote environment displayed inside the cockpit. User A controls robot A using the telexistence master-slave control method. Cameras B and C, mounted on the booth, are controlled by the position and orientation of users B and C, respectively, relative to robot A. To obtain the correct perspective, users B and C can observe different images of user A projected on robot A by wearing their own head-mounted projectors (HMPs). Robot A is covered with retroreflective material, making it possible to project images from both cameras B and C onto the same robot while having both images viewed separately by users B and C.
A method for mutual telexistence based on projecting real-time images of the operator onto a surrogate robot using RPT was first proposed in 1999,[6] together with several potential applications such as transparent cockpits.[7] The feasibility of the concept was demonstrated with the construction of experimental mutual telexistence systems in 2004.[8] In 2005, a mutual telexistence master-slave system called TELESAR II was constructed for the Aichi World Exposition. Figure 2a shows the telexistence surrogate robot TELESAR II. The virtual exoskeleton human amplifier displays the remote operator's image as if he were inside the robot. Figure 2b shows the operator telexisting in the TELESAR II robot. In addition to conventional verbal communication, this master-slave robotic surrogate can perform nonverbal communication such as gestures and handshakes.[9,10] Moreover, a person operating the robot surrogate can be seen naturally and simultaneously by several people standing around the robot at the remote location, so mutual telexistence is attained. The mobile mutual telexistence system TELESAR IV, which is equipped with master-slave manipulation capability and an immersive omnidirectional autostereoscopic 3D display with a 360-degree field of view known as TWISTER (telexistence wide-angle immersive stereoscope),[11] was developed in

2010 to further develop a mutual telexistence system toward the ideal form.[12]

Figure 2. Mutual telexistence for spatial interaction: (a) TELESAR II avatar robot and (b) operator at the control.

Haptic Primary Colors
Humans do not perceive the world as it is. Different physical stimuli can give rise to the same sensation in humans and are perceived as identical. A typical example of this fact is human color perception. Humans perceive light of different spectra as having the same color if the light has the same proportion of the different spectral components. This is because the human retina typically contains three types of color receptors, called cone cells or cones, each of which responds to a different range of the color spectrum. Human responses to light stimuli are thus three-dimensional and generally can be modeled as a mixture of the three primary colors (red, green, and blue). This many-to-one correspondence in the mapping from physical properties to psychophysical perception is the key to virtual reality. VR produces the same effect as a real object for a human subject by presenting virtual entities that exploit this many-to-one correspondence.

We have proposed the hypothesis that cutaneous sensation (that is, sensation relating to or affecting the skin) has the same many-to-one correspondence from physical properties to psychophysical perception, owing to the physiological constraints of humans. We call this hypothesis haptic primary colors.[13] As Figure 3 shows, we define three spaces: physical, physiological, and psychophysical (or perceptual). In physical space, human skin physically contacts an object, and the interaction continues over time. Physical objects have several surface physical properties, such as surface roughness, surface friction, thermal characteristics, and surface elasticity.
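The many-to-one ("metamer") mapping underlying both color vision and the haptic primary colors hypothesis can be sketched numerically. In this minimal sketch, the cone-sensitivity rows and the two spectra are illustrative values chosen for clarity, not measured data:

```python
SENSITIVITY = [        # rows: L, M, S cones; columns: 4 wavelength bins
    [0, 0, 1, 1],      # L cone
    [0, 1, 1, 0],      # M cone
    [1, 1, 0, 0],      # S cone
]

def cone_response(spectrum):
    """Project a light spectrum onto the three cone channels (dot products)."""
    return [sum(s * p for s, p in zip(row, spectrum)) for row in SENSITIVITY]

spectrum_a = [2, 3, 4, 1]   # two physically different light spectra...
spectrum_b = [3, 2, 5, 0]

# ...that yield identical cone responses, so they are perceived as the same color
assert cone_response(spectrum_a) == cone_response(spectrum_b) == [5, 7, 5]
```

Because the 3-row sensitivity matrix collapses a higher-dimensional spectrum into three numbers, infinitely many spectra map to one percept; the haptic primary colors hypothesis posits the analogous collapse for touch.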
We hypothesize that, at each contact point of the skin, the cutaneous phenomena can be broken down into three components: force f(t), vibration v(t), and temperature e(t). Objects with the same f(t), v(t), and e(t) are perceived as the same, even if their other physical properties differ. We measure f(t), v(t), and e(t) at each contact point with sensors mounted on the avatar robot's hand and transmit this information to the human user who controls the avatar robot as his or her surrogate. We reproduce the information at the user's hand via haptic displays of force, vibration, and temperature, so that the human user has the sensation of touching the object as he or she moves the hand controlling the avatar robot's hand. We can also synthesize virtual cutaneous sensation by displaying computer-synthesized f(t), v(t), and e(t) to the human user through the haptic display.

This breakdown into force, vibration, and temperature in physical space is based on the human restriction of sensation in physiological space. Much like the human retina, human skin has a limited set of receptors. In physiological space, cutaneous perception is created through a combination of nerve signals from several types of tactile receptors located below the surface of the skin. If we consider each activated haptic receptor as a sensory base, we should be able to express any given pattern of cutaneous sensation through synthesis using these bases. Merkel cells, Ruffini endings, Meissner's corpuscles, and Pacinian corpuscles are activated mainly by pressure, tangential force, low-frequency vibrations, and high-frequency vibrations, respectively. Adding cold receptors (free nerve endings), warmth receptors (free nerve endings), and pain receptors (free nerve endings) to these four vibrotactile haptic sensory bases gives seven sensory bases in physiological space.
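The claim that objects with matching f(t), v(t), and e(t) are perceptually identical can be expressed as a simple equivalence test. The tolerance values below are illustrative assumptions standing in for real discrimination thresholds, and the material samples are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class HapticSample:
    """One contact-point measurement of the three haptic primary colors."""
    force: float        # f(t), contact force in newtons
    vibration: float    # v(t), vibration amplitude (arbitrary units)
    temperature: float  # e(t), surface temperature in degrees Celsius

def perceptually_equal(a, b, f_tol=0.05, v_tol=0.1, e_tol=0.5):
    """True if two samples fall within (assumed) discrimination tolerances."""
    return (abs(a.force - b.force) <= f_tol
            and abs(a.vibration - b.vibration) <= v_tol
            and abs(a.temperature - b.temperature) <= e_tol)

# Two physically different materials whose measured triplets match are,
# under the haptic primary colors hypothesis, indistinguishable by touch.
wood  = HapticSample(force=0.40, vibration=0.22, temperature=24.1)
vinyl = HapticSample(force=0.42, vibration=0.18, temperature=24.3)
assert perceptually_equal(wood, vinyl)
```

This is the haptic analog of the color metamers: the display only has to reproduce the triplet, not the object itself.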

Figure 3. Haptic primary color model. Different physical stimuli give rise to the same sensation in humans and are perceived as identical. Physical space (3 bases): force, vibration, and temperature. Physiological space (7 bases): Merkel cells, Ruffini endings, Meissner corpuscles, Pacinian corpuscles, and cold, warm, and pain receptors (free nerve endings). Psychological space (5 bases): hard/soft, rough/smooth, dry/wet, cold/warm, and painful/itchy. The model relates these spaces through physical-physiological, physical-psychological, and physiological-psychological mappings.

Because all seven receptors respond only to force, vibration, and temperature applied to the skin surface, these three components in physical space suffice to stimulate each of the seven receptors. Thus, in physical space, we have three haptic primary colors: force, vibration, and temperature. Theoretically, by combining these three components, we can produce any type of cutaneous sensation without any real touching of an object.

Telexistence Avatar Robot System: TELESAR V
Conventional telepresence systems provide mostly visual and auditory sensations, with only incomplete haptic sensations. TELESAR V, a master-slave robot system for performing full-body movements, was developed in 2011 to realize the concept of haptic telexistence.[14] The TELESAR V master-slave system's haptic capabilities were successfully demonstrated at SIGGRAPH in August 2012. TELESAR V can transmit fine haptic sensations during spatial interaction, such as a material's texture and temperature, from an avatar robot's fingers to the human user's fingers,[15] using our proposed principle of haptic primary colors.[13] The TELESAR V implementation includes a mechanically unconstrained master cockpit and a 53-degrees-of-freedom (DOF) anthropomorphic slave robot with a high-speed, robust, full upper body.
The system provides an embodiment of our extended body schema, which allows human operators to maintain up-to-date representations of their various body parts in space. A body schema can be used to understand the posture of the remote body and to perform actions as if the remote body were the user's own.

Figure 4. TELESAR V master-slave system. The user can perform tasks dexterously and perceive the robot's body as if it's his or her own through visual, auditory, and haptic sensations.

The TELESAR V master-slave system can transmit fine haptic sensations such as the texture and temperature of a material from an avatar robot's fingers to a human user's fingers. Because of this, users can perform tasks dexterously and perceive the robot's body as if it were their own through visual, auditory, and haptic sensations, which provide the fundamental experience of telexistence. As shown in Figures 4 and 5, the TELESAR V system consists of a master (local) and a slave (remote). The 53-DOF dexterous robot was developed with a 6-DOF torso, a 3-DOF head, two 7-DOF arms, and two 15-DOF hands. The robot has full HD cameras (1,920 × 1,080 pixels) for capturing wide-angle stereovision, and stereo microphones are situated on the robot's ears for capturing audio signals from the remote site. The operator's voice is transferred to the remote site and output through a small
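The DOF budget just described can be checked in a few lines; this is only an arithmetic sketch, with subsystem names taken from the article's figures:

```python
# DOF budget of the TELESAR V slave robot as described above
dof = {
    "torso": 6,
    "head": 3,
    "arms": 2 * 7,    # two 7-DOF arms
    "hands": 2 * 15,  # two 15-DOF hands
}

total = sum(dof.values())
assert total == 53    # matches the 53-DOF count quoted in the article
```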

speaker installed near the robot's mouth area for conventional bidirectional verbal communication. On the master side, the operator's movements are captured with a motion-capture system (OptiTrack). Finger bending is captured with 14 DOFs using a modified 5DT Data Glove 14.

Figure 5. TELESAR V system configuration. The 53-DOF dexterous robot has full HD cameras and stereo microphones for capturing audio-visual signals from the remote site. On the master side, a motion-capturing system and data gloves capture the operator's movements. Master (local): 14-DOF sensor glove (×2), normal/tangential force display, thermal display, vibrotactile display, HMD visual display (1,280 × 800 pixels, ×2), stereo audio display, and microphone. Slave (remote): 6-DOF torso, 3-DOF head, 7-DOF arm (×2), and 15-DOF hand (×2), for 53 DOF in total, plus cameras (1,920 × 1,080 pixels, ×2), stereo microphone, and speaker.

Figure 6. Haptic sensor. The sensor obtains haptic information such as contact force, vibration, and temperature based on the haptic primary colors. The fingertip assembly (top base, tip base, and finger base) carries a 3-axis force sensor, a temperature sensor, and a vibrotactile sensor.

The haptic transmission system consists of three parts: a haptic sensor, a haptic display, and a processing block. When the haptic sensor touches an object, it obtains haptic information such as contact force, vibration, and temperature based on the haptic primary colors. The haptic display provides haptic stimuli on the user's finger to reproduce the haptic information obtained by the haptic sensor. The processing block connects the haptic sensor with the haptic display and converts the obtained physical data into data that account for physiological haptic perception for reproduction by the haptic display. First, a force sensor inside the haptic sensor measures the vector force when the haptic sensor touches an object.
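The sensor-to-display routing performed by the processing block can be sketched as a pure function from a raw reading to per-channel display commands. The normal/tangential split of the 3-axis force (fz normal to the fingertip, fx and fy tangential) and the channel names are illustrative assumptions, not the system's actual interface:

```python
import math
from dataclasses import dataclass

@dataclass
class SensorReading:
    """Raw measurement from the fingertip haptic sensor."""
    fx: float          # 3-axis contact force components (N)
    fy: float
    fz: float
    vibration: float   # microphone/vibration signal
    temperature: float # thermistor reading (deg C)

def processing_block(r: SensorReading) -> dict:
    """Convert physical sensor data into per-channel display commands."""
    return {
        "normal_force": r.fz,                        # -> motor-belt pressing
        "tangential_force": math.hypot(r.fx, r.fy),  # -> belt shear
        "vibration": r.vibration,                    # -> vibration transducer
        "target_temperature": r.temperature,         # -> Peltier setpoint
    }

cmd = processing_block(SensorReading(fx=0.3, fy=0.4, fz=1.2,
                                     vibration=0.05, temperature=26.0))
assert math.isclose(cmd["tangential_force"], 0.5) and cmd["normal_force"] == 1.2
```

Keeping the three haptic primary colors as independent channels is what lets each display element (belt motors, vibrator, Peltier) be driven separately, as the following paragraphs describe.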
Then, two motor-belt mechanisms in the haptic display reproduce the vector force on the operator's fingertips. The processing block controls the electrical current drawn by each motor to provide the target torques based on the measured force. As a result, the mechanism reproduces the force sensation of the haptic sensor touching the object. Second, a microphone in the haptic sensor records the sound generated on its surface when the haptic sensor is in contact with an object. Then, a force reactor in the haptic display plays the transmitted sound as a vibration. Because this vibration provides a high-frequency haptic sensation, the information is transmitted without delay. Third, a thermistor in the haptic sensor measures the surface temperature of the object. A Peltier actuator mounted on the operator's fingertips reproduces the measured temperature. The processing block generates a control signal for the Peltier actuator. The signal is generated by a proportional-integral-derivative (PID) control loop with feedback from a thermistor located on the Peltier actuator. Figures 6 and 7 show the structures of the haptic sensor and haptic display, respectively. Figure 8 shows the left hand of the TELESAR V robot with the haptic sensors and the haptic displays set in the modified 5DT Data Glove 14. Figure 9 shows TELESAR V conducting several tasks. The system is able to transmit not only visual and auditory sensations but also haptic sensations of presence based on data obtained from the haptic sensor.

Toward Telework via Telexistence
Until now, working remotely from home has been limited to communications and/or paperwork that transmits audio-visual information and data. It was impossible to carry out physical work at factories
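The temperature channel's PID loop can be sketched as follows. This is a minimal simulation: the gains and the first-order thermal model of the Peltier element are illustrative assumptions, not TELESAR V's actual parameters:

```python
def simulate_peltier_pid(target_c, start_c=32.0, ambient_c=25.0,
                         steps=2000, dt=0.01):
    """Drive a simple first-order thermal plant to target_c with PID control.

    Gains and plant constants are illustrative, not the system's values."""
    kp, ki, kd = 8.0, 2.0, 0.1               # assumed PID gains
    temp, integral, prev_err = start_c, 0.0, target_c - start_c
    for _ in range(steps):
        err = target_c - temp                # feedback from the thermistor
        integral += err * dt
        derivative = (err - prev_err) / dt
        drive = kp * err + ki * integral + kd * derivative  # Peltier current
        # plant: drive heats/cools the element; ambient pulls it back
        temp += (0.5 * drive + (ambient_c - temp) / 5.0) * dt
        prev_err = err
    return temp

# After a few simulated seconds, the fingertip-side Peltier settles near
# the temperature measured at the remote surface (cooling or heating).
final = simulate_peltier_pid(15.0)
assert abs(final - 15.0) < 0.2
```

The integral term is what holds the Peltier at a setpoint away from ambient: at steady state the proportional and derivative terms vanish, and the accumulated integral supplies the constant drive needed to balance heat exchange with the surroundings.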

or operations at places such as construction sites, healthcare facilities, or hospitals, because such activities could not be carried out unless the person in question was actually on site. Telexistence technology can extend the conventional range of remote communications, which transmit only audio-visual senses, to transmit all the physical functions of human beings and thus enable remote work and operations that were impossible until now. Given the functionality of an avatar robot, advances in this area could also provide opportunities for elderly and handicapped people. Using the body of a virtual self, such people will no longer be limited by their physical disadvantages. For example, elderly people could augment and enhance their physical functions beyond the capabilities of their physical bodies by using the avatar's enhanced body and thus participate in work that makes use of the abundant experience they have gained over a lifetime. In the future, this technology will also make it possible to dispatch medical and emergency services personnel instantly, allowing them to respond from a safe place during disasters and emergencies. In the same way, medical caregivers, physicians, and experts will be able to access patients in remote areas on a routine basis. In addition, with the creation of new industries such as telexistence tourism, travel, shopping, and leisure, this technology can make the daily lives of citizens more convenient and motivate them to live vigorously. We envision that a healthy and pleasant lifestyle will be realized in a clean and energy-conserving society.

References
1. S. Tachi et al., "Tele-existence (I): Design and Evaluation of a Visual Display with Sensation of Presence," Proc. 5th Symp. Theory and Practice of Robots and Manipulators (RoManSy), 1984, pp. 245-254.
2. J.D. Hightower, E.H. Spain, and R.W.
Bowles, "Telepresence: A Hybrid Approach to High Performance Robots," Proc. Int'l Conf. Advanced Robotics (ICAR), 1987, pp. 563-573.
3. S. Tachi, "Real-Time Remote Robotics: Toward Networked Telexistence," IEEE Computer Graphics and Applications, vol. 18, no. 6, 1998, pp. 6-9.
4. S. Tachi, Telexistence, World Scientific, 2010.
5. S. Tachi, "Telexistence: Past, Present, and Future," Virtual Realities, G. Brunnett et al., eds., Springer, 2015, pp. 229-259.
6. S. Tachi, "Augmented Telexistence," Mixed Reality: Merging Real and Virtual Worlds, Y. Ohta and H. Tamura, eds., Springer-Verlag, 1999, pp. 251-260.
7. S. Tachi et al., "Mutual Telexistence System Using Retro-reflective Projection Technology," Int'l J. Humanoid Robotics, vol. 1, no. 1, 2004, pp. 45-64.
8. S. Tachi, M. Inami, and Y. Uema, "The Transparent Cockpit," IEEE Spectrum, vol. 51, no. 11, 2014, pp. 52-56.
9. R. Tadakuma et al., "Development of Anthropomorphic Multi-D.O.F. Master-Slave Arm for Mutual Telexistence," IEEE Trans. Visualization and Computer Graphics, vol. 11, no. 6, 2005, pp. 626-636.

Figure 7. Haptic display. The display provides haptic stimuli on the user's finger to reproduce the haptic information obtained by the haptic sensor. A finger-mounted base carries a shaft, a Peltier actuator, two vibrators, and two belt-motor mechanisms.

Figure 8. TELESAR V master and slave hands. (a) Slave hand with haptic sensors and (b) master hand with haptic displays.

Figure 9. TELESAR V conducting several tasks: (a) picking up sticks, (b) transferring small balls from one cup to another, (c) producing Japanese calligraphy, (d) playing Japanese chess (shogi), and (e) feeling the texture of a cloth.

10. S. Tachi et al., "TELEsarPHONE: Mutual Telexistence Master-Slave Communication System Based on Retroreflective Projection Technology," SICE J. Control, Measurement, and System Integration, vol. 1, no. 5, 2008, pp. 335-344.
11. K. Tanaka et al., "TWISTER: An Immersive Autostereoscopic Display," Proc. IEEE Virtual Reality, 2004, pp. 59-66.
12. S. Tachi et al., "Mutual Telexistence Surrogate System: TELESAR4, Telexistence in Real Environments Using Autostereoscopic Immersive Display," Proc. IEEE/RSJ Int'l Conf. Intelligent Robots and Systems, 2011, pp. 157-162.
13. S. Tachi et al., "Haptic Media: Construction and Utilization of Human-Harmonized Tangible Information Environment," Proc. 23rd Int'l Conf. Artificial Reality and Telexistence (ICAT), 2013, pp. 145-150.
14. S. Tachi et al., "Telexistence: From 1980 to 2012," Proc. IEEE/RSJ Int'l Conf. Intelligent Robots and Systems (IROS), 2012, pp. 5440-5441.
15. C.L. Fernando et al., "TELESAR V: TELExistence Surrogate Anthropomorphic Robot," ACM SIGGRAPH 2012 Emerging Technologies, 2012, article no. 23.

Susumu Tachi is a professor emeritus in the Institute of Gerontology's Tachi Laboratory at The University of Tokyo. Contact him at tachi@tachilab.org.

Contact department editors Frank Steinicke at frank.steinicke@uni-hamburg.de and Wolfgang Stuerzlinger at w.s@sfu.ca.