A 360° Video-based Robot Platform for Telepresent Redirected Walking

Jingxin Zhang (jxzhang@informatik.uni-hamburg.de), Eike Langbehn (langbehn@informatik.uni-hamburg.de), Dennis Krupke (krupke@informatik.uni-hamburg.de), Nicholas Katzakis (nicholas.katzakis@uni-hamburg.de), Frank Steinicke (frank.steinicke@uni-hamburg.de)

ABSTRACT

Telepresence systems have the potential to overcome the limits and distance constraints of the real world by enabling people to remotely visit and interact with each other. However, current telepresence systems usually lack natural ways of supporting interaction and exploration of remote environments (REs). In particular, the use of a single webcam for capturing the RE provides only a limited illusion of spatial presence. Furthermore, movement control of mobile platforms in today's telepresence systems is often restricted to simple interaction devices. For these reasons, we introduce a prototype of a 360° video-based telepresence system consisting of a head-mounted display (HMD), a 360° camera, and a mobile robot platform. Considering the heterogeneous layouts of the user's local environment (LE), in which the user's motions are tracked, and the RE, redirected walking (RDW) techniques and different gains are applied to this system to allow users to explore an RE that is much larger than the LE. With this setup, users receive a full 360° view of the RE rendered on the HMD and explore it in the most intuitive and natural way, i.e., by really walking in the LE and thereby controlling the movements of the robot platform in the RE.

CCS CONCEPTS

• Information Interfaces and Presentation → Multimedia Information Systems (Artificial, augmented, and virtual realities); • Computer Graphics → Three-Dimensional Graphics and Realism (Virtual reality)

KEYWORDS

Virtual reality, telepresence, 360° camera, redirected walking

ACM Reference Format:
Jingxin Zhang, Eike Langbehn, Dennis Krupke, Nicholas Katzakis, and Frank Steinicke. 2018. A 360° Video-based Robot Platform for Telepresent Redirected Walking. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction (HRI '18). ACM, New York, NY, USA, 5 pages.

1 INTRODUCTION

Telepresence refers to a set of technologies that aim to convey the feeling of being in a different place than the one where a person is physically located [15]. The ideal goal of teleoperation is that users feel as if they were actually present at the remote site during the teleoperation task [15]. Therefore, telepresence systems should allow humans to move through the remote environment (RE), interact with remote artifacts, and communicate with remote people.
Such technology is becoming increasingly common in our daily lives and has enormous potential for application domains ranging from business, tourism, meetings, and entertainment to academic conferences [7, 16], education [8, 11], and remote health care [1, 4]. However, current telepresence systems usually lack natural ways of supporting interaction and exploration of REs. In particular, most current telepresence platforms consist of mobile webcams with speakers and microphones. As a result, the use of a single webcam for capturing the RE provides the user with a very narrow field of view and a limited illusion of spatial presence. Both issues limit the teleoperator's sense of presence [15]. Furthermore, the lack of visual information about the RE can lead to a high error rate in teleoperation tasks and remote interactions [15]. In addition, movement control of mobile platforms in today's telepresence systems is often restricted to simple interaction devices such as joysticks, touchpads, mice, or keyboards. Since these devices require operators to use their hands to control the mobile platform, the hands are not free to perform other tasks. This may decrease naturalness, task performance, and the overall user experience [15]. In order to address these limitations and challenges, this paper introduces a prototype of a 360° video-based telepresence system, which aims to provide the local user with a more natural and intuitive way to explore and visit an RE.

Figure 1: Components and structure of the 360° video-based telepresence system. On the remote side, a mobile robot equipped with a 360° camera captures a full-view live stream and transmits it to the local side via a communication network. On the local side, the received live stream is rendered, projected inside a spherical space, and displayed on the user's HMD. The user wearing the HMD teleoperates the remote mobile robot through the RE by means of real walking in the local tracked space.

2 SYSTEM DESIGN

Figure 1 illustrates the basic components and structure of the 360° video-based telepresence system. A 360° full-view camera and an HMD form the basis of the system, which aims to improve the sensation of presence and the user's spatial perception compared to a typical webcam with its narrow 2D view. A mobile robot equipped with the 360° camera serves as the physical agent of the local user at the remote side. The mobile robot provides the 360° camera with mobility, making it possible for the camera to travel through the whole RE. The 360° camera captures a full-view live stream of the RE and transfers the visual scene to the LE via a communication network in real time. System control and data exchange between the 360° camera and the mobile robot on the remote side are implemented on a laptop.

On the local side, a real-time virtual environment (VE) is reconstructed and rendered in Unity3D from the live stream received from the RE. The user in the LE wears an HMD that displays this reconstructed virtual representation of the RE, which can induce a higher sense of presence by providing a wider perspective than a simple screen or monitor. With continuous updates of the live stream from the RE, the user wearing the HMD perceives a 360° full-view immersive experience for exploring and visiting the RE in real time. All reconstruction and rendering on the local side is performed on a graphics workstation.

During the interactive process, the user's movements in the LE are detected by a tracking system in real time and mapped to the remote side. Updates of the user's position and orientation in the LE control the robot's movements in the RE. This way, the user can drive the mobile robot through the RE and move it to a target location by means of real walking in the LE. Compared with other means of movement control for telepresence robots, real walking in the LE is more natural and intuitive when the user needs to travel in the RE from one location to another [6, 13]. Since the position of the camera in the RE is determined and updated according to the position of the user in the LE, this approach provides the most consistent and intuitive perception of motion in the target environment, while also freeing the user's hands for other potential interactive teleoperation tasks.

One major problem with this approach is that it requires the layouts of the local and remote spaces to be more or less identical. In most cases, however, the available local tracked space is smaller than the remote space the user wants to explore, and, furthermore, local and remote environments typically have completely different spatial layouts. Therefore, we introduce the redirected walking (RDW) method to the 360° video-based telepresence system.
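Before turning to RDW, the plain walking-to-driving mapping described above can be made concrete with a minimal sketch (not the authors' published code): each tracked pose update from the LE is converted into linear and angular velocity commands for a differential-drive base such as the Pioneer 3-DX. The topic names (/hmd/pose2d, /RosAria/cmd_vel) and the use of geometry_msgs/Pose2D as the tracking message are illustrative assumptions.

```python
#!/usr/bin/env python
# Sketch: drive a differential-drive robot from tracked user pose updates.
# Topic names and message types are assumptions, not the authors' code.
import math
import rospy
from geometry_msgs.msg import Pose2D, Twist

class WalkingTeleop(object):
    def __init__(self):
        self.prev = None  # previous (x, y, theta, time)
        self.pub = rospy.Publisher('/RosAria/cmd_vel', Twist, queue_size=1)
        rospy.Subscriber('/hmd/pose2d', Pose2D, self.on_pose)

    def on_pose(self, pose):
        now = rospy.get_time()
        if self.prev is not None:
            px, py, pth, pt = self.prev
            dt = max(now - pt, 1e-3)
            # Signed displacement along the user's previous heading.
            forward = ((pose.x - px) * math.cos(pth) +
                       (pose.y - py) * math.sin(pth))
            # Heading change, wrapped to [-pi, pi].
            dth = math.atan2(math.sin(pose.theta - pth),
                             math.cos(pose.theta - pth))
            cmd = Twist()
            cmd.linear.x = forward / dt   # m/s
            cmd.angular.z = dth / dt      # rad/s
            self.pub.publish(cmd)
        self.prev = (pose.x, pose.y, pose.theta, now)

if __name__ == '__main__':
    rospy.init_node('walking_teleop')
    WalkingTeleop()
    rospy.spin()
```

Publishing velocities rather than absolute poses matches the differential-drive interface of the mobile base and degrades gracefully when tracking updates arrive late.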
Redirected walking is a virtual reality (VR) technique for overcoming the limited, confined space of the tracked room [10]. While RDW is based on real walking, the approach guides the user along a path in the real world that may deviate from the path the user perceives in the VE. RDW can be realized by manipulations applied to the virtual camera, causing users to unknowingly compensate for scene motions by repositioning and/or reorienting themselves [14]. RDW without the user's awareness is possible because the sense of vision often dominates proprioception [2, 3]. In other words, the visual feedback that the user sees on the HMD corresponds to the motions in the VE, whereas proprioception and the vestibular system remain coupled to the real world. When the discrepancy is small enough, it is difficult for the user to detect the redirection, which leads to the illusion of an unlimited natural walking experience [9, 12].
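With RDW in the loop, the only change to the mapping sketched above is that walked distance and head rotation are scaled by translation and rotation gains before being sent to the robot, in the spirit of [13, 17]. The sketch below illustrates this; the concrete gain values are placeholders, not the detection thresholds estimated in those works.

```python
# Illustrative RDW gain application (after [13, 17]); gain values are
# placeholders for demonstration only.
G_T = 1.4   # example translation gain: 1 m walked -> 1.4 m driven
G_R = 1.2   # example rotation gain: 90 deg turned -> 108 deg rotated

def redirect(d_forward, d_theta, g_t=G_T, g_r=G_R):
    """Map a local walking increment to a remote robot increment."""
    return g_t * d_forward, g_r * d_theta

# e.g., inside the teleop callback above:
#   forward, dth = redirect(forward, dth)
```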

Figure 2: Application of RDW in the 360° video-based telepresence system: (left) the mobile platform equipped with a 360° video camera moving in the remote environment (RE); (center) the user wearing a virtual reality head-mounted display (HMD) while walking in the local environment (LE); (right) the user's view of the RE on the HMD. Different translation and rotation gains manipulate the mapping between the user's movement in the LE and the robot's movement in the RE, such that the user can explore REs whose size or layout differs from the LE.

3 IMPLEMENTATION

We implemented the prototype of the 360° video-based RDW telepresence system based on the considerations described above. Figure 3 shows the 360° RDW telepresence robot in the RE. A Pioneer 3-DX mobile robot, a differential-drive platform, serves as the mobile base that carries the 360° camera through the whole RE. A Ricoh THETA S 360° camera is mounted on the mobile robot and captures a 360° live stream of the RE at a resolution of 1280×720 pixels and a frame rate of 15 fps. Both the mobile robot and the 360° camera are connected to an Ubuntu laptop via USB cables. The laptop runs the Robot Operating System (ROS) Indigo and serves as the core for robot movement control, device drivers, remote communication, and message publishing and subscribing. Two ROS nodes run on this laptop, responsible for controlling the robot's movement and capturing the 360° live stream, respectively. When the setup is online, one node publishes ROS messages carrying the 360° live stream from the RE to the LE via the network, while the mobile robot simultaneously subscribes to movement-control messages from the LE and updates its position and orientation in the RE according to the parameters inside these messages.

In the LE, the user wears an HTC Vive HMD with the Lighthouse tracking system. The HMD displays the 360° video-based RE with a resolution of 1080×1200 pixels per eye, a diagonal field of view of approximately 110°, and a refresh rate of 90 Hz. Around the tracking area, a pair of Lighthouse tracking stations are mounted to track the sensors on the HMD, detecting updates of the user's position and orientation in real time. The tracked data of the user's movements in the LE is packaged as ROS messages, transmitted to the remote side via the network, and used for the movement control of the mobile robot in the RE. In this way, the robot's movements in the RE can be controlled by means of real walking in the LE. A graphics workstation with a 3.5 GHz Core i7 processor, 32 GB of main memory, and two NVIDIA GeForce GTX 980 graphics cards serves as the communication and computation core in the LE. Scene reconstruction and rendering are performed on this workstation. The HMD is connected to the workstation by an HTC Vive 3-in-1 (HDMI, USB, and power) 5 m cable, so that the user can move freely within the tracking space. Furthermore, the reconstruction and rendering of the RE are implemented using a spherical space modeled in Unity3D: the live stream from the RE is rebuilt and projected as a movie texture onto the inner surface of this spherical space.

Figure 3: Prototype of the 360° RDW telepresence robot, which consists of a Ricoh THETA S 360° camera, a Pioneer 3-DX mobile robot, and an Ubuntu laptop running ROS Indigo.
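As a complement to the movement-control sketch above, the camera node on the laptop could look roughly as follows, assuming the THETA S is exposed as a standard UVC video device in live-streaming mode; the device index and topic name are illustrative, and the authors' actual node may differ.

```python
#!/usr/bin/env python
# Sketch: grab equirectangular frames from the 360 camera and publish
# them as ROS images for the LE to texture onto the Unity3D sphere.
# Device index and topic name are assumptions, not the authors' code.
import cv2
import rospy
from cv_bridge import CvBridge
from sensor_msgs.msg import Image

def main():
    rospy.init_node('theta_stream')
    pub = rospy.Publisher('/theta/image_raw', Image, queue_size=1)
    bridge = CvBridge()
    cap = cv2.VideoCapture(0)   # THETA S exposed as a UVC webcam
    rate = rospy.Rate(15)       # matches the 15 fps capture rate
    while not rospy.is_shutdown():
        ok, frame = cap.read()
        if ok:
            pub.publish(bridge.cv2_to_imgmsg(frame, encoding='bgr8'))
        rate.sleep()
    cap.release()

if __name__ == '__main__':
    main()
```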

A virtual camera is positioned at the center of the textured sphere to provide a perspective-correct view of the 360° RE to the user from the inside. Thus, the user gets a real-time 360° telepresence view on the HMD from the live stream updates of the RE. The resulting reconstruction of the RE is shown in Figure 4. The communication between the LE and the RE is implemented via a bridge between ROS and Unity3D [5]. In addition, RDW techniques and different gains [17] are applied to the telepresence system to allow users to explore an RE that is much larger than, and laid out differently from, the LE (illustrated in Figure 2).

Figure 4: Reconstructed virtual scene, which shows the real-time RE as a rendered texture in a spherical space on the user's HMD.

4 CONCLUSION

In this paper, we presented a 360° video-based telepresence system based on redirected walking. We described the system design and the implementation of the prototype. In our prototype, we used RDW techniques by means of different gains to allow the exploration of larger REs. As described above, this setup enables the user to explore and interact with an RE that is much larger than the LE by means of real walking in the local tracked space, while perceiving a 360° immersive display of the RE.

5 FUTURE WORK

In the future, we would like to further reduce the current latency of movement control and image updates. Hence, one of the main aspects of future work will focus on improving the telepresence system so that it can be used in more real-time situations. In addition, we would like to explore other VR setups in the LE for displaying the 360° video-based RE. In particular, we have already explored CAVE-like setups (as illustrated in Figure 5), and we are interested in introducing more interactive behaviors as well as virtual avatars or objects into the system during the interaction with REs. Furthermore, we will test different REs and application domains, such as the exploration of hallways, cooperation in business meeting rooms, or inspections of outdoor scenarios.

Figure 5: Application of the 360° telepresence system in a CAVE-like space. In this case, the reconstructed virtual RE is rendered and projected inside a CAVE-like space by projectors.

REFERENCES

[1] David Anton, Gregorij Kurillo, Allen Y. Yang, and Ruzena Bajcsy. 2017. Augmented Telemedicine Platform for Real-Time Remote Medical Consultation. In International Conference on Multimedia Modeling. Springer, 77–89.
[2] Alain Berthoz. 2000. The Brain's Sense of Movement. Vol. 10. Harvard University Press.
[3] Johannes Dichgans and Thomas Brandt. 1978. Visual-vestibular interaction: Effects on self-motion perception and postural control. In Perception. Springer, 755–804.
[4] Saso Koceski and Natasa Koceska. 2016. Evaluation of an assistive telepresence robot for elderly healthcare. Journal of Medical Systems 40, 5 (2016), 121.
[5] Dennis Krupke, Lasse Einig, Eike Langbehn, Jianwei Zhang, and Frank Steinicke. 2016. Immersive remote grasping: realtime gripper control by a heterogenous robot control system. In Proceedings of the 22nd ACM Conference on Virtual Reality Software and Technology. ACM, 337–338.
[6] Eike Langbehn, Paul Lubos, Gerd Bruder, and Frank Steinicke. 2017. Bending the curve: Sensitivity to bending of curved paths and application in room-scale VR. IEEE Transactions on Visualization and Computer Graphics 23, 4 (2017), 1389–1398.
[7] Carman Neustaedter, Gina Venolia, Jason Procyk, and Daniel Hawkins. 2016. To Beam or not to Beam: A study of remote telepresence attendance at an academic conference. In Proceedings of the 19th ACM Conference on Computer-Supported Cooperative Work & Social Computing. ACM, 418–431.
[8] Erina Okamura and Fumihide Tanaka. 2016. A pilot study about remote teaching by elderly people to children over a two-way telepresence robot system. In Human-Robot Interaction (HRI), 2016 11th ACM/IEEE International Conference on. IEEE, 489–490.
[9] Sharif Razzaque. 2005. Redirected Walking. Ph.D. Dissertation. University of North Carolina at Chapel Hill.
[10] Sharif Razzaque, Zachariah Kohn, and Mary C. Whitton. 2001. Redirected walking. In Proceedings of EUROGRAPHICS, Vol. 9. Manchester, UK, 105–106.
[11] Delmer Smith and Nancy Louwagie. 2017. Delivering Advanced Technical Education Using Online, Immersive Classroom Technology. Community College Journal of Research and Practice 41, 6 (2017), 359–362.
[12] Frank Steinicke, Gerd Bruder, Jason Jerald, Harald Frenz, and Markus Lappe. 2008. Analyses of human sensitivity to redirected walking. In Proceedings of the 2008 ACM Symposium on Virtual Reality Software and Technology. ACM, 149–156.
[13] Frank Steinicke, Gerd Bruder, Jason Jerald, Harald Frenz, and Markus Lappe. 2010. Estimation of detection thresholds for redirected walking techniques. IEEE Transactions on Visualization and Computer Graphics 16, 1 (2010), 17–27.

[14] Evan A. Suma, Seth Clark, David Krum, Samantha Finkelstein, Mark Bolas, and Zachary Warte. 2011. Leveraging change blindness for redirection in virtual environments. In Virtual Reality Conference (VR), 2011 IEEE. IEEE, 159–166.
[15] Susumu Tachi. 2016. Telexistence: Enabling Humans to Be Virtually Ubiquitous. IEEE Computer Graphics and Applications 36, 1 (2016), 8–14.
[16] Gina Venolia, John Tang, Ruy Cervantes, Sara Bly, George Robertson, Bongshin Lee, and Kori Inkpen. 2010. Embodied social proxy: mediating interpersonal connection in hub-and-satellite teams. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 1049–1058.
[17] Jingxin Zhang, Eike Langbehn, Dennis Krupke, Nicholas Katzakis, and Frank Steinicke. 2018. Detection Thresholds for Rotation and Translation Gains in 360° Video-based Telepresence Systems. IEEE Transactions on Visualization and Computer Graphics (TVCG), Special Issue on IEEE Virtual Reality (VR) (2018).