Immersive Guided Tours for Virtual Tourism through 3D City Models


Rüdiger Beimler, Gerd Bruder, Frank Steinicke
Immersive Media Group (IMG), Department of Computer Science, University of Würzburg
E-Mail: {ruediger.beimler,gerd.bruder,frank.steinicke}@uni-wuerzburg.de

Abstract: For decades, computer-mediated realities such as virtual reality (VR) and augmented reality (AR) have been used to visualize and explore virtual city models. Their inherently three-dimensional (3D) nature, together with our natural understanding of urban areas, makes city models well suited for immersive or semi-immersive installations that support natural exploration of such complex datasets. In this paper, we present a novel VR approach that enables immersive guided virtual tours through 3D city models. To this end, we combine an immersive head-mounted display (HMD) setup, used by one or more tourists, with a touch-enabled tabletop, used by the guide. While the guide overviews the entire virtual 3D city model and the virtual representation of each tourist inside the model, the tourists perceive an immersive egocentric view of regions of the city model that the guide points out. We describe the implementation of the setup and discuss interactive virtual tours through a 3D city model.

Keywords: Virtual environments, virtual cities, guided tours

1 Introduction

In recent years, virtual environments (VEs) have become increasingly popular and widespread due to the requirements of numerous application areas, in particular the 3D city visualization domain. Two-dimensional desktop systems are often limited in cases where natural interfaces are desired, for example, when navigating within complex 3D scenes. In such cases, virtual reality (VR) systems, which make use of tracking technologies and stereoscopic display of three-dimensional synthetic worlds, support better exploration of complex datasets.
These VR systems allow users to explore virtual worlds in an intuitive and immersive manner. In immersive virtual 3D city environments, people can, for instance, visit tourist landmarks by natural locomotion within the space covered by the tracking sensors, or explore larger VEs by using 3D input devices. In this paper, we introduce a first prototype of a novel VR approach that leverages multi-user immersive VR technology for guided virtual tours through 3D city models. This approach is realized by combining several cost-effective technologies and techniques.

Figure 1: Photo of (a) a guide interacting with the touch-enabled tabletop, and (b) a rendering from within the VE from a tourist's point of view.

To this end, we combine immersive head-mounted display (HMD) setups, which are used by the tourists, with a touch-enabled tabletop, which is used by the guide. The views and movements of the tourists are tracked by a Kinect-based interface and are then streamed and mapped onto virtual avatars within a VE, which are displayed to the guide. The interactions of the guide with the VE are captured via a touch-based interface and applied to the position and orientation of the tourists' avatars. The collaborating users receive real-time audio-visual feedback.

The paper is structured as follows: Section 2 summarizes background information. Section 3 introduces our setup and explains how guided virtual tours through 3D city models are performed. In Section 4 we discuss the results. Section 5 concludes the paper and gives an overview of future work.

2 Background

There has been an increasing demand for virtual 3D city representations in recent years across a variety of application fields [RSH05, SHR06, SRH06]. When navigating in an unknown environment, e.g., a foreign city, wayfinding tends to be a complex task, which can be supported by prior virtual exploration. Virtual city models provide several advantages over traditional physically crafted city models, especially when they are explored using stereoscopic 3D displays in conjunction with head-coupled perspective rendering [BKLP04]. Burigat and Chittaro [BC07] discuss the importance of vantage points for overviewing the whole setup, and propose solutions to aid users by providing visual navigation aids. To improve the degree of realism, virtual 3D city models can be populated with virtual humans, which can be animated either by a crowd simulation algorithm or manually by a performer or an animator to simulate natural phenomena of crowd behavior [TGMY09].
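Head-coupled perspective rendering, as referenced above [BKLP04], recomputes an off-axis projection every frame from the tracked eye position relative to the fixed physical screen. The following Python sketch shows the standard generalized perspective projection formulation of the near-plane frustum extents; it is an illustration of the general technique, not code from the system described in this paper.

```python
import math

def _sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def _dot(a, b): return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def _normalized(a):
    n = math.sqrt(_dot(a, a))
    return (a[0]/n, a[1]/n, a[2]/n)

def off_axis_frustum(eye, screen_ll, screen_lr, screen_ul, near=0.1):
    """Near-plane frustum extents (left, right, bottom, top) for a
    head-coupled view: the screen is a fixed rectangle given by its
    lower-left, lower-right, and upper-left corners, and the eye
    position is tracked every frame."""
    vr = _normalized(_sub(screen_lr, screen_ll))   # screen right axis
    vu = _normalized(_sub(screen_ul, screen_ll))   # screen up axis
    vn = _normalized(_cross(vr, vu))               # screen normal, toward viewer
    va = _sub(screen_ll, eye)                      # eye -> lower-left corner
    vb = _sub(screen_lr, eye)                      # eye -> lower-right corner
    vc = _sub(screen_ul, eye)                      # eye -> upper-left corner
    d = -_dot(va, vn)                              # eye-to-screen-plane distance
    return (_dot(vr, va) * near / d,               # left
            _dot(vr, vb) * near / d,               # right
            _dot(vu, va) * near / d,               # bottom
            _dot(vu, vc) * near / d)               # top
```

With the eye centered in front of the screen this yields a symmetric frustum; as the tracked head moves, the extents shift asymmetrically, which is what creates the "window into the scene" effect.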

Very recently, developments in the field of consumer sensor and display hardware have provided the means to implement cost-efficient multi-user immersive VEs. Examples include tracking sensors such as the Nintendo Wii Remote and the Microsoft Kinect, as well as display technologies such as the Sony HMZ-T1 or the Oculus Rift HMD. In this context, different multi-user immersive VEs in which users assume different roles have been realized in the scope of the 3DUI Contest 2012 [3DU12]. A similar multi-user approach using immersive and semi-immersive setups has been proposed by Beimler et al. [BBS13] for character animation.

3 Guided Virtual Tourism Setup

In this section we describe our collaborative virtual tourism setup. In the following subsections, we present the setups of the guide and the tourists and describe their roles.

3.1 Semi-Immersive Guidance Setup

The guide takes the role of a tour guide, i.e., the guide points out important sights, landmarks and other places of interest to users who play the role of tourists. The guide oversees the whole scene from a bird's eye point of view, as naturally supported by the tabletop setup (see Figure 1(a)). Since each tourist is represented in the scene by a corresponding avatar, the guide can overview the VE and interact with the scene and the tourists' avatars from this elevated vantage point [BC07]. In our setup, both roles, the tourists and the guide, are able to communicate with each other; for instance, if a tourist asks a spontaneous question, the guide can immediately react to the request. In this regard, the tour guide is able to direct the tourists to virtual landmarks while keeping track of the whole scene.

Setup: The physical setup of the guide is based on the SmurVEbox approach [FLBS12, BBS13]. The setup consists of a tabletop screen with dimensions of 62 cm × 112 cm that displays back-projected images via a mirror mounted at the bottom of the box.
The virtual scene is rendered on a Windows 7 workstation equipped with an Intel Core i7 3.40 GHz processor and an Nvidia Quadro 4000 graphics card. To render the virtual scene on different screens and displays, we made use of the proprietary Unity 3D game engine together with the MiddleVR for Unity framework (http://www.imin-vr.com/middlevr/). To provide one- or multi-finger touch capability on the surface, we made use of rear diffused illumination (Rear-DI) [MTS10]. A cluster of six high-power infrared LEDs is mounted at the bottom of the box to illuminate the surface from below. Since the surface consists of a diffusing material, it disperses the light, and reflections can be captured with a camera equipped with an infrared band-pass filter at the bottom of the setup. To detect the touch points captured by the camera, the video stream is evaluated by a modified version of NUI Group's Community Core Vision (CCV, http://ccv.nuigroup.com/). Touch information is transferred via the TUIO protocol (http://www.tuio.org/), which is commonly used with tangible multi-touch surfaces. In addition, we use the community edition of xtuio's unituio scripts library (http://www.xtuio.com/) to stream multi-touch gestures into Unity 3D.

Figure 2: (a) Photo of a tourist in front of a Microsoft Kinect sensor. (b) The tourist's virtual view on the HMD.

Interaction: Since the virtual scene is displayed to the guide on the tabletop as seen from an elevated vantage point, the guide can naturally interact with the miniature representations of the tourists in the minified virtual 3D city environment. To support intuitive and natural interaction, we mapped single-finger pan gestures and two-finger rotate gestures to translations and rotations of the virtual avatar, which provides the guide with the ability to move a tourist's avatar to any desired pose within the virtual city (see Figure 1(a)). We implemented this interaction in two modes: the guide can translate or rotate a tourist's avatar within the ground plane, or navigate the whole scene by panning or rotating the camera in the same plane. We distinguish between the modes by determining the touched object below the guide's finger when interacting with the tabletop.

3.2 Immersive Tourist Setup

The tourist is immersed in the guided virtual tour by donning an HMD in front of a Kinect sensor in a room-sized tracked space. The user perceives the virtual scene from a pedestrian's point of view, as if he or she were experiencing a city in the real world. The user is free to look around and explore the VE. For a virtual self-representation, we map the tracked skeleton information from the Kinect sensor to a rigged, bipedal character in the VE.
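The guide's touch interaction described in Section 3.1 (single-finger pan, two-finger rotate, and mode selection via the object under the finger) can be sketched as follows. This is an illustrative Python sketch of the underlying mapping only; the actual system receives TUIO events inside Unity 3D, and all function names here are our own assumptions.

```python
import math

def pick_mode(touched_object):
    """Mode selection: if the finger lands on a tourist's avatar, the
    gesture manipulates that avatar; otherwise it pans/rotates the
    guide's view of the scene."""
    return "avatar" if touched_object == "avatar" else "scene"

def pan_delta(p0, p1, scale=1.0):
    """One-finger pan: a screen-space drag from p0 to p1 mapped to a
    translation within the ground plane."""
    return ((p1[0] - p0[0]) * scale, (p1[1] - p0[1]) * scale)

def rotate_delta(a0, a1, b0, b1):
    """Two-finger rotate: the change in angle of the segment between the
    two finger positions (a, b), in degrees, mapped to a yaw rotation
    within the ground plane."""
    ang_before = math.atan2(b0[1] - a0[1], b0[0] - a0[0])
    ang_after = math.atan2(b1[1] - a1[1], b1[0] - a1[0])
    return math.degrees(ang_after - ang_before)
```

For example, holding one finger still while sweeping a second finger a quarter turn around it yields a 90-degree yaw applied to the avatar or the scene camera, depending on the selected mode.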

Setup: The physical setup is based on an Oculus Rift HMD with a resolution of 640 × 800 pixels per eye at 60 frames per second. In our current setup, the HMD is connected to the guide's rendering workstation, which renders the output screens for both a tourist and the guide using the MiddleVR framework. For the tourist's avatar we used a rigged, bipedal character, which we adopted from the Unity Asset Store and refined using the Autodesk Maya software by placing Joint-Deformer Objects to shape a bipedal skeleton (see Figure 1). For the real-time tracking of the tourist's body movements, we use a Microsoft Kinect sensor, as illustrated in Figure 2. The Kinect tracks users with an update rate of 30 frames per second. We stream the captured motion of the tourist's skeleton limbs into the Unity 3D engine and map them onto the tourist's avatar. To this end, we utilized the Kinect for Windows SDK (http://www.microsoft.com/en-us/kinectforwindows/) and the Kinect Wrapper Package by CMU's Entertainment Technology Center (http://wiki.etc.cmu.edu/unity3d/). As a result, the avatar moves according to the motions of the tracked tourist. The tourist's avatar has a virtual scene camera attached to its head node, which provides the user with an egocentric view.

Interaction: Since the user's body is tracked with the Kinect sensor, and the user's virtual view is slaved to the corresponding head node of the avatar, the user can navigate naturally in the virtual city by real walking. In particular, the user can move and turn towards locations of interest in the virtual scene from a pedestrian's perspective. However, the low-cost tracking currently imposes some limitations on natural interaction. In particular, the accuracy and precision of the tracked head node as provided by the Kinect sensor are very low. Moreover, the Kinect's resolution and our room-sized workspace impose limitations on the size of the virtual space that the user can interact within.
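The skeleton-to-avatar mapping described above can be sketched as follows. This is a simplified, language-agnostic Python illustration; the actual system performs the mapping inside Unity 3D via the Kinect Wrapper Package, and the joint/bone names and eye offset below are our own illustrative assumptions.

```python
# Hypothetical lookup from Kinect joint names to avatar bone names.
KINECT_TO_AVATAR = {
    "Head": "Head",
    "ShoulderLeft": "LeftArm",
    "ShoulderRight": "RightArm",
    "HandLeft": "LeftHand",
    "HandRight": "RightHand",
}

def map_skeleton_to_avatar(kinect_joints, bone_map=KINECT_TO_AVATAR):
    """Copy tracked Kinect joint positions onto the rigged avatar's bones,
    skipping joints the sensor failed to track in the current frame."""
    return {bone: kinect_joints[joint]
            for joint, bone in bone_map.items()
            if joint in kinect_joints}

def head_camera_position(avatar_bones, eye_offset=(0.0, 0.07, 0.09)):
    """Egocentric view: the scene camera is attached to the avatar's head
    node, i.e., its position is the head position plus a fixed eye offset."""
    hx, hy, hz = avatar_bones["Head"]
    ox, oy, oz = eye_offset
    return (hx + ox, hy + oy, hz + oz)
```

Because the camera follows the head node, noise in the tracked head position translates directly into view jitter on the HMD, which is why the low head-tracking precision noted above is a limitation.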
In order to travel longer distances, the tourist can make a request to the guide, who then moves the tourist within the virtual city.

4 Discussion

The guided tourism system described in Section 3 represents an early prototype that we developed with immersive and semi-immersive consumer-level VR hardware and software. So far we have not conducted a formal user evaluation of the system to show its advantages over traditional desktop-based guided tours. However, we observed several benefits and limitations during informal tests of the system. In particular, we observed that interaction with the virtual city model and the tourist's avatar via the tabletop setup provides an intuitive and direct method to move and guide a user through a virtual scene. We observed that users often interact with both hands using the common multi-touch gestures that we implemented on the tabletop: in particular, users often pan the virtual city view with their non-dominant hand, while interactions with the avatar are performed with their dominant hand at the same time. While this use of both hands seems to provide an intuitive form of interaction, interactions with the avatar could be simplified with additional gestures, such as a single-finger move-to gesture. On the other hand, we observed that the tourist setup leaves much room for improvement. While users can use the Kinect-based tracking to change the view on the HMD, the quality of the tracking data is very poor, resulting in many incorrectly identified body poses. More recent developments in the consumer market, such as the upcoming revised Kinect sensor, may help to alleviate these problems without raising the cost of the system out of proportion for multiple users. In this regard, while the current system has only been tested with one tourist, we see the potential to provide views to multiple collocated or remote tourists.

5 Conclusion

In this paper we introduced a setup for collaborative interactive 3D city exploration. We described a prototype in which we implemented the multi-user approach using consumer-level VR hardware and software. With users assuming the roles of a tour guide or tourists, the system allows users to explore virtual 3D cities in a collaborative manner. We observed that a semi-immersive touch-enabled tabletop environment is well suited for an interactive guide who views the VE from an elevated perspective, whereas an immersive pedestrian view is provided to a tourist via an HMD. We discussed initial observations of our prototype, and identified benefits and limitations. For future work, we see much potential for tourists to connect to virtual city environments using consumer-level immersive display and tracking hardware, either from their home systems or at interactive installations.
For interactive multi-user installations, such as those realized in museums or art galleries, we believe that incorporating a tour guide interface to steer and direct immersed users to points of interest will provide many benefits in terms of the tourists' sense of feeling present in an interactive virtual world.

References

[3DU12] 3DUI 2012 Contest. In Proceedings of the Symposium on 3D User Interfaces (3DUI). IEEE, 2012.

[BBS13] R. Beimler, G. Bruder, and F. Steinicke. SmurVEbox: A smart multi-user real-time virtual environment for generating character animations. In Proceedings of the Virtual Reality International Conference (VRIC), 7 pages, 2013.

[BC07] S. Burigat and L. Chittaro. Navigation in 3D virtual environments: Effects of user experience and location-pointing navigation aids. International Journal of Human-Computer Studies (IJHCS), 65(11):945–958, 2007.

[BKLP04] D. Bowman, E. Kruijff, J. LaViola, and I. Poupyrev. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2004.

[FLBS12] M. Fischbach, M. E. Latoschik, G. Bruder, and F. Steinicke. smartbox: Out-of-the-box technologies for interactive art and exhibition. In Proceedings of the Virtual Reality International Conference (VRIC), pages 1–7, 2012.

[MTS10] C. Müller-Tomfelde and J. Schöning. Building interactive multi-touch surfaces. In C. Müller-Tomfelde, editor, Tabletops – Horizontal Interactive Displays, Human-Computer Interaction Series, pages 27–49. Springer, 2010.

[RSH05] T. Ropinski, F. Steinicke, and K. H. Hinrichs. A Constrained Road-based VR Navigation Technique for Travelling in 3D City Models. In Proceedings of the International Conference on Artificial Reality and Telexistence (ICAT), pages 228–235, 2005.

[SHR06] F. Steinicke, K. H. Hinrichs, and T. Ropinski. A Hybrid Decision Support System for 3D City Planning. In Proceedings of the International Commission II Symposium (ISPRS), pages 103–108, 2006.

[SRH06] F. Steinicke, T. Ropinski, and K. H. Hinrichs. Collaborative Interaction Concepts for City Planning Tasks in Projection-based Virtual Reality Systems. In Proceedings of the Virtual Reality International Conference (VRIC), pages 113–122. IEEE, 2006.

[TGMY09] D. Thalmann, H. Grillon, J. Maim, and B. Yersin. Challenges in crowd simulation. In Proceedings of the International Conference on Cyberworlds, pages 1–12, 2009.