A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems

Frank Steinicke 1, Gerd Bruder 1, Harald Frenz 2

1 Institute of Computer Science, Westfälische Wilhelms-Universität Münster, Germany
{fsteini, g_brud01}@math.uni-muenster.de
2 Institute of Psychology II, Westfälische Wilhelms-Universität Münster, Germany
frenzh@psy.uni-muenster.de

In: Florian Probst and Carsten Keßler (Eds.): GI-Days 2007 - Young Researchers Forum. IfGIprints 30. ISBN: 978-3-936616-48-4

Abstract. In this paper we present a new multimodal locomotion user interface that enables users to travel through the 3D environments displayed in geospatial information systems (GISs), e.g., Google Earth or Microsoft Virtual Earth. With the proposed interface, geospatial data can be explored immersively using stereoscopic visualization on a head-mounted display (HMD). With appropriate tracking approaches, the user's entire body can be tracked in order to support natural traveling by real walking. Moreover, intuitive devices supporting both-handed interaction and simple gestures complete the navigation process. We introduce the setup as well as the associated interaction concepts.

1 INTRODUCTION

Exploration and visualization of geospatial data is of major importance in many areas, e.g., building evaluation, urban planning, and terrain exploration. Hence, various web-based GISs, e.g., Google Earth or Microsoft Virtual Earth, are available and on the rise. These products allow users from different domains to gain new perspectives on 3D geospatial data. For such geospatial applications, (semi-)immersive VR systems have proven to provide enormous potential: these environments give a realistic impression of complex datasets and allow users to virtually intrude into them (Dodge et al. 1998). Usually the immersion is supported by stereoscopic projection or immersive displays. However, most web-based GI applications do not natively support stereoscopy and interaction in immersive VEs, although different plug-ins are available.

In order to further increase the realism of exploring geospatial environments, it is essential to permit users at least to apply exploration paradigms similar to those used in the real world, or, even better, to provide more sophisticated approaches that overcome drawbacks and restrictions of the real world (Whitton et al. 2005). The most natural and intuitive way to get from place to place is to walk as a pedestrian.

Consequently, it is important to allow users to perform real walking in immersive VEs (Whitton et al. 2005). This can be done by tracking the user's movements, in particular the gait. However, VEs usually exceed the dimensions of the real environment in which motion can be tracked. To meet this challenge, various locomotion interfaces have been proposed, e.g., treadmills or step-in-place devices (Ishii et al. 2002). Still, the most natural way to realize the walking metaphor is to support real walking.

2 MULTIMODAL LOCOMOTION SETUP

HMDs are the standard display devices for immersive VR systems. These devices consist of two LCDs mounted in front of the user's eyes, giving a stereoscopic impression when the images on the LCDs show slightly different scenes (see Figure 1). Usually, orientation and acceleration sensors attached to the HMD measure changes of the user's head orientation. HMDs do not feature position trackers that provide the user's absolute or relative position in a tracking coordinate system. Optical tracking systems, however, allow the positions of certain trackable markers to be reconstructed. Due to line-of-sight restrictions, these systems are constrained to laboratory environments, where accuracy errors can be reduced below 1 mm. We use a stereo-based optical tracking setup with a tracking volume of about 10 m × 5 m × 3 m, within which the user can move. When both cameras capture markers attached to the user, e.g., to the head, hands, and feet, this information is applied in the virtual world: the head position is mapped to the position of the virtual camera, and transformations of the real hands and feet are mapped to the user's virtual avatar, allowing the user to see her virtual extremities.

Since the interaction volume is usually smaller than the virtual world through which users travel, further locomotion strategies have to be applied in order to realize navigation within the entire VE. For this purpose we exploit the Nintendo Wii remote in combination with the nunchuk, supporting both-handed interaction with simple gestures (see Figure 1). We associate control stick movements on the nunchuk with accelerated movements along the ground plane, while the control pad buttons on the remote controller are mapped to height changes or to motions along the view axis, providing a fly-to-view-direction approach. Furthermore, the buttons can be used to configure several settings, as described in Section 3.
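A minimal sketch may clarify the Wii-based travel mapping just described: the nunchuk stick steers movement along the ground plane, while the remote's control pad changes height or flies the camera along the view axis. All names and constants are illustrative assumptions, not the authors' implementation; the paper's "accelerated movements" are approximated here by a simple deflection-to-velocity mapping.

```cpp
#include <cmath>
#include <cstdio>

struct Camera {
    float x, y, z;   // position in world coordinates (y = height)
    float yaw;       // view direction around the vertical axis, in radians
    float pitch;     // view elevation, in radians
};

struct WiiInput {
    float stickX, stickY;   // nunchuk stick axes, each in [-1, 1]
    bool padUp, padDown;    // control pad: height changes
    bool padFwd;            // control pad: fly toward view direction
};

void updateCamera(Camera& cam, const WiiInput& in, float dt) {
    const float speed = 4.0f;   // assumed ground-plane speed at full deflection (m/s)
    const float climb = 2.0f;   // assumed vertical speed (m/s)
    const float fly   = 6.0f;   // assumed fly-to-view speed (m/s)

    // Nunchuk stick: translation along the ground plane, relative to the view.
    float fwd    = in.stickY * speed * dt;
    float strafe = in.stickX * speed * dt;
    cam.x += std::sin(cam.yaw) * fwd + std::cos(cam.yaw) * strafe;
    cam.z += std::cos(cam.yaw) * fwd - std::sin(cam.yaw) * strafe;

    // Control pad: height changes.
    if (in.padUp)   cam.y += climb * dt;
    if (in.padDown) cam.y -= climb * dt;

    // Control pad: fly along the full 3D view axis ("fly-to-view-direction").
    if (in.padFwd) {
        cam.x += std::sin(cam.yaw) * std::cos(cam.pitch) * fly * dt;
        cam.y += std::sin(cam.pitch) * fly * dt;
        cam.z += std::cos(cam.yaw) * std::cos(cam.pitch) * fly * dt;
    }
}

int main() {
    Camera cam{0.0f, 1.8f, 0.0f, 0.0f, 0.0f};
    WiiInput in{0.0f, 1.0f, false, false, false};  // stick pushed fully forward
    for (int frame = 0; frame < 60; ++frame)       // simulate one second at 60 Hz
        updateCamera(cam, in, 1.0f / 60.0f);
    std::printf("camera after 1 s: (%.2f, %.2f, %.2f)\n", cam.x, cam.y, cam.z);
}
```

In a real setup these axis and button values would be read from the Wii remote rather than hard-coded, and the result would drive the virtual camera described in Section 3.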

3 LOCOMOTION USER INTERFACE

We have developed an interscopic user interface framework (Steinicke et al. 2007) that can capture the 3D content of any graphics application based on OpenGL or DirectX, e.g., Google Earth or Microsoft Virtual Earth. The 3D content can be modified and processed arbitrarily; for example, the scene can be rendered twice for stereoscopy. Moreover, we can manipulate certain parameters of the virtual camera with respect to the tracked user's input. These manipulations are generic in the sense that they work across different 3D graphics applications, independently of each application's ordinary user interface. For instance, Google Earth natively supports neither stereoscopic projection on an HMD nor interaction via optical tracking systems or a Wii controller (although third-party plug-ins are available), but our framework provides full control over the world displayed in Google Earth.

In order to enable appropriate VR-based scene exploration, the application's coordinate system and the tracking coordinate system must be calibrated. When a change of the user's position is tracked, the application's virtual camera is moved with respect to the changed position, i.e., when the user moves straight ahead, left, right, or backwards, the camera is moved accordingly. When the user looks around, the virtual camera is rotated analogously, providing a look-around capability. Small distances or height changes can be realized by walking or head movements; larger distances can be covered via the Wii controller, as described in Section 2.

When mapping the movements of the user to camera motion, different strategies may be applied. The movements can be mapped one-to-one, i.e., if the user moves one meter in the tracking coordinate system, this movement is mapped to a motion of one meter in the corresponding direction of the scene camera. In order to allow the user to explore a larger region using walking or head movements only, this mapping can be scaled. We have tested factors up to a value of 15, which still gives the user a good mechanism to explore geospatial data, in particular when the objects to be explored are far away. The scaling factor can be configured manually using certain buttons on the Wii controller. Using a scaled motion mapping for a longer period results in an adaptation by the user; hence, even larger scaling factors might be appropriate, and users can come to sense them as accustomed mappings (Freundschuh et al. 1997).
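The scaled motion mapping lends itself to a compact sketch: each tracked head displacement is multiplied by a configurable gain before being applied to the scene camera, with a gain of 1 reproducing the one-to-one mapping and 15 being the largest factor reported as usable above. The types and names below are assumptions for illustration, not the framework's API.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

// Maps tracked head displacements to camera translations with a scaling gain.
struct MotionMapper {
    float gain = 1.0f;   // 1.0 = one-to-one; adjustable via Wii controller buttons
    Vec3 prevHead{};     // last tracked head position (tracking coordinates)

    // Returns the camera translation for the newly tracked head position.
    Vec3 map(const Vec3& head) {
        Vec3 delta{head.x - prevHead.x, head.y - prevHead.y, head.z - prevHead.z};
        prevHead = head;
        return {delta.x * gain, delta.y * gain, delta.z * gain};
    }
};

int main() {
    MotionMapper mapper;
    mapper.prevHead = {0.0f, 1.7f, 0.0f};
    mapper.gain = 15.0f;                         // largest factor tested in the paper
    Vec3 step = mapper.map({1.0f, 1.7f, 0.0f});  // user walks 1 m along x
    std::printf("camera moves (%.1f, %.1f, %.1f) m\n", step.x, step.y, step.z);
    // With gain 15, a 1 m real step moves the camera 15 m in the scene.
}
```

Calibration between the tracking and application coordinate systems would amount to one additional rigid transform applied to each head position before this mapping.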

Figure 1: (left) Multimodal locomotion setup consisting of HMD, optical tracking system, and Wii. The projection wall illustrates the user's view. (right) Photograph of the same building.

Figure 1 (left) illustrates the described procedure. The user wears the HMD and perceives a detailed model of the castle of Münster. Figure 1 (right) shows a real image of the same scene. The user can explore the castle using multimodal approaches: turning the head, walking around on foot, or moving via the Wii controller in combination with the nunchuk. If a corresponding mapping is applied, the user is able to virtually walk around the virtual castle, although the tracking volume is restricted to a clearly smaller region.

4 DISCUSSION

In this paper we have presented a multimodal locomotion interface for immersive 3D geospatial information systems. The approach combines walking with both-handed interaction using different mapping strategies. The setup has been tested with different application scenarios in the context of web-based geospatial graphics services. Since walking has proven to be the most intuitive navigation concept, further locomotion and mapping strategies will be developed that increase the area comfortably reachable by real walking.

REFERENCES

Dodge, M., S. Doyle, A. Smith and S. Fleetwood (1998). "Towards the Virtual City: VR & Internet GIS for Urban Planning." In Workshop on Virtual Reality and Geographical Information Systems.

Freundschuh, S. M. and M. J. Egenhofer (1997). "Human Conceptions of Space: Implications for Geographic Information Systems." Transactions in GIS, 2(4): 361-375.

Ishii, M., M. Sato and L. Bouguila (2002). "Realizing a New Step-in-Place Locomotion Interface for Virtual Environment with Large Display System." In Proceedings of the Workshop on Virtual Environments (ACM International Conference Proceeding Series), 197-207.

Steinicke, F., T. Ropinski, G. Bruder and K. Hinrichs (2007). "Interscopic User Interface Concepts for Fish Tank Virtual Reality Systems." In Proceedings of IEEE Virtual Reality, 27-34.

Whitton, M., J. Cohn, J. Feasel, P. Zimmons, S. Razzaque, S. Poulton, B. McLeod and F. Brooks (2005). "Comparing VE Locomotion Interfaces." In Proceedings of IEEE Virtual Reality, 123-130.