Physical Presence in Virtual Worlds using PhysX


One of the biggest problems with interactive applications is how to draw the user into the experience, suspending their disbelief so that they are completely immersed in the virtual world created for them. After receiving the extremely powerful SDK and accelerator boards from Ageia, we set out to use this technology to bring the user one step closer to total immersion.

University of Michigan 3D Lab
http://um3d.dc.umich.edu
Gabriel Cirio
Eric Maslowski

Virtual Reality and the CAVE

Since we wanted to create the most immersive and believable experience possible, we decided to develop initially for the CAVE. The CAVE is an advanced virtual reality system consisting of a 10'x10' room where each wall is a stereoscopic 3D screen similar to an IMAX theatre. Each wall is driven by its own dedicated computer, with one master computer that tells each node what to draw and when. Since the CAVE is driven by many independent computers, each with its own internal clock and timing, we faced the early challenge of making the physical responses consistent and predictable across the entire cluster. The slightest discontinuity would be immediately noticeable and jarring for the user, breaking the whole experience. Without a perfectly deterministic system, one wall of the CAVE would show one interpretation of the scene while another wall showed something entirely different. With the stability of PhysX, and the help of OpenSG, we were able to build a system that is completely deterministic and consistent under the highest application load. With the foundation in place, we started to look at other uses of PhysX in a virtual world and how it could be used with the CAVE... (breakdown of the different walls of the CAVE in the actual demo)
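A common building block for this kind of cluster-wide determinism is a fixed-timestep stepper: every node consumes time in identical slices, so all nodes run the same number of physics substeps and stay in lockstep regardless of their local frame timing. The sketch below (names and structure our own, not the lab's actual code) shows the idea:

```cpp
#include <cassert>

// Fixed-timestep stepping: every node in the cluster uses the same dt
// and the same input stream, so the substep count (and therefore the
// simulated state) advances identically on every wall's computer.
struct FixedStepper {
    double accumulator = 0.0;   // unconsumed wall-clock time
    const double dt;            // identical on every node, e.g. 1/60 s
    long stepCount = 0;         // advances identically cluster-wide

    explicit FixedStepper(double fixedDt) : dt(fixedDt) {}

    // Returns how many physics substeps to run for this render frame.
    int advance(double frameTime) {
        accumulator += frameTime;
        int steps = 0;
        while (accumulator >= dt) {
            accumulator -= dt;
            ++steps;
        }
        stepCount += steps;
        return steps;
    }
};
```

Two nodes rendering at different rates (say, 30 Hz and 60 Hz) still execute the same total number of substeps over the same interval, which is what keeps each wall showing the same interpretation of the scene.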

Keeping Track of the User

Often when people talk about immersive displays such as the CAVE, the question of tracking comes up. Since the walls of the CAVE are oriented at 90-degree angles, it is necessary to know where the user is at any point in time so that the images created for each wall can compensate. (photo taken from the user's perspective showing how the image for each wall is adjusted for the user's viewpoint; use anaglyph (red/cyan) glasses to see in stereo) To accomplish this, we use a Vicon motion capture system with eight cameras placed around the CAVE for complete coverage. By placing special reflective markers on the user's glasses, we can tell the computer exactly where the user is and what they are looking at. The ability to track anything inside the CAVE (not just the user's head) created endless possibilities with PhysX, but also exposed how limited traditional systems are. One of the biggest problems with physics-centric worlds is how the user interacts with them. First, you have the user walking through a scene with physical objects; then you have the situation where the user wants to interact with a physical object. To address the problems of a user navigating through a scene, combined with the fact that the user can physically walk around in the CAVE, we've introduced a new form of character controller which offers a great sense of immersion and freedom...
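Adjusting each wall's image for the tracked head is the standard head-tracked (off-axis) projection for a fixed screen: the frustum is asymmetric and must be recomputed from the eye position every frame. A minimal sketch, assuming a wall in the plane z = wallZ spanning [xl, xr] x [yb, yt] (our own formulation, not the lab's code):

```cpp
#include <cassert>
#include <cmath>

// Off-axis frustum for one fixed CAVE wall: the wall's extents are
// projected onto the near plane, scaled by nearZ over the eye-to-wall
// distance. The eye position (ex, ey, ez) comes from the tracker.
struct Frustum { double left, right, bottom, top; };

Frustum wallFrustum(double xl, double xr, double yb, double yt,
                    double wallZ, double ex, double ey, double ez,
                    double nearZ) {
    double dist = ez - wallZ;      // eye-to-wall distance (eye in front)
    double s = nearZ / dist;       // scale wall extents onto near plane
    return { (xl - ex) * s, (xr - ex) * s,
             (yb - ey) * s, (yt - ey) * s };
}
```

Feeding these extents to a glFrustum-style projection keeps the image geometrically correct as the user walks around: with the eye centered the frustum is symmetric, and it skews as the head moves off-axis.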

Character Controller

One of the CAVE's strengths is the perceived freedom given to the user. Since the images for each wall are constantly updated for the user's point of view, the user can walk around virtual objects, kneel down and look underneath them, and generally see the world as it would be in real life. We've taken these same principles and extended them to the physical presence of the user. The character controller we've implemented addresses the shortcomings often found in traditional systems and keeps the system stable and predictable. One of the key features of the controller is the use of real mass and physical properties to define the user. This introduces a great sense of realism by giving the sensation that an object is light or heavy according to how easy it is to move. Imagine knocking over a virtual cabinet that pushes you to the side, or standing in a PhysX river and being carried away by the current in a life-size virtual world. The controller brings the experience one step closer to full immersion by allowing the user to act on and react to the virtual world, all while freely moving inside the CAVE. To achieve this, we developed a system based on dynamic bodies connected by a series of joints, which gave us complete control over the user's changing height and location. The system is composed of multiple capsules that adjust their position based on the user's head height, allowing for a very stable, resizable character controller. The system has also been designed for easy expansion to include distinct arms, legs, etc. which match the user's actual body, allowing for incredible realism inside the virtual world...
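The resizing rule can be illustrated with a single capsule refitted each frame from the tracked head height (a simplification of the multi-capsule, jointed system described above; names are our own):

```cpp
#include <cassert>
#include <cmath>

// Resizable controller sketch: one capsule that must reach from the
// floor (y = 0) up to the tracked head, shrinking as the user crouches.
struct Capsule {
    double radius;      // fixed torso radius
    double halfHeight;  // half-length of the cylindrical section
    double centerY;     // capsule center above the floor
};

Capsule fitToHead(double headY, double radius) {
    double half = headY / 2.0 - radius;  // spherical caps use 'radius' at each end
    if (half < 0.0) half = 0.0;          // deep crouch: degenerate to a sphere
    return { radius, half, headY / 2.0 };
}
```

In the real system each substep would re-derive the capsule dimensions from the Vicon head marker, so kneeling or jumping immediately changes the user's physical footprint in the scene.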

In addition to how the user moves through the virtual world, how they interact with it is equally important. Most applications rely on a joystick or mouse for the user's input. Since we are able to track arbitrary objects inside the CAVE, we used this flexibility to track the user's hands in addition to their head. Wearing special gloves and attaching a kinematic actor to the user's hand position gives the user a method of natural interaction with the virtual world. By using kinematic actors, every motion is automatically translated by PhysX into a force: the faster one swings their arm, the stronger the force applied to objects, adding more realism to the experience. Users can grab objects, move them, and throw them, all by doing what comes naturally. Users no longer pass through virtual objects, and actually develop an unconscious awareness of their virtual presence in the world that surrounds them. One possible use we will be exploring for this is in our training simulators for first responders. When an officer approaches what could be the site of massive casualties, they have to be aware of many things. Are there any victims, and who needs immediate attention? Are there additional immediate risks to bystanders or victims, such as structural instabilities or secondary devices? Previously we were only able to train the officer on visual cues, but now we can test their methods as well. We can have buildings that are structurally unsound. We can have convincing smoke, using particles that billow and fold around existing objects, hindering the officer's vision. We can place victims and secondary devices under debris, forcing the officer to go through actual techniques used in the field. All of this helps bring the officer deeper into the simulation, immersing their senses and preparing them for when lives are actually at risk. (lifting panels in the virtual world; use anaglyph (red/cyan) glasses to see in stereo)
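The "faster swing, stronger force" behavior falls out of moving a kinematic body to each tracker sample: the body acquires a velocity, and contacts transfer momentum proportional to it. A sketch under our own simplifying assumptions (finite-difference velocity, a perfectly plastic contact model; this is not the engine's actual contact solver):

```cpp
#include <cassert>
#include <cmath>

// Tracked hand samples drive a kinematic actor; the engine turns its
// motion into forces on contact, so the imparted impulse grows with
// hand speed.
struct Vec3 { double x, y, z; };

// Finite-difference velocity between two tracker samples dt seconds apart.
Vec3 handVelocity(const Vec3& prev, const Vec3& curr, double dt) {
    return { (curr.x - prev.x) / dt,
             (curr.y - prev.y) / dt,
             (curr.z - prev.z) / dt };
}

// Impulse magnitude on a resting object of mass m if it simply takes on
// the hand's velocity (plastic-contact sketch): |J| = m * |v|.
double impulseMagnitude(double mass, const Vec3& v) {
    return mass * std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
}
```

Doubling the swing speed doubles the impulse, which is why a flicked wrist nudges an object while a full swing sends it flying.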

Virtual Graphical User Interface

To further extend our system and completely remove the need for traditional input devices, we developed a series of controls and interfaces within the virtual world for the user to interact with. Since the main goal of the CAVE is complete immersion, we wanted the user to forget where the walls are and believe they were in the virtual world, even if only for a moment. Traditional, screen-aligned GUI methods would immediately highlight where the screens are, breaking the entire illusion. Thus, we decided to bring the user interface into the virtual world as true 3D objects. This Virtual Graphical User Interface (VGUI) is presented as a platform surrounding the virtual body of the user. On the platform sit several objects (the buttons) with distinct shapes, distributed in a layout comfortable for the user and within reach of their virtual hands. By using unique properties of the character controller, we are able to keep the VGUI at a comfortable distance from the user regardless of their position or orientation in the CAVE. Whether we crouch, fall, or jump, the VGUI follows us. The buttons rely on PhysX triggers for their functionality, keeping the VGUI consistent with other physical objects in the virtual world. Each button can only be activated by the user's hand(s), with the possibility of using both hands for some advanced interactions. This provides a very natural response for the user without the need for external devices. While we chose to represent the VGUI as a platform around the user, it could be anything the developer wants, as the VGUI uses the same descriptions as other physics objects in the engine. (Spraying particles in the scene: left hand used to start/stop the spray, right hand used for the direction of the spray)
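The two ingredients above — a button that follows the user and fires when a hand enters its volume — can be sketched as follows. The names and the axis-aligned box test are our illustrative assumptions; the actual buttons use PhysX trigger shapes:

```cpp
#include <cassert>
#include <cmath>

// VGUI button sketch: a trigger volume whose world position is the
// user's position plus a fixed local offset (so the platform follows
// the user), firing when the tracked hand is inside its box.
struct Vec3 { double x, y, z; };

bool buttonTriggered(const Vec3& userPos, const Vec3& localOffset,
                     const Vec3& halfExtents, const Vec3& handPos) {
    Vec3 c = { userPos.x + localOffset.x,   // button follows the user
               userPos.y + localOffset.y,
               userPos.z + localOffset.z };
    return std::abs(handPos.x - c.x) <= halfExtents.x &&
           std::abs(handPos.y - c.y) <= halfExtents.y &&
           std::abs(handPos.z - c.z) <= halfExtents.z;
}
```

Because the trigger is repositioned from the controller's pose every frame, the same reach gesture works wherever the user stands in the CAVE.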

3D Studio Max Pipeline

(Cellar environment and associated physical bodies: red = static, blue = dynamic) With such a complex system at our disposal came the question of generating content. The 3D Studio Max scripts developed by Ageia and Marcus Storm were phenomenal. They allowed us to build an environment that would have been nearly impossible otherwise. We never expected to create a full environment with full physical response in such a short time, but the scripts' stability and their power to preview the simulation in Max made such a project a pleasure. We therefore designed our system so that creating and navigating new scenes requires no programming, and any new feature or specific interaction requires only a few lines of code. Our system is art-path centric, both geometry- and physics-wise; even the VGUI follows these rules. This allows artists to create content and use the system quickly and efficiently. (Outside environment and associated physical bodies: red = static, blue = dynamic, green = trimesh)

Future Work and Credits

Our main focus will be to keep integrating features of the SDK into our system. Along with rigid bodies and joints, we have fluids working across the cluster; our next goal is the integration of soft bodies and cloth. We are keeping an eye open and are very excited for any new features that Ageia will release in their SDK. We also plan to extend our character controller to a full virtual body, with full tracking and physical interaction. To increase our physics computation power and improve the immersive capability of our system, we will experiment with the hardware cards in SLI mode. Hardware-accelerated computation allowed us to run physics in parallel with other subsystems and before the actual rendering, thereby getting rid of the "one-frame delay" issue inherent in a system where physics is computed while rendering is done. This is crucial for a real-time immersive environment like the CAVE: a one-frame delay is noticeable even at high frame rates, and is a huge drawback for realism and user immersion. No other physics system would have provided such a clean, simple, and scalable solution to this problem. For real-world applications, we will be improving our Virtual Disaster Simulator and other applications related to training first responders. However, the possibilities are endless, and our ideas run far beyond what has been included here, including applications in the entertainment industry. We look forward to what Ageia comes up with next and hope to continue our close relationship for many years.

Special Acknowledgements
Greg Stoner and Bob Whitecotton for the donation of Ageia boards
Marcus Storm for the great 3D Studio Max scripts
Andrew Hamilton, who donated portions of the cellar scene for the demo