Immersive Real Acting Space with Gesture Tracking Sensors


pp. 1-6, http://dx.doi.org/10.14257/astl.2013.39.01

Immersive Real Acting Space with Gesture Tracking Sensors

Yoon-Seok Choi 1, Soonchul Jung 2, Jin-Sung Choi 3, Bon-Ki Koo 4 and Won-Hyung Lee 1*

1,2,3,4 Creative Content Research Laboratory, ETRI
1* Graduate School of Advanced Imaging Science, Multimedia & Film, Chung-Ang University
ys-choi@etri.re.kr, s.jung@etri.re.kr, jin1025@etri.re.kr, bkkoo@etri.re.kr, whlee@cau.ac.kr

Abstract. In this study, we realized a real-space-based virtual aquarium with a multi-view function that provides images to users and observers at the same time. A virtual reality system needs natural and intuitive interfaces to enhance users' immersion. We attach markers to users and to camera devices in a real space built at the same 1:1 scale as the virtual space, collect their motion information, and reflect it in real time to generate virtual-world images, which are transmitted to the user's immersive display device. The system also lets observers share the experience: the user is filmed in the real space with a camcorder carrying motion tracking markers, and observers are shown a third-person synthetic view that combines the virtual scene with the live footage of the user. To this end, the system provides marker-based motion tracking, user gesture recognition, real-time composition with live video, and multi-view output, realizing a more intuitive and natural way to interact with virtual spaces. These functions can serve in the construction of motion-based experiential systems, which are attracting growing interest.

Keywords: Virtual reality, Real action, Motion sensor, Gesture recognition, Real-time animation

1 Introduction

The computer graphics industry, driven by the rapid development of GPU-based graphics cards, has delivered images of remarkable quality since it came into full use in the entertainment field.
Users feel satisfaction as their characters walk freely through virtual spaces and carry out their missions, owing to the combination of virtual reality, a computer graphics technology, with games. However, natural immersion is hard to achieve with joystick-based interfaces, so user interfaces have evolved toward richer interaction with virtual reality, and virtual-reality content has accordingly moved to interfaces that use various sensor technologies for bodily interaction. Nintendo's Wii remote lets actual users arrange virtual environments through the motions of the hand that grabs it, going beyond the simple existing joystick, and Microsoft's Kinect, unlike earlier interfaces, carries a camera module that senses users' motions as a motion capture device, on the basis of which games run. However, because their sensors sit at a fixed position, they face spatial limitations and fall short of capturing near-natural free motion. Motion capture equipment was therefore adopted to express precise and natural character motions in virtual reality, and building systems around such equipment became easier as its price dropped. We obtain user and camera motion information from a marker-based optical motion tracking sensor and, based on it, provide images that reflect interactions with the virtual space according to the user's gaze movements and gestures, giving the user an optimal virtual reality experience. Also, so that observers can perceive the interaction between the user and the virtual world, we realized a real-space-based virtual reality system with a multi-view function that provides images to users and observers at the same time.

ISSN: 2287-1233 ASTL, Copyright 2013 SERSC

The paper is organized as follows. Section 2 introduces virtual reality technologies and experience systems based on them. Section 3 describes the composition of the whole proposed system and the basic method to realize it. Section 4 concludes the paper.

2 Previous studies

2.1 Virtual reality and interactions

Virtual reality (VR), the technology that uses computers to provide a specific environment resembling the real one, stimulates users' five senses to give them spatial and temporal experiences similar to the real world [3]. Users are not only immersed in virtual reality but can also interact with it through various interfaces.

VR can be classified into three types: monitor-based VR, projection-based VR, and head-based VR [3]. In head-based VR, users wear equipment such as head mounted displays. In contrast, in monitor- and projection-based VR the screen itself is fixed, and the displayed view changes in accordance with the movement of the user's gaze. VR is used in many fields of the entertainment industry, and its utility is especially prominent in edutainment, which combines education and entertainment. Visitors to real aquariums would feel far more if placed underwater themselves, surrounded by great numbers of fish and dozens of kinds of water plants, with scenes that move according to their own actions. For this reason, spaces like aquariums and undersea vehicles are often realized as virtual spaces [1] [2] [6] [7]. What matters in VR is letting users interact with virtual spaces and the objects in them by controlling their avatars naturally and effectively. Interactions with virtual spaces are classified into three types: manipulation, navigation, and communication [3]. Some systems began to support user gestures. Takala et al. [4] proposed a virtual aquarium system adopting a gesture sensing system. Users take a swimming

motion to move forward in the virtual space. Virtual-space experiences through such real actions are simpler, more intuitive, and more natural [5].

3 The proposed system

3.1 Composition of the system

To reflect users' actual motions without alteration, we set up a 7 m x 7 m area in which motions can be tracked by motion tracking equipment, inside an actual 10 m square space, and painted the floor and walls green for real-time image composition for the observers' view. Two servers are used for the virtual aquarium system, one for motion tracking and one for images. The OptiTrack system by NaturalPoint was adopted for motion tracking; OptiTrack S250e cameras and its tracking tools track the motions of users and camcorders. The motion tracking server transmits these motions to the image server over the network. The image server recognizes the user's gestures from the information of the markers attached to both of the user's hands and generates user-view and audience-view images at 720p resolution [Fig. 1]. The user can walk about freely in the virtual aquarium because the generated user-view images are transmitted to his or her HMD through a wireless HDMI transmitter. A Sony PMW-F3K camcorder captures 720p live images, which are fed to the image server's capture board through BNC cables; the foreground extracted from them in real time is then combined with the virtual reality images for the observers' view.

Fig. 1. Real acting studio and user configuration for gesture tracking

3.2 Generation of virtual aquarium scenes

The system generates two kinds of images so that not only the users participating in the virtual reality but also observers can share the virtual aquarium experience. Ogre3D, an open source game engine, is used for scene generation.
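The gesture recognition performed by the image server is, as noted in Section 3.2, based on Procrustes analysis [8], which makes matching invariant to where, how large, and at what angle a motion is performed. The following is a minimal sketch of that idea for hand-marker trajectories, assuming trajectories already resampled to a fixed length; the function and template names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def procrustes_distance(traj, template):
    """Procrustes distance between two same-length point sequences (N x d).

    Both sequences are translated to their centroid and scaled to unit
    Frobenius norm, then one is optimally rotated onto the other, so the
    result is invariant to the location, size, and orientation of the motion.
    """
    def normalize(x):
        x = np.asarray(x, float)
        x = x - x.mean(axis=0)          # remove location
        return x / np.linalg.norm(x)    # remove scale
    a, b = normalize(traj), normalize(template)
    # Orthogonal Procrustes: rotation R minimizing ||a @ R - b||_F
    u, _, vt = np.linalg.svd(a.T @ b)
    r = u @ vt
    return np.linalg.norm(a @ r - b)

def classify_gesture(traj, templates):
    """Return the name of the template nearest to traj in Procrustes distance."""
    return min(templates, key=lambda name: procrustes_distance(traj, templates[name]))
```

Because both trajectories are centered, rescaled, and optimally rotated before comparison, the same gesture performed anywhere in the 7 m x 7 m space, at any size or angle, yields the same distance, matching the invariance claimed in the conclusion.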
For the user's images, the location and rotation information obtained by tracking the markers attached to the HMD worn on the user's head is transmitted to the virtual aquarium system in real time; images are generated on the basis of this information and returned to the user [Fig. 2]. Observers who do not take part in the virtual aquarium system can watch the virtual reality through the observers' view. They are not simply shown the same images the user sees; instead, to show how the user interacts with the virtual space, they receive a real-time composite of the virtual scene and the live footage of the user. [Fig. 3] shows the steps for generating the observers' view. [Fig. 3] (a) shows the filming of the live images, and [Fig. 3] (b) shows a frame taken by the camcorder. [Fig. 3] (c) shows how the virtual scene is rendered using the location information of the camcorder, obtained by tracking the markers attached to it, and [Fig. 3] (d) is the resulting image rendered with that camera pose. [Fig. 3] (e), the observers' view, is finally generated by compositing [Fig. 3] (b) and [Fig. 3] (d) in real time.

Fig. 2. User's view through wireless HMD

Fig. 3. Observers' view through real-time image composition
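The final compositing step, extracting the user from the green studio (Section 3.1) and pasting the foreground over the scene rendered at the camcorder's pose, can be sketched as a simple chroma key. This is a hard binary matte for brevity, with an assumed threshold and hypothetical function names; a production keyer would use a soft alpha and spill suppression.

```python
import numpy as np

def chroma_key_composite(camera_rgb, virtual_rgb, green_thresh=40):
    """Composite camcorder footage over a rendered scene via green-screen keying.

    A pixel is treated as green-screen background when its green channel
    exceeds both red and blue by `green_thresh`; every other pixel is kept
    as foreground (the user) and pasted over the virtual aquarium frame.
    """
    cam = camera_rgb.astype(np.int16)  # signed to allow channel differences
    g_dominant = (cam[..., 1] - cam[..., 0] > green_thresh) & \
                 (cam[..., 1] - cam[..., 2] > green_thresh)
    matte = ~g_dominant                # True where the user (foreground) is
    out = virtual_rgb.copy()
    out[matte] = camera_rgb[matte]     # paste foreground over the virtual scene
    return out
```

Since both the camcorder frame and the virtual render use the same tracked camera pose, the keyed user appears correctly placed inside the virtual aquarium in the observers' view.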

Fig. 4. Gestures for interaction with fish

A total of 30 species of fish and 10 species of seaweed were produced, and normal and light maps were assigned to each fish to raise the visual quality. To convey an undersea feeling inside the aquarium, an optical-effect shader adds god-ray and wave effects, and more than 300 objects swim in the water to fill the scene richly. Ten gestures were defined for interaction with the fish in the virtual aquarium. When the user makes a gesture with the right or left hand as in [Fig. 4], the fish move according to the relevant gesture [Fig. 5]. Gestures are recognized based on Procrustes analysis [8].

Fig. 5. Fish movement according to each gesture

4 Conclusion

We constructed a virtual aquarium that projects motions from reality and supports the following functions.

- Natural real-acting navigation with motion sensors. A user can move as if moving in a real space; the system tracks the user's movements with the motion sensor and projects the motion information into the virtual space without modification.

- Observers' view support. Not only the user participating in the virtual space but also surrounding observers can experience it: the system delivers the user's interaction with the objects in the virtual space to observers without modification.

- Interaction by 3D gestures. Rather than requiring special equipment for interaction with the virtual space, the system senses the user's gestures so that virtual spaces can be experienced naturally. Through Procrustes analysis, recognition is designed to be unaffected by the spatial location, size, and direction of the motions users make.

With these merits, the system provides an experiential service in which users and observers alike are more immersed and engaged.

Acknowledgment. This work was supported by the IT R&D program of MCST/MKE/IITA [2008-F-030-02, Development of Full 3D Reconstruction Technology for Broadcasting Communication Fusion].

References

1. Torsten Frohlich. The virtual oceanarium. Communications of the ACM, 43(7):94-101 (2000).
2. Hyun-Cheol Lee, Eun-Seok Kim, Nak-Keun Joo, and Gi-Taek Hur. Development of real time virtual aquarium system. International Journal of Computer Science and Network Security, 6(7):58-63 (2006).
3. William R. Sherman and Alan B. Craig. Understanding Virtual Reality: Interface, Application, and Design. Morgan Kaufmann (2002).
4. Tapio Takala, Lauri Savioja, and Tapio Lokki. Swimming in a virtual aquarium, http://www.academia.edu/2744573/Swimming_in_a_Virtual_Aquarium (2005).
5. Martin Usoh, Kevin Arthur, Mary C. Whitton, Rui Bastos, Anthony Steed, Mel Slater, and Frederick P. Brooks Jr. Walking > walking-in-place > flying, in virtual environments. In International Conference on Computer Graphics and Interactive Techniques, 359-364 (1999).
6. G. Wetzstein and P. Stephenson. Towards a workflow and interaction framework for virtual aquaria. In Virtual Reality for Public Consumption, IEEE Virtual Reality 2004 Workshop (2004).
7. Greg T. Young, Marcus Foth, and Natascha Y. Matthes. Virtual fish: visual evidence of connectivity in a master-planned urban community. In Proceedings of the 19th Australasian Conference on Computer-Human Interaction: Entertaining User Interfaces, 219-222 (2007).
8. John C. Gower and Garmt B. Dijksterhuis. Procrustes Problems. Oxford Statistical Science. Oxford University Press (2004).