Mixed and Augmented Reality Reference Model as of January 2014

Mixed and Augmented Reality Reference Model as of January 2014. 10th AR Community Meeting, March 26, 2014. Author, Co-Chair: Marius Preda, TELECOM SudParis, SC29. Presented by Don Brutzman, Web3D Consortium and NPS.

Definition and architecture

Definition: direct perception vs. AR as computer-mediated perception.

MAR focus

Global Architecture (diagram): AR Sensors / Actuators, AVH Display / UI, MAR Engine, MAR Scene Descriptions, Services, Additional Media.

Enterprise Viewpoint (diagram): annotates the global architecture with the stakeholder roles that provide each part: Device Manufacturer (DM), Device Middleware/Component Provider (DMCP), End-User Profile (EUP/EU), Service Middleware/Component Provider (SMCP), AR Service Provider (ARSP), Telecommunication Operator (TO), AR Authoring Tools Creator (ARATC), AR Experience Creator (AREC), Content Creator (CC) and Content Aggregator (CA). Roughly: devices (sensors/actuators, AVH display/UI) and device middleware come from the DM and DMCP, MAR scene descriptions from the AREC, ARATC and CC, services from the ARSP and SMCP, additional media from the CC and CA, with the TO providing the connecting links.

Computational Viewpoint (diagram): Real World AVH Capture, Actuator Device, AVH Display / UI, Recognizer/Tracker, Context Mapper, Spatial Mapper, Scene Graph Engine, AVH Renderer (together forming the MAR Engine), MAR Scene Descriptions, Services, Additional Media.
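
The computational viewpoint suggests a simple processing loop: real-world capture feeds the recognizer/tracker, the spatial/context mappers turn its events into scene transforms, the scene graph engine updates the MAR scene, and the AVH renderer drives the display. Below is a minimal Python sketch of that wiring; every class and method name is invented for illustration, since the reference model does not define an API.

```python
# Illustrative wiring of the MAR computational viewpoint.
# All names are hypothetical; the reference model does not prescribe an API.
from dataclasses import dataclass
from typing import Protocol, Optional, List

@dataclass
class Frame:                      # output of real-world AVH capture
    pixels: bytes
    timestamp: float

@dataclass
class TrackingEvent:              # output of the recognizer/tracker
    target_id: str
    pose_6d: List[float]          # x, y, z, roll, pitch, yaw
    confidence: float

class Capture(Protocol):
    def read(self) -> Frame: ...

class RecognizerTracker(Protocol):
    def process(self, frame: Frame) -> Optional[TrackingEvent]: ...

class SpatialMapper(Protocol):
    def to_scene_transform(self, event: TrackingEvent) -> List[float]: ...

class SceneGraphEngine(Protocol):
    def update(self, node_id: str, transform: List[float]) -> None: ...

class AVHRenderer(Protocol):
    def render(self) -> None: ...

def mar_engine_tick(capture: Capture,
                    tracker: RecognizerTracker,
                    mapper: SpatialMapper,
                    scene: SceneGraphEngine,
                    renderer: AVHRenderer) -> None:
    """One iteration of the MAR engine loop: capture -> track -> map -> render."""
    frame = capture.read()
    event = tracker.process(frame)
    if event is not None:
        scene.update(event.target_id, mapper.to_scene_transform(event))
    renderer.render()
```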

MAR Reference Model: component-based classification system (Component / Dimension / Types).

Real World AVH Capture (computes the context):
1. Modality: visual; auditory; electromagnetic waves (e.g. GPS); temperature
2. Source type: live; pre-captured

Real World AVH Capture (contributes to the composition):
1. Modality: visual; auditory; haptic properties; other
2. Form of visual modality: still image; 2D video; 3D video (video + depth); 3D mesh; other
3. Source type: live; pre-captured

Recognizer:
1. Form of target signal: image patch; 3D primitives; 3D model; Earth-reference coordinates; none
2. Form of the output event: recognized or not; additional data (type, timestamp, recognition confidence level, other attributes)
3. Execution place: local; remote

Tracker:
1. Form of target signal: image patch; 3D primitives; 3D model; Earth-reference coordinates; none
2. Form of the output event: spatial (2D, 3D, 6D, ...); aural (intensity, pitch, ...); haptic (force, direction, ...)
3. Execution place: local; remote

Actuator:
1. Modality: motion; temperature; lighting; object shapes; other
2. Execution place: local; remote

Scene Graph Engine:
1. Space & time: 2D + t; 3D + t
2. User interactivity: yes; no
3. Execution place: local; remote; hybrid
4. Number of simultaneous users: single-user; multi-user

AVH Renderer:
1. Modality: visual; aural; haptics; other
2. Execution place: local; remote; hybrid

Visual Display:
1. Presentation: optical see-through; video see-through; projection
2. Mobility: fixed; mobile; controlled
3. Number of channels: 2D (mono); 3D stereoscopic; 3D holographic

Aural Display:
1. Number of channels: mono; spatial
2. Acoustic space coverage: headphones; speakers

Haptics Display:
1. Type: vibration; pressure; temperature; other physical properties
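
As an illustration of how this classification can be used, the sketch below (Python; all enum and class names are my own, not part of the standard) encodes a few of the dimensions so that a concrete component configuration can be described and compared programmatically.

```python
# Partial encoding of the MAR component classification (illustrative only).
from dataclasses import dataclass
from enum import Enum, auto

class ExecutionPlace(Enum):
    LOCAL = auto()
    REMOTE = auto()
    HYBRID = auto()

class TargetSignal(Enum):          # Recognizer / Tracker: form of target signal
    IMAGE_PATCH = auto()
    PRIMITIVES_3D = auto()
    MODEL_3D = auto()
    EARTH_REFERENCE_COORDINATES = auto()
    NONE = auto()

class VisualPresentation(Enum):    # Visual Display: presentation
    OPTICAL_SEE_THROUGH = auto()
    VIDEO_SEE_THROUGH = auto()
    PROJECTION = auto()

@dataclass
class TrackerProfile:
    target_signal: TargetSignal
    execution_place: ExecutionPlace

@dataclass
class VisualDisplayProfile:
    presentation: VisualPresentation
    mobile: bool
    channels: str                  # "2D (mono)", "3D stereoscopic", "3D holographic"

# Example: a mobile video see-through browser with local image-patch tracking.
profile = (TrackerProfile(TargetSignal.IMAGE_PATCH, ExecutionPlace.LOCAL),
           VisualDisplayProfile(VisualPresentation.VIDEO_SEE_THROUGH,
                                mobile=True, channels="3D stereoscopic"))
print(profile)
```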

Terminology

Terminology: definition of a set of terms.

Use cases

MAR Reference Model, Local vs Remote. Modeling of 6 state-of-the-art AR use cases:
1. Real-time, local detection, no registration
2. Real-time, local detection, local registration
3. Real-time, remote detection, no registration
4. Real-time, remote detection, remote registration
5. Real-time, remote detection, local registration
6. Real-time, remote detection, registration and augmentation, local presentation
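
The six configurations vary mainly in where detection and registration run; the following sketch (Python, with names of my own choosing) makes those axes explicit.

```python
# The local/remote axes behind the six use cases (illustrative).
from dataclasses import dataclass
from enum import Enum

class Place(Enum):
    LOCAL = "local"
    REMOTE = "remote"
    NONE = "none"        # step not performed

@dataclass
class UseCase:
    detection: Place
    registration: Place
    presentation: Place = Place.LOCAL   # presentation happens on the device in all six cases

USE_CASES = {
    1: UseCase(Place.LOCAL,  Place.NONE),
    2: UseCase(Place.LOCAL,  Place.LOCAL),
    3: UseCase(Place.REMOTE, Place.NONE),
    4: UseCase(Place.REMOTE, Place.REMOTE),
    5: UseCase(Place.REMOTE, Place.LOCAL),
    6: UseCase(Place.REMOTE, Place.REMOTE),   # plus remote augmentation/composition
}
```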

MAR Reference Model, Points of Interest. Modeling of 2 AR use cases using Points of Interest (POIs):
1. Content-embedded POIs
2. Server-available POIs

MAR Reference Model, 2D vs 3D video. Modeling of 4 AR use cases using 3D video:
1. Real-time, local depth estimation, condition-based augmentation
2. Real-time, local depth estimation, model-based augmentation
3. Real-time, remote depth estimation, condition-based augmentation
4. Real-time, remote depth estimation, model-based augmentation

MAR Reference Model, Stereo vs 3D audio. Modeling of 2 AR use cases using 3D audio:
1. Real-time, spatial audio based on intensity
2. Real-time, 3D audio based on HRTFs (Head-Related Transfer Functions)
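
To illustrate the distinction: intensity-based spatial audio can be approximated by amplitude panning across the output channels, while 3D audio convolves the source with a measured HRTF pair. Below is a minimal numpy sketch under those assumptions; the HRTF path only indicates the operation, since it requires measured head-related impulse responses that are not provided here.

```python
import numpy as np

def intensity_pan(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Constant-power stereo panning: azimuth -90 (left) .. +90 (right) degrees."""
    theta = np.clip(azimuth_deg, -90.0, 90.0) / 90.0 * (np.pi / 4) + np.pi / 4
    left, right = np.cos(theta), np.sin(theta)
    return np.stack([mono * left, mono * right], axis=1)

def hrtf_render(mono: np.ndarray, hrir_left: np.ndarray, hrir_right: np.ndarray) -> np.ndarray:
    """3D audio via HRTF: convolve the source with head-related impulse responses
    (both HRIRs assumed to have the same length)."""
    return np.stack([np.convolve(mono, hrir_left), np.convolve(mono, hrir_right)], axis=1)

# Example: a 440 Hz tone panned 45 degrees to the right.
t = np.linspace(0, 1.0, 44100, endpoint=False)
stereo = intensity_pan(np.sin(2 * np.pi * 440 * t), azimuth_deg=45.0)
```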

MAR Reference Model use cases: Local vs Remote; Points of Interest; 3D video; 3D audio.

MAR Reference Model, use case 1: Real-time, local detection, no registration (diagram). The Content Designer provides target images / sets of descriptors and the scene to the MAR Browser on the MAR device; the camera supplies real-world capture as camera frames to a local Detection Library, which returns an ID mask to the browser.

MAR Reference Model, use case 2: Real-time, local detection, local registration (diagram). The Content Designer provides target images / sets of descriptors, the scene and the augmentation media to the AR Browser on the mobile device; the camera supplies real-world capture as camera frames to a local Detection & Tracking Library, which returns an ID mask plus a transformation matrix (TM) to the browser.
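
A sketch of this local detection + registration loop, using OpenCV feature matching and homography estimation as a stand-in for the slide's Detection & Tracking Library; the reference model does not mandate any particular algorithm, and the target image path and thresholds below are illustrative.

```python
# Local detection + local registration: match a target image in a camera frame
# and estimate a homography as the transformation matrix (TM).
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

target = cv2.imread("target.png", cv2.IMREAD_GRAYSCALE)      # from the Content Designer
kp_t, des_t = orb.detectAndCompute(target, None)
if des_t is None:
    raise SystemExit("no features found in the target image")

cap = cv2.VideoCapture(0)                                     # real-world capture
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    kp_f, des_f = orb.detectAndCompute(gray, None)
    if des_f is not None:
        matches = matcher.match(des_t, des_f)
        if len(matches) >= 10:                                # "ID mask": target detected
            src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
            dst = np.float32([kp_f[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
            tm, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # TM for local registration
            # The AR browser would now place the augmentation media using tm.
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) == 27:                                  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```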

MAR Reference Model, use case 3: Real-time, remote detection, no registration (diagram). The Content Designer provides the Processing Server URL and the scene to the AR Browser on the mobile device, and target images / sets of descriptors (+ IDs) to the Processing Server; the browser streams video / timed images captured by the camera to the server's Detection Library, which returns a timestamp plus an ID mask.

MAR Reference Model, use case 4: Real-time, remote detection, remote registration (diagram). The Content Designer provides the Processing Server URL and the scene to the AR Browser on the mobile device, and target images / sets of descriptors (+ IDs) to the Processing Server; the browser streams video / timed images captured by the camera to the server's Detection & Tracking Library, which returns a timestamp plus an ID mask and a transformation matrix (TM), along with the augmentation media.
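
A sketch of the client side of this remote flow, assuming a hypothetical HTTP endpoint on the Processing Server that accepts a JPEG frame and returns the timestamp, ID mask and transformation matrix as JSON; the reference model does not define a wire protocol, so the URL and field names below are invented.

```python
# Remote detection + registration: the mobile AR browser streams frames to a
# processing server and receives an ID mask plus transformation matrix (TM).
import time
import cv2
import requests

SERVER_URL = "https://example.org/mar/detect"   # hypothetical processing-server URL

def process_frame_remotely(frame) -> dict:
    ok, jpeg = cv2.imencode(".jpg", frame)
    if not ok:
        raise RuntimeError("frame encoding failed")
    response = requests.post(
        SERVER_URL,
        files={"frame": ("frame.jpg", jpeg.tobytes(), "image/jpeg")},
        data={"timestamp": str(time.time())},
        timeout=2.0,
    )
    response.raise_for_status()
    # Expected (hypothetical) payload: {"timestamp": ..., "id_mask": [...], "tm": [[...]]}
    return response.json()

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    result = process_frame_remotely(frame)
    print(result.get("id_mask"), result.get("tm"))
cap.release()
```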

MAR Reference Model, use case 5: Real-time, remote detection, local registration (diagram). The Content Designer provides the Processing Server URL and the scene to the AR Browser on the mobile device; the browser streams video / timed images captured by the camera to the Processing Server, whose Detection Library, backed by a large image database, returns an initial region plus the augmentation media; a local Region Tracking Library then tracks that region on the device (in the diagram, the rectangle is the target image to be detected locally).

MAR Reference Model, use case 6: Real-time, remote detection and registration, local presentation (diagram). The Content Designer provides the Processing Server URL, target images / sets of descriptors and the scene; the browser streams video / timed images captured by the camera to the Processing Server, whose Detection & Tracking Library composes the augmentation media with the video; the composed/augmented stream is sent back to the AR Browser for presentation (Composed Stream = the Processing Server composes the video and the augmentation media and sends back the augmented stream).

MAR Reference Model use cases: Local vs Remote; Points of Interest; 3D video; 3D audio.

MAR Reference Model, POI use case 1: Content-embedded POIs (diagram). The Content Designer provides the POIs, the scene and the augmentation media to the AR Browser on the mobile device; sensors and the camera provide real-world capture; a Registration component, optionally using a map service, relates scene and world coordinates and produces a transformation matrix (TM).

MAR Reference Model, POI use case 2: Server-available POIs (diagram). The Content Designer provides the POI server address and the scene; the AR Browser retrieves the POIs and augmentation parameters from a POI + content server; sensors and the camera provide real-world capture; a Registration component, optionally using a map service, relates scene and world coordinates and produces a transformation matrix (TM).
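
A sketch of the browser side of this POI case: fetch POIs from a hypothetical POI + content server and convert each POI's latitude/longitude into a local east/north offset relative to the device position using an equirectangular approximation (adequate over short distances; the exact registration math and map service are not prescribed by the reference model).

```python
# Server-available POIs: fetch POIs and place them relative to the device.
import math
import requests

POI_SERVER = "https://example.org/mar/pois"        # hypothetical POI + content server

EARTH_RADIUS_M = 6_371_000.0

def enu_offset(device_lat, device_lon, poi_lat, poi_lon):
    """East/north offset in metres (equirectangular approximation, short ranges)."""
    d_lat = math.radians(poi_lat - device_lat)
    d_lon = math.radians(poi_lon - device_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(device_lat))
    north = EARTH_RADIUS_M * d_lat
    return east, north

def fetch_pois(lat, lon, radius_m=200):
    resp = requests.get(POI_SERVER, params={"lat": lat, "lon": lon, "radius": radius_m},
                        timeout=2.0)
    resp.raise_for_status()
    return resp.json()                              # hypothetical: [{"name", "lat", "lon"}, ...]

device_lat, device_lon = 48.624, 2.443              # e.g. from the device's GPS sensor
for poi in fetch_pois(device_lat, device_lon):
    east, north = enu_offset(device_lat, device_lon, poi["lat"], poi["lon"])
    # The AR browser would build the scene/world transformation (TM) from these offsets
    # plus the device orientation reported by its sensors.
    print(poi["name"], round(east, 1), round(north, 1))
```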

MAR Reference Model use cases: Local vs Remote; Points of Interest; 3D video; 3D audio.

MAR Reference Model, 3D video use case 1: Real-time, local depth estimation, condition-based augmentation (diagram). The Content Designer provides the condition, the scale and the AR scene; the left and right cameras provide stereoscopic images (+ camera parameters) as real-world capture; local depth estimation produces image + depth, and a condition detector evaluates the condition to drive the augmentation in the AR Browser.
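
A sketch of local depth estimation with a simple condition detector, using OpenCV block matching for the disparity map; the depth method, the condition and the threshold are placeholders chosen for illustration, as are the image file names.

```python
# Local depth estimation from a stereo pair plus a condition-based trigger.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # camera left (placeholder path)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # camera right (placeholder path)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0   # fixed-point -> float

# Example condition from the Content Designer: "augment when a nearby surface
# (large disparity) covers more than 10% of the image".
near_fraction = np.mean(disparity > 32.0)
if near_fraction > 0.10:
    # The AR browser would now insert the AR scene, scaled as specified.
    print("condition met: trigger augmentation, near fraction =", round(float(near_fraction), 3))
```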

MAR Reference Model, 3D video use case 2: Real-time, local depth estimation, model-based augmentation (diagram). The Content Designer provides a 3D approximation of the real world and the AR scene; the left and right cameras provide stereoscopic images (+ camera parameters); a local Depth Estimation + Detection Library computes the transformation matrix of the camera in the real world for the AR Browser.

MAR Reference Model, 3D video use case 3: Real-time, remote depth estimation, condition-based augmentation (diagram). The Content Designer provides the condition, the orientation + scale, the Processing Server URL and the AR scene; the left and right cameras provide stereoscopic images (+ camera parameters), which are sent to the Processing Server's Depth Estimation + Detection Library; the server returns the depth (+ transformation matrix), and a local condition detector drives the augmentation in the AR Browser.

MAR Reference Model, 3D video use case 4: Real-time, remote depth estimation, model-based augmentation (diagram). The Content Designer provides a 3D approximation of the real world, the Processing Server URL and the AR scene; the left and right cameras provide stereoscopic images (+ camera parameters), which are sent to the Processing Server's Depth Estimation + Detection Library; the server returns the transformation matrix of the camera in the real world to the AR Browser.

Get involved in the MAR Reference Model:
1. Stakeholders and participants. The MAR Reference Model is intended to become an ISO standard, driven by SC24/WG9 and SC29/WG11 with contributions from Web3D, ARS and OGC, and open to all interested in developing an open and free standard.
2. ISO intellectual property rights policy. The MAR Reference Model will be published by ISO under the royalty-free policy.
3. How to get involved. Participate in meetings of any standards organization involved (ISO, Web3D, OGC); contribute directly at http://wg11.sc29.org/trac/augmentedreality
4. Contact information. Marius Preda (marius.preda@it-sudparis.eu), Gerry Kim (gjkim@korea.ac.kr)

What is Extensible 3D (X3D)? X3D is a royalty-free, open-standard file format that communicates animated 3D scenes using XML, with a run-time architecture for consistent user interaction. It is an ISO-ratified standard for storage, retrieval and playback of real-time graphics content, enabling real-time communication of 3D data across applications and serving as an archival publishing format for the Web. X3D provides a rich set of componentized features for engineering and scientific visualization, CAD and architecture, medical visualization, training and simulation, multimedia, entertainment, education, and more.

X3D AR. X3D version 4.0 will support the MAR Reference Model and HTML5/DOM/X3DOM; much of that work is already complete. X3D AR working group co-chairs (ar_chairs@web3d.org): Gun Lee, University of Canterbury, New Zealand; Timo Engelke, Fraunhofer. Speaker contact and X3D working group co-chair: Don Brutzman, Naval Postgraduate School, brutzman@nps.edu, cell +1.831.402.4809