MRT: Mixed-Reality Tabletop


MRT: Mixed-Reality Tabletop
Students: Dan Bekins, Jonathan Deutsch, Matthew Garrett, Scott Yost
PIs: Daniel Aliaga, Dongyan Xu
Department of Computer Sciences, Purdue University
August 2004

Goals
- Create a common locus for virtual interaction without having to shift attention between input and display devices
- Compose and synchronize mixed-reality video and audio for local and distant participants
- Create a low-cost, scalable system that integrates multiple data streams over a uniform distributed platform

Motivation
- Immersive learning in the year 2020: "There is a power in virtual interaction" (Rita R. Colwell)
- Going beyond the current-generation whiteboard:
  - Provide a natural focus of attention: lab table, desk, counter
  - Support rich and intuitive interactions among distributed users
- Adding virtual and real objects to the equation:
  - Mix real and virtual objects in the same focus of attention
  - Create a virtual venue and context for interactions
- Wider deployment than full-fledged VR systems:
  - Lower cost
  - Fewer infrastructure requirements
  - Easier to develop, install, and operate

Mixed-Reality Tabletop (MRT)
- Create stations containing a tabletop, camera, and projector to provide intuitive, device-free interaction
- Support both virtual and real objects on the same tabletop
- Connect stations by transporting multimedia data over the network for composition and display on remote stations
- Provide a software toolkit for fast application development

Related Work
- Whiteboards
- HMD-based VR systems (UNC-CH, Feiner at Columbia)
- The Workbench (Barco, 3rd Tech)
- Tangible user interfaces (MIT, UVA)
- Emancipated Pixels (SIGGRAPH 99)
- Shader Lamps (Raskar at MERL)
- Everywhere Displays (IBM)

Example MRT Applications
- (Slide of example application images)

Presentation Outline
- System Overview
- Key Components
- Applications: Interactive Physics
- Conclusions

MRT Station
- Projector and camera
- PC workstation
- Tabletop

MRT Station Pipeline
- The stations are interconnected by a programmable pipeline for composing real and virtual imagery over a network

MRT Software-Only Station
- PC only
- Mouse movements are mapped into the MRT environment

Camera-Projector Synchronization
- Synchronize the camera and projector to prevent an infinite-mirror effect

Camera-Projector Synchronization
- Frame 1: camera triggered, black image projected
- Frame 2: RGB image projected
- Frame 3: RGB image projected, and so on
- (Diagram labels: RGB from the video card to the projector; V-sync split from the VGA signal and routed to both camera and projector; HV-sync bypasses the black box)

Calibration: Camera
- A snapshot is taken of a rectilinear grid on the tabletop
- Known points on the grid are corresponded to their pixel locations in the snapshot
- The point correspondences are used to approximate the camera warp [Tsai87]

Calibration: Projector
- A rectilinear grid is projected onto the tabletop and recorded by the camera
- The recording is transformed by the camera warp
- Points on the grid are corresponded to their pixel locations in the warped camera image

Calibration
- Perspective and lens distortion cause the raw camera and projector images to be misaligned with the tabletop and with each other
- Determine a mapping to warp from the camera's coordinate system to the tabletop's coordinate system
- Tabletop overhead view: the visible camera area (green) and projector area (red) are aligned with the tabletop (white) to form a rectilinear grid (yellow)

User Interface
- Provide an intuitive graphical user interface with no keyboard or mouse interaction
- Support object tracking and recognition
- Adopt the same interface for PC-only mode

Tracking Objects
- Objects are distinguished from the white table background using an intensity threshold
- Foreground regions in the interior of the table are considered objects
- Foreground regions touching the edge of the table are considered hands or pointers
- Objects are tracked from frame to frame by considering attributes such as pixel area and average position
- Mouse-press events are simulated by the opening and closing of the hand

Tracking Objects (attributes)
- The following attributes are determined for each object:
  - Object center: average pixel location
  - Object area: pixel count
  - Object border: outline of the pixel region
- The object border geometry is simplified to a small set of edges, based on an error threshold
- Moving objects are tracked based on attribute similarities between frames

Tracking Hands
- Hand regions are thinned to produce a single-pixel-thick skeleton
- A graph is created to describe the skeleton's connectivity
- A hand's hotspot is placed at the endpoint farthest from the image border
- A skeleton with fingers is an open hand (mouse up)
- A skeleton with no fingers is a closed hand (mouse down)

API Framework
- Provide basic controls such as buttons, numeric selectors, and panels
- Use C++ inheritance to create custom controls from a base control class
- Provide programmable event control: networking, mouse click, move, drag-and-drop, object tracking
- Render graphics using DirectX/OpenGL

Application #1: Interactive Classroom
- Uses an instructor/student model: one instructor and multiple students
- Designed for use with students from grade 6 and up
- The instructor can use the environment for:
  - Demonstrations and labs (e.g., biology dissections)
  - Show and tell (e.g., describing parts of a circuit board)

Instructor and Student Environments
- Instructor environment includes:
  - Programmable labels
  - Extendable list of students
  - Composable multiple-choice quizzes
  - Movable button panels
- Student environment includes:
  - Movable labels
  - Ask-question and submit-response buttons
  - Viewable user list
  - Movable button panels

Question Button
- Allows a student to notify the instructor that they have a question (like raising a hand)
- Once pressed, the question button is colored brown
- The button turns green when the student's table is considered "live," i.e., after the instructor recognizes the student's question or calls on that student
- When the table is live, the student is allowed to move labels
- The question button returns to its original color when the instructor deselects the student

Programmable Labels
- Label text is loaded at run time
- The instructor freely moves labels
- The instructor calls a specific student to move a label
- The instructor may correct the student and move a label to its proper location

Quizzes
- Instructor:
  - Presses the Quiz button
  - Presses up and down to select how many questions are required
  - Moves the quiz labels to their proper locations
  - Each student's selection appears beside their user button after the student presses Submit
  - Clears and is ready for another quiz
- Student:
  - Makes a selection by clicking on the appropriate quiz label
  - Presses Submit
  - Students cannot move the quiz labels; they can only select them and submit answers to the instructor

User Lists
- Students are added to the list at runtime
- Student buttons are colored yellow when the student has pressed their question button
- Both instructor and students view the user list
- The instructor list is interactive: a student is called upon by pressing their button, which then turns green
- The student list is non-interactive: a student can only view the list

Interactive Classroom
- (Screenshot slide of the classroom application)

Application #2: Interactive Physics
- Allow students to interactively experiment with physics concepts in mixed reality
- Allow remote tables to interact in a common physical simulation environment
- Take advantage of object tracking to model real physical characteristics
- Display interactive labels such as vector arrows

Interactive Physics: Orbital Motion
- Students learn about 2D orbital motion and Newton's law of gravity: F = ma = G*M0*M1 / d^2
- Students and the teacher set the mass of an object placed on their respective tables
- The teacher sets the scale of the universe
- The student sets the initial velocity vector for the orbiting object

More Physics Tutorials
- Projectile motion: students attempt to hit targets on other tables by solving projectile-motion equations
- Rotational motion: students experiment with the effects of applying force to various points on a real object; the system simulates the 2D center of mass and rotational inertia
- Collisions: objects from various tables collide; students can experiment with the effects of mass and velocity
- Fluid dynamics: flow lines are rendered to show fluid motion around objects placed on the tabletop

In Conclusion
- MRT creates a common tabletop for interaction among human users and objects
- MRT composes and synchronizes virtual and real objects for shared virtual venues involving local and remote users
- MRT demonstrates a low-cost, scalable system that integrates multiple data streams over a uniform distributed platform

MRT Configuration and Performance
- Station specs (total cost ~$4000):
  - Pentium 4 @ 3.2 GHz, 512 MB RAM
  - 100 Mbit Ethernet
  - 640x480 camera triggered at 20 FPS
  - 1024x768 DLP projector at 60 FPS
- Per-frame processing:
  - Video capture and warp: ~15 ms
  - Object tracking: 1 to 10 ms (depending on object count)
  - Network-streamed video: ~7 ms
- Overall performance: 20 FPS, limited by projector synchronization

Synchronization Drift
- There is a small delay (33 ms) between the projector receiving a signal and the actual display, and it drifts slowly over time
- Fix: configure the camera to delay after receiving the trigger signal
- The shutter delay is bits 16-31 of camera register 1108h
- The register is set via the Register 1108 Changer, which provides a graphical slider for setting the camera delay

Future Work
- Provide richer virtual interactions and scenario creation (e.g., urban planning, emergency-response training)
- Use multiple projectors and/or cameras to produce approximate 3D renderings
- Extend to more pervasive display and capture surfaces (a "Mixed Reality Room")
- Enhance the user's perception by improving camera/projector synchronization (e.g., DLP synchronization, projecting non-black images)

Acknowledgments
- Microsoft Research, Learning Sciences and Technology
- Computer Science Department, Purdue University
- Oliver Colic

Thank you!
http://www.cs.purdue.edu/~aliaga/mrt.htm