Interactive Tables ~ Avishek Anand. Supervised by: Michael Kipp. Chair: Vitaly Friedman


Tables of the Past

Tables of the Future: metaDESK, Dialog Table, Lazy Susan, Luminous Table, Drift Table, Habitat Message Table, Reactive Table, PingPongPlus, Entertaible, Tangible Viewpoints, Microsoft Surface

Today's Focus
- STARS: A Multimodal Interactive Framework for Pervasive Gaming Applications
- Multimodal Multiplayer Tabletop Gaming
- Interaction Techniques for Musical Performance on a Tabletop Interface

STARS: A Multimodal Interactive Framework for Pervasive Gaming Applications

Goals of the System: A platform for developing computer-augmented board games with novel hardware. Unifying traditional and computer tabletop gaming. Computing without compromising human-centered interaction.

STARS Components Setup: The Game Table. InteracTable: a plasma screen, so no back projection is needed. An overhead camera for detecting hand gestures and additional tracking of playing objects. An integrated RFID antenna for pawn/token detection.

STARS Components Setup: Vertical Displays. A public display for game status and game logic. PDA and Audio Devices: administering private data of game components, private interaction between players, and verbal commands to the game.

STARS Setup (diagram): Public Display, Audio Devices, PDA, Interactive Table.

Pool of Modes: A large number of modes: audio, touch, PDA, gestures. A distinction is made between input and output modes. Complex interactions combine multiple input modes with parallel output modes.

Interaction Framework: The application level translates multimodal interactions into Interaction Requests and sends them to the Interaction Manager. Each Interaction Request carries hints that the Interaction Manager uses to map it to the appropriate devices.

Interaction Manager: Maps Interaction Requests from the higher level of the game logic to the lower level of the so-called Interface Services implemented for each device. Uses a rule-based system to determine how to map an Interaction Request to one or more input or output modes.

Interface Selection by Interaction Manager
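The slides describe this rule-based mapping only at a high level. A minimal sketch in Python shows the idea: game logic emits mode-independent Interaction Requests carrying hints, and rules route each request to concrete Interface Services. All class names, rule predicates, and service names here are hypothetical illustrations, not the actual STARS implementation.

```python
# Hypothetical sketch of a rule-based Interaction Manager in the spirit of
# STARS: the game logic emits mode-independent Interaction Requests with
# hints, and rules map each request to one or more Interface Services.

class InteractionRequest:
    def __init__(self, kind, payload, hints=None):
        self.kind = kind            # e.g. "notify", "prompt"
        self.payload = payload      # the content to deliver
        self.hints = hints or {}    # e.g. {"private": True}

class InteractionManager:
    def __init__(self):
        # Each rule pairs a predicate over a request with target services.
        self.rules = []

    def add_rule(self, predicate, services):
        self.rules.append((predicate, services))

    def dispatch(self, request):
        """Return the services a request is routed to (first matching rule)."""
        for predicate, services in self.rules:
            if predicate(request):
                return services
        return ["public_display"]   # fallback output mode

manager = InteractionManager()
# Private data goes to the player's PDA, spoken alerts to the audio service.
manager.add_rule(lambda r: r.hints.get("private"), ["pda"])
manager.add_rule(lambda r: r.hints.get("spoken"), ["audio"])

req = InteractionRequest("notify", "Your secret card: Ace", {"private": True})
print(manager.dispatch(req))        # -> ['pda']
```

Because the rules live in data rather than code, this style also reflects the takeaway below that program behavior can be tuned at run time without touching application code.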

STARS Software Architecture: Built on the .NET architecture to allow a flexible distribution of components among different computers. Multimodal input processing may be offloaded to different machines for better performance.

Advantages and Takeaways: Social interaction is enhanced as new kinds of interactions surface. A feature-rich gaming experience with augmented game logic and visualization. It is easier for the application developer to formulate Interaction Requests in a mode-independent way. Program behavior can be tuned at run time without touching any code.

Multimodal Multiplayer Tabletop Gaming

Aim of the Paper: To show that multimodal gesture and speech input benefits collaborative interaction over a digital table. The authors designed a multimodal, multiplayer gaming environment that allows players to interact directly atop a digital table via speech and rich whole-hand gestures.

Behavioral Foundations: A summary of prior research on the effectiveness of the following multimodal channels: gestures and hand positions; speech and speaking aloud; the combination of speech and gestures; gaze awareness.

Application of Foundations: Redesign of two multiplayer games, Warcraft III and The Sims.

Use of Hand Gestures: A five-finger grabbing gesture to reach, pick up, move, and place items on the surface. A fist gesture, mimicking a physical stamp, to paste object instances onto the terrain.

Use of Hand Gestures: Selecting six friendly units within a particular region of the screen using a two-handed selection gesture. A one-handed panning gesture, similar to how one moves a paper map on a table.

Use of Hand Gestures: Pointing for item selection. Hand gestures coupled with speech to define other meaningful custom actions.

Use of Meaningful Speech: A player should be able to rapidly issue commands to the game table, and their meaning should be easily understood by other players within the context of the visual landscape and the player's gestures. Mapping speech to low-level actions should be easy, and speech should integrate easily with gestures.
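One common way to integrate speech with gestures is temporal fusion: the spoken command supplies the verb and the pointing gesture supplies the location, paired when they occur close together in time. The sketch below is a hypothetical illustration of that idea in Python; the paper's actual fusion mechanism, function names, and time window are not from the source.

```python
# Hypothetical fusion of spoken commands with pointing gestures: speech
# supplies the action ("build tower"), the gesture supplies the table
# location, and the two are paired when they occur close in time.

FUSION_WINDOW = 1.5  # seconds; an assumed pairing window, not from the paper

def fuse(speech_events, gesture_events):
    """Pair each spoken command with the nearest-in-time pointing gesture."""
    actions = []
    for s_time, command in speech_events:
        candidates = [g for g in gesture_events
                      if abs(g[0] - s_time) <= FUSION_WINDOW]
        if candidates:
            _, location = min(candidates, key=lambda g: abs(g[0] - s_time))
            actions.append((command, location))
    return actions

speech = [(10.2, "build tower")]
gestures = [(9.1, (3, 4)), (10.5, (7, 2))]   # (time, table coordinates)
print(fuse(speech, gestures))                # -> [('build tower', (7, 2))]
```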

Feedback and Gaze: Feedback and feedthrough are important in reflecting how gestures are translated, e.g., a highlighted area indicated by arrows, or audio feedback such as "Yes, Master." Awareness and gaze are instrumental in understanding modes, actions, and consequences, e.g., understanding from feedback what another player intends.

Conclusion and Advantages: This paper contributes multimodal co-located tabletop interaction as a new genre of home console gaming. Gaze and awareness are greatly improved, providing rich insight into the game. Single-user games can be repurposed along similar lines for different game genres.

Interaction Techniques for Musical Performance on a Tabletop Interface

Goals of the System (Audiopad): Tracking object positions and translating them into meaningful commands for the music synthesizer. Defining a vocabulary for Tangible User Interfaces (TUIs) based on the spatial rearrangement of objects on the interface. Promoting visual participation of onlookers during performances.

Interaction Techniques Hardware: Audiopad is a tabletop RF tracking system with video projection. The tracked objects are pucks: the selector puck is used to select properties, while the others represent individual tracks.

Hierarchical Item Browsing and Selection: The non-dominant hand holds the puck to be modified, and the dominant hand holds the modifier puck. One can also select items from the tree using one hand, since selection depends only on relative positions.

Floating Menus: When should the menu move? A selection area is defined around the puck within which the menu does not move; when the puck leaves this area, the menu re-centers around it.

Changing Continuous Parameters: The value of the parameter is determined by the distance between the puck and a master puck. A transfer function maps this distance to the value being configured.
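A transfer function of this kind can be sketched concretely. The Python below is a minimal illustration, assuming a clamped linear mapping and an exponential variant for volume-like parameters; the distance ranges and function shapes are illustrative assumptions, not taken from the Audiopad paper.

```python
import math

def linear_transfer(distance, d_min=2.0, d_max=20.0, v_min=0.0, v_max=1.0):
    """Map puck-to-master distance (cm, assumed range) linearly to a value."""
    t = (distance - d_min) / (d_max - d_min)
    t = max(0.0, min(1.0, t))          # clamp to the usable distance range
    return v_min + t * (v_max - v_min)

def exp_transfer(distance, d_min=2.0, d_max=20.0):
    """An exponential shape, often friendlier for volume-like parameters."""
    t = max(0.0, min(1.0, (distance - d_min) / (d_max - d_min)))
    return (math.exp(t) - 1) / (math.e - 1)

print(linear_transfer(11.0))   # -> 0.5
```

Swapping the transfer function changes how sensitive the parameter feels to puck motion without changing any of the tracking code.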

Setting Two-Dimensional Parameters: Effect zones on the table, where the two-dimensional motion of a puck controls two parameters at once. Adjusting pucks relative to one another to effect changes in two-dimensional parameters.
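The effect-zone idea extends the transfer function to two dimensions: a puck's position inside a rectangular zone yields two normalized values. The sketch below is an assumed illustration in Python; the zone geometry and parameter names (e.g. cutoff and resonance) are hypothetical, not from the source.

```python
def zone_parameters(puck_xy, zone_origin, zone_size):
    """Map a puck position inside a rectangular effect zone to two
    normalized parameters in [0, 1] (e.g. filter cutoff and resonance)."""
    px, py = puck_xy
    ox, oy = zone_origin
    w, h = zone_size
    u = max(0.0, min(1.0, (px - ox) / w))   # horizontal axis -> parameter 1
    v = max(0.0, min(1.0, (py - oy) / h))   # vertical axis -> parameter 2
    return u, v

# A 20x20 cm zone anchored at (10, 0) on the table surface.
print(zone_parameters((15.0, 5.0), (10.0, 0.0), (20.0, 20.0)))  # -> (0.25, 0.25)
```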

Conclusion and Advantages: Making the interaction legible to observers. Use of relative mapping, based on the positions of other pucks, for setting continuous parameters. By combining absolute and relative positioning with other TUI interaction techniques, the system can be scaled to other domains as well.

References:
Patten, Recht, Ishii (2006). Interaction Techniques for Musical Performance with Tabletop Tangible Interfaces.
Magerkurth, Memisoglu, Engelke, Streitz (2004). Towards the Next Generation of Tabletop Gaming Experiences.
Magerkurth, Stenzel, Streitz, Neuhold (2003). A Multimodal Interaction Framework for Pervasive Game Applications.
Tse, Greenberg, Shen, Forlines (2006). Multimodal Multiplayer Tabletop Gaming.
http://wikipedia.org/
http://social.cs.uiuc.edu/papers/

Thank You! Questions, suggestions, discussions.