UbiBeam++: Augmenting Interactive Projection with Head-Mounted Displays

Pascal Knierim, Markus Funk, Thomas Kosch
Institute for Visualization and Interactive Systems, University of Stuttgart, Stuttgart, Germany
{firstname.lastname}@vis.uni-stuttgart.de

Anton Fedosov
Università della Svizzera italiana, Faculty of Informatics, Lugano, Switzerland
anton.fedosov@usi.ch

Tamara Müller, Benjamin Schopf, Marc Weise and Albrecht Schmidt
Institute for Visualization and Interactive Systems, University of Stuttgart, Stuttgart, Germany
tamara mueller@outlook.com, benjaminschopf@web.de, marc.weise@yahoo.de, albrecht.schmidt@acm.org

NordiCHI '16, October 23-27, 2016, Gothenburg, Sweden
© 2016 ACM. ISBN 978-1-4503-4763-1/16/10
DOI: http://dx.doi.org/10.1145/2971485.2996747

Abstract
Interactive tabletops and projections have become widely used in schools, museum exhibitions, and conference rooms to teach, to illustrate dynamic artifacts, or to support talks. In such scenarios, all observers, such as pupils and teachers, perceive the same information even though they hold different positions and could benefit from an adapted, personalized view. We developed the UbiBeam++ mixed reality software toolkit to enable the augmentation of an interactive projection surface using optical see-through glasses. Our toolkit supports the simultaneous presentation of private, shared, and public content. Private and shared content is registered in space and presented through a head-mounted display, while public content is presented by a projector. Our toolkit simplifies the development of interactive projections with different visualization levels. In a preliminary study, participants understood the concept of a personalized information space and appreciated the presentation of additional information. Looking forward, our toolkit supports the development and exploration of various scenarios not limited to teaching, presentations, or games.

Author Keywords
Augmented Reality; interactive projected tabletop; toolkit; gaming; tabletop games.

ACM Classification Keywords
H.5.0. [Information Interfaces and Presentation (e.g. HCI)]: General

Introduction
Interactive projections are set up in many places and are often used to provide a simplified representation of information in collaborative scenarios. In lessons, teachers use projectors to explain subjects at school. In museums, interactive projections invite visitors to explore parts of the exhibition or certain topics. Yet another scenario is the interactive tabletop, which can host collaborative and competitive games. In most of these scenarios, each observer perceives the same information within the interactive projection surface. However, it could be beneficial to provide personalized overlays such as 3D models, videos, or other annotations. A teacher, for example, could get additional content regarding a subject to make sure that pupils have understood the context. In a competitive tabletop game, each player could see their own private items, such as cards or tokens, but not the private items of their opponent. This can be enriched by public items, which are visible to all players and provide important global content.

Displaying private information on a shared display has previously been implemented using modified shutter glasses [4, 6]. However, these systems only support a limited number of simultaneous users. Alternatively, a phone or a head-mounted display (HMD) has been used as a private screen [4, 5] to display private content during collaborative work or games. In these works, the private content is presented beside the interactive tabletop, so a focus shift and indirect interaction are necessary.

Figure 1: Reference setup of UbiBeam++: A projector displays public content, while an HMD provides shared and private content.

Inspired by previous work on projector-based augmented reality [1, 2], we developed a toolkit that combines optical see-through glasses with an interactive projection. Thus, we can augment the projection with user-specific information. The combination of a projector with an HMD enables three display spaces: a 2D public, a 2D or 3D private, and a shared display space. In this paper, we present the UbiBeam++ toolkit, which synchronizes an interactive projected public display with an HMD's private stereoscopic display. The UbiBeam++ toolkit allows the exploration of novel interaction and visualization concepts in competitive

scenarios like games as well as collaborative scenarios, which include teaching, gaming, and business applications. As a proof of concept, we implemented a board game with public, shared, and private elements. Furthermore, we conducted a preliminary study to evaluate the user experience of the system.

Figure 2: A user's view onto the game field through the HMD. As the picture is taken through the developer view of the Meta1, there is an offset of the 3D content. This offset is corrected by a user-specific calibration when wearing the HMD.

Figure 3: A picture of the user's view taken through the HMD. The blue rectangle depicts the field of view of the Meta1 HMD. The surrounding content in the periphery is provided by the top-mounted projector.

Concept
Based on previous work, we identified the following goals for a multi-user augmented interactive projection toolkit.

G1. Enable public, shared, and private view spaces: To support collaborative and competitive scenarios, an Augmented Reality (AR) toolkit must provide three dedicated views: (1) Public View: content can be seen by observers who do not wear any HMD. (2) Shared View: content can be seen by all collaborating users wearing an HMD. (3) Private View: content can only be seen by one dedicated user.

G2. Provide 2D and 3D content: As most related approaches are limited to the projection of 2D content, a novel AR toolkit should support placing both 2D and 3D content onto surfaces.

G3. Use off-the-shelf hardware: The last goal for the toolkit is to use commercially available hardware, as we want to provide an easily deployable platform that other developers and researchers can use to build interactive immersive applications.

System
To meet the previously described design goals, we introduce UbiBeam++¹, a toolkit to augment interactive projections. Our setup consists of three main components (G3): a stationary projector, a Microsoft Kinect v1, and a Meta1 head-mounted display, which is equipped with an inertial measurement unit (IMU), an RGB camera, and a depth sensor. An overview of the system is depicted in Figure 1. We use the projector to display 2D and public content, and the HMD to display 3D content (G2). Further, the Kinect v1 depth-sensing camera detects interaction with the content in terms of touch and gesture recognition. In general, the projector and the Kinect v1 could also be replaced by a touch-sensitive display; however, this would eliminate the capability of performing and recognizing mid-air gestures.

To enable private, shared, and public content, we use the HMD and the projector as output devices. Public content (G1.1) is displayed directly by the projector. Shared content (G1.2) is shown on the HMD, but can only be seen by the users who have permission to view it. Private content (G1.3) is shown only on the HMD of the user the content belongs to.

For the projected content, we use a main application written in C#. The graphical output of the application is a maximized window on the projector. The output of this window defines a 2D coordinate system, which is the reference for all positions in the system. Interactive content can be placed at any position in this coordinate system.

¹ The source code of the UbiBeam++ toolkit and the reference implementation are available at https://github.com/hcilab-org/ubibeamplusplus.
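To make this design concrete, the following minimal C# sketch shows what such a content object could look like. The names (ContentItem, PrivacyLevel, OwnerId) are our illustrative assumptions for this paper, not the actual UbiBeam++ API.

// A minimal sketch of the content model described above; all names are
// illustrative assumptions and do not reflect the actual UbiBeam++ API.
public enum PrivacyLevel { Public, Shared, Private }

public class ContentItem
{
    // Position in the 2D coordinate system spanned by the projector window.
    public float X { get; set; }
    public float Y { get; set; }

    public PrivacyLevel Privacy { get; set; }  // who may see this item
    public bool Is3D { get; set; }             // rendered by Unity on the HMD
    public int OwnerId { get; set; }           // only meaningful for private items
}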

In case the content is public, an image representing the content is shown via the projector. If the content is shared or private, the software instead shows a marker at the position of the content. If a developer defines a content object to be displayed in 3D, the content is likewise represented by a marker in the 2D coordinate system; the 3D models are rendered by Unity² and placed at the corresponding location on the projected marker. Note that public 3D content, too, can only be perceived by observers who wear an HMD.

Each HMD is connected to a separate PC, which connects to the main application via WiFi. During initialization, the content and markers are synchronized for each user wearing an HMD. The RGB camera of the Meta1 HMD constantly streams images to the attached PC. Our software processes this stream to detect markers and tracks the position of the HMD in space using the RGB stream and the IMU, which enables correct perspective rendering in Unity. If the user has the permission, recognized markers are overlaid with the corresponding private or shared 2D/3D element.

As we designed our setup to enable users to interact with projected content, a Kinect depth-sensing camera is placed above the table. To create an interactive projection that is capable of detecting multiple touch events, we re-implemented the UbiDisplay toolkit by Hardy and Alexander [3] in C# to enable faster processing of multiple touch events. After a simple one-time calibration, touch events can be mapped to the 2D coordinate system and can therefore be used to interact with both 2D and 3D content of every privacy group. Touch events are forwarded directly to the content, which decides how to process them. Thus, developers can use content as an interactive button, a display, or both.

² Unity - www.unity3d.com (last access: Aug. 10th, 2016)
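As a rough illustration of this per-user filtering, a Unity-side script might overlay detected markers as sketched below. The marker list, the content lookup, and the overlay objects are hypothetical placeholders for the toolkit's tracking and synchronization layer, reusing the ContentItem sketch from above; this is not the actual UbiBeam++ implementation.

using System.Collections.Generic;
using UnityEngine;

// Hypothetical marker type: id plus the 6-DoF pose estimated from the
// HMD's RGB stream and IMU.
public struct DetectedMarker
{
    public int Id;
    public Vector3 Position;
    public Quaternion Rotation;
}

public class MarkerOverlay : MonoBehaviour
{
    public int localUserId;

    // Filled elsewhere by the (assumed) tracking and synchronization layer.
    public List<DetectedMarker> detectedMarkers = new List<DetectedMarker>();
    public Dictionary<int, ContentItem> contentById = new Dictionary<int, ContentItem>();
    public Dictionary<int, GameObject> overlayById = new Dictionary<int, GameObject>();

    void Update()
    {
        foreach (DetectedMarker marker in detectedMarkers)
        {
            ContentItem item = contentById[marker.Id];

            // Private items are visible only to their owner; shared and
            // public items are visible to every HMD wearer.
            bool visible = item.Privacy != PrivacyLevel.Private
                           || item.OwnerId == localUserId;

            GameObject overlay = overlayById[marker.Id];
            overlay.SetActive(visible);
            if (visible)
            {
                // Register the virtual element on top of the projected marker.
                overlay.transform.position = marker.Position;
                overlay.transform.rotation = marker.Rotation;
            }
        }
    }
}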
Figure 4: The game field of the game Scrolls that we use in the reference implementation of UbiBeam++. The card slots are positioned using one fixed marker per player.

Proof of Concept Implementation
To show the capabilities of the UbiBeam++ toolkit, we adapted and implemented the game Scrolls by Mojang. To showcase the public, shared, and private views, slight modifications of the original rules were necessary. Players can hold up to five hand cards, which are either unit cards or spell cards. Unit cards can be placed on the game field, and spell cards can be used to deal damage or to heal units. The game field contains units that can attack along the line they are placed in. Units have health points (HP), attack points (AP), and a cooldown time until the next attack (see Figure 2). Once a unit's cooldown has lapsed, the unit attacks automatically. At the end of each line, a so-called idol has to be destroyed by the opposing player. The goal of the game is to destroy all idols of the opponent.

To apply the introduced privacy concepts, we assigned the aspects of the game to public, shared, and private content. We implemented the game field, consisting of 15 hexagons per player (see Figure 4), as public content. It shows the positions of the units as well as the resources a player has available, and it displays three interactive buttons: one to draw an additional card, one to sacrifice a card for resources, and one to finish the turn. As shared content, we considered the 3D models of the units, which are viewable by both players but not by observers. Lastly, we defined the hand cards, i.e., the spells and units a player has available for playing, as private content. We also display the units' health points, attack points, and cooldown time to their owner as further private information. In our reference implementation, all private content is rendered on an HMD in 3D.
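In terms of the hypothetical content model sketched earlier, this assignment amounts to tagging each game element with a privacy level, roughly as follows (all values are illustrative; a real implementation would additionally register positions and interactive behavior):

// Privacy assignment for the Scrolls adaptation, using the hypothetical
// ContentItem sketch from above (not the real toolkit API).
var gameField = new ContentItem { Privacy = PrivacyLevel.Public,  Is3D = false };
var unitModel = new ContentItem { Privacy = PrivacyLevel.Shared,  Is3D = true  };
var handCard  = new ContentItem { Privacy = PrivacyLevel.Private, Is3D = true, OwnerId = 1 };
var unitStats = new ContentItem { Privacy = PrivacyLevel.Private, Is3D = true, OwnerId = 1 };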

Preliminary Study
We invited eight pairs of students to our preliminary study. Participants had the opportunity to explore the different privacy aspects of the game implemented with UbiBeam++. We gathered qualitative feedback during each session through interviews and open-ended questions, directed towards the comprehension of the different display spaces as well as the overall concept.

The qualitative feedback we collected indicates that participants appreciated the general idea of augmenting interactive projection with HMDs using UbiBeam++. We observed that participants understood the concept of shared, public, and private content very quickly. Participants also noticed the borders of the interactive area of the game easily, as it was limited by the projection. Almost every participant could imagine using UbiBeam++ for educational purposes, such as teaching, simulation, or exploring museums. Some participants also suggested integrating UbiBeam++ into the living room for playing games with family and friends. Since we are using optical see-through glasses, none of the participants complained about motion sickness, which often occurs in video see-through and virtual reality applications.

Our preliminary study also gave us insights into the limitations of the system. The general drawbacks of an HMD also apply to UbiBeam++: participants stated that the HMD is too heavy and that the field of view for displaying 3D content is very limited. Another technical limitation of UbiBeam++ is that observers who do not wear an HMD cannot perceive any 3D content, or private and shared content in general.

Future Work
In future work, we plan to address the mentioned issues. First, we want to enable markerless position estimation for private components by using the entire projected content as a reference. Thereby, players are not distracted by markers, observers can view public content without distraction, and guessing the position of private content is no longer possible. We also plan to adapt the toolkit to other commercially available HMDs, which are lighter and offer a greater field of view. Further, we see value in developing interactive prototypes for learning activities, collaborative work, and serious games with the UbiBeam++ toolkit to explore its capabilities beyond competitive gaming scenarios.

Conclusion
In this poster, we presented the UbiBeam++ toolkit. It combines the strengths of optical see-through augmented reality glasses and interactive tabletop projection. Hereby, a large shared interactive 2D display and a private stereoscopic display for augmenting the projection are available for content visualization. Furthermore, we enable the simultaneous presentation of interactive content with different privacy levels: public content is displayed by a projector and is visible to anyone, while personalized shared or private content is presented on a head-mounted display. We implemented a strategy game to showcase the capabilities of the UbiBeam++ toolkit. Through a preliminary user study, we gained initial insights into the experience of using our toolkit. Using these results, we intend to further develop the framework and investigate the benefits that arise in the collaborative and teaching scenarios suggested during the evaluation. We envision that data exploration and visualization in business or education scenarios could benefit from our UbiBeam++ toolkit. Additionally, by releasing the toolkit and the reference implementation of the game to the community, we believe that UbiBeam++ provides a compelling starting point for developers and researchers to explore the field of augmented projection.

Acknowledgements
We thank all participants for their support during the study. This work was part of the project Be-greifen, supported by the German Federal Ministry of Education and Research, grant no. 16SV7527. The work was also supported by the Swiss National Science Foundation, grant no. 156406 "SHARING21 - Future Digital Sharing Interfaces".

References
[1] Benko, H., Ofek, E., Zheng, F., and Wilson, A. D. FoveAR: Combining an optically see-through near-eye display with projector-based spatial augmented reality. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, UIST '15, ACM (New York, NY, USA, 2015), 129-135.
[2] Gugenheimer, J., Knierim, P., Seifert, J., and Rukzio, E. UbiBeam: An interactive projector-camera system for domestic deployment. In Proceedings of the Ninth ACM International Conference on Interactive Tabletops and Surfaces, ITS '14, ACM (New York, NY, USA, 2014), 305-310.
[3] Hardy, J., and Alexander, J. Toolkit support for interactive projected displays. In Proceedings of the 11th International Conference on Mobile and Ubiquitous Multimedia, MUM '12, ACM (New York, NY, USA, 2012), 42:1-42:10.
[4] Lissermann, R., Huber, J., Steimle, J., and Mühlhäuser, M. Permulin: Collaboration on interactive surfaces with personal in- and output. In CHI '13 Extended Abstracts on Human Factors in Computing Systems, CHI EA '13, ACM (New York, NY, USA, 2013), 1533-1538.
[5] Shirazi, A. S., Döring, T., Parvahan, P., Ahrens, B., and Schmidt, A. Poker surface: Combining a multi-touch table and mobile phones in interactive card games. In Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services, ACM (2009), 73.
[6] Shoemaker, G. B. D., and Inkpen, K. M. Single display privacyware: Augmenting public displays with private information. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '01, ACM (New York, NY, USA, 2001), 522-529.