Pinch-the-Sky Dome: Freehand Multi-Point Interactions with Immersive Omni-Directional Data


Hrvoje Benko, Microsoft Research, One Microsoft Way, Redmond, WA 98052, USA, benko@microsoft.com
Andrew D. Wilson, Microsoft Research, One Microsoft Way, Redmond, WA 98052, USA, awilson@microsoft.com

Abstract
Pinch-the-Sky Dome is a large immersive installation in which several users can simultaneously interact with omni-directional data inside a tilted geodesic dome. The system consists of an omni-directional projector-camera unit at the center of the dome: the projector displays an image spanning the full 360 degrees, while the camera tracks the freehand gestures used to navigate the content. The interactive demos include: 1) exploration of astronomical data provided by World Wide Telescope, 2) 3D social-network graph visualizations, 3) immersive panoramic images, and 4) 360 degree video conferencing. We combine speech commands with freehand pinch gestures to provide a highly immersive and interactive experience to several users inside the dome, each with a very wide field of view.

Keywords
Freehand interaction, omni-directional interface, gestures, dome, curved displays.

Copyright is held by the author/owner(s). CHI 2010, April 10-15, 2010, Atlanta, Georgia, USA. ACM 978-1-60558-930-5/10/04.

ACM Classification Keywords
H.5.2 [Information Interfaces and Presentation (e.g., HCI)]: User Interfaces - input devices and strategies; graphical user interfaces.

General Terms
Design, Human Factors.

Introduction
Pinch-the-Sky Dome is a large immersive installation where several users can interact simultaneously with omni-directional data inside a tilted geodesic dome (Figure 1). The experience is designed to immerse users in the omni-directional visualization and to let them manipulate and interact with the data using freehand gestures in mid-air, without wearing or holding tracking devices. In designing this experience, we focused on ways to let users interact with immersive content beyond arm's reach through simple gestures and without on-body trackers. We also aimed to highlight the increasing availability of omni-directional content (e.g., panoramic imagery, space data, earth-mapping data) and to explore effective ways of visualizing it within an immersive curved display.

Dome Experience
The user enters the dome through an entry gate designed to trap outside light. Inside, the user is immersed in a 360 degree interactive experience. Our 9 ft (2.7 m) dome can comfortably accommodate up to five observers at any given time, who can choose among four different visualizations.

Figure 1. Pinch-the-Sky Dome (the entry gate is omitted in this photo to show the user inside).

First, we project astronomical imagery from World Wide Telescope (http://www.worldwidetelescope.org) in the dome and let users explore the sky and the universe by simply moving their hands above the projector (Figure 2a). As part of the experience, users travel around the Solar System, visit the outskirts of the known universe, and observe the imagery from the Hubble Space Telescope. Second, observers can be virtually transported to remote destinations through high-resolution omni-directional panoramic images, for example the Apollo 12 lunar landing site or the lobby of the Microsoft Research building (Figure 2b). Third, we show a live feed from a 360 degree camera, which supports omni-directional video-conferencing scenarios with remote participants (Figure 2c).

Lastly, observers can explore complex custom-made 3D graph visualizations (Figure 2d), showing the social-network graph of one of the authors, or animations that highlight the immersive nature of the dome.

Implementation
Our 9 ft geodesic dome is constructed of cardboard sheets following a 2V design (http://www.desertdomes.com), with large paper clips holding the sheets together. The dome rests on a 30 degree tilted base built of standard construction lumber, and we wrapped the base area under the dome with dark fabric to block outside light. The cardboard dome surrounds the projector and serves as the large hemispherical projection surface. The various elements of this construction can be seen in Figure 1.

Figure 2. A collection of four different applications shown in the dome: a) World Wide Telescope (e.g., Solar System visualization), b) panoramic imagery (e.g., the Apollo 17 lunar landing site), c) 360 degree video-conferencing application, and d) 3D visualization of a social-network graph. Note: images are circularly distorted for dome projection.

Figure 3. The projector-camera unit with a wide-angle lens and the infrared illumination ring around it.

In the middle of the dome, we placed a custom-made omni-directional projector-camera unit (Figure 3). This unit is based on the Magic Planet display from Global Imagination, Inc. (http://www.globalimagination.com), which we previously demonstrated in our Sphere project [2].
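As the note under Figure 2 indicates, content for this display must be circularly distorted before projection. Below is a minimal sketch of one such warp, assuming an equidistant fisheye model (the paper does not specify its exact projection math) in which the image center corresponds to the dome zenith and the radius grows linearly with the angle away from it:

```python
import math

# Minimal sketch of an equidistant ("circularly distorted") dome mapping.
# Assumptions, not taken from the paper: the projector fills a circular image
# 1050 px in diameter, the image center corresponds to the dome zenith, and
# the image rim corresponds to the dome rim (90 degrees from the zenith).
IMAGE_DIAMETER = 1050
CX = CY = IMAGE_DIAMETER / 2.0  # center of the circular image, in pixels
R_MAX = IMAGE_DIAMETER / 2.0    # radius at the dome rim, in pixels

def dome_to_pixel(azimuth_deg: float, elevation_deg: float) -> tuple[float, float]:
    """Map a viewing direction in the dome to projector pixel coordinates.

    elevation_deg: 90 at the zenith (image center), 0 at the dome rim.
    azimuth_deg:   angle around the dome axis, measured in image coordinates.
    """
    theta = 90.0 - elevation_deg   # angle away from the zenith
    r = R_MAX * theta / 90.0       # equidistant fisheye: r is linear in theta
    az = math.radians(azimuth_deg)
    return CX + r * math.cos(az), CY + r * math.sin(az)

# Example: a point halfway up the dome lands halfway out from the center.
print(dome_to_pixel(0.0, 45.0))  # -> (787.5, 525.0)
```

Rendering then amounts to evaluating this mapping (or its inverse) per pixel when warping flat source imagery into the circular projector image.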

The Magic Planet projector base uses a high-resolution DLP projector (Projection Design F20 sx+, 1400x1050 pixels) and a custom wide-angle lens to project imagery from the bottom of the device onto a spherical surface. In this project, we removed the spherical display surface of the Magic Planet and instead projected onto the entire hemisphere of the dome. The quality of the projected image depends on the size of the dome; on the brightness, contrast, and resolution of the projector; and on the amount of ambient light that enters the dome. Our projector displays a circular image 1050 pixels in diameter, or approximately 866,000 pixels (π · 525² ≈ 866,000).

To enable freehand interactions in mid-air above the projector, we reused the optical axis of the projection and added: an infrared (IR) sensitive camera, an IR-pass filter for the camera, an IR-cut filter for the projector, an IR illumination ring, and a cold mirror. The physical layout of these components is illustrated in Figure 4.

Gesture sensing is performed by an IR camera (Firefly MV by Point Grey Research, http://www.ptgrey.com) that images the entire area of the projected display. To ensure that sensing is not affected by the currently projected visuals, we perform touch-sensing in the IR portion of the light spectrum, while the projected display contains only light in the visible spectrum. This light-spectrum-separation approach has previously been demonstrated in many camera-based sensing prototypes. To provide the IR light used in sensing, our setup requires a separate IR illumination source (i.e., the illumination ring around the lens).

Figure 4. Schematic of the omni-directional projector-camera unit: the illumination ring (IR LEDs), wide-angle lens, cold mirror, IR-pass filter, IR-cut filter, IR camera, and projector. The detail image shows the wide-angle lens and the IR illumination ring around it.
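Because the IR-pass filter hides the projected visible-light image from the camera while the illumination ring lights hands from below, the camera sees bright hands on a dark background regardless of what is displayed. A minimal sketch of that sensing loop using OpenCV follows; the camera index and threshold value are illustrative assumptions, since the paper does not describe its software pipeline (and the Firefly MV would normally be accessed through the vendor's SDK):

```python
import cv2

# Minimal sketch of the light-spectrum-separation sensing loop. The camera
# index (0) and threshold (96) are illustrative, not the paper's values.
cap = cv2.VideoCapture(0)  # stand-in for the IR camera feed

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # With the IR-pass filter in place, the projected image is invisible to
    # the camera; IR-lit hands appear bright, so a fixed threshold yields a
    # clean binary silhouette image (cf. Figure 5).
    _, binary = cv2.threshold(gray, 96, 255, cv2.THRESH_BINARY)
    cv2.imshow("binarized IR image", binary)
    if cv2.waitKey(1) == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```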

User Interactions
The main contribution of this work is enabling the user to interact with omni-directional data in the dome using simple freehand gestures above the projector, without special gloves or tracking devices. We acknowledge that for many scenarios there are important benefits to using tracked physical devices, for example reduced hand movement and fatigue, the availability of mode-switching buttons, and haptic feedback. On the other hand, tracked devices can be cumbersome, may be prone to getting lost, require batteries, and so on. Furthermore, in multi-user collaborative scenarios, the need to hand off a tracked device in order to interact with the system can impede the flexibility and fluidity of the interaction.

One crucial issue for freehand gestural interaction is the problem of gesture delimiters: how can the system know when a movement is meant as a particular gesture or action, rather than simply natural human movement through space? For surface interactions, touch contacts provide straightforward delimiters: when users touch the surface they are engaged, and lift-off usually signals the end of the action. In mid-air, however, there is no such obvious way to disengage, since the user is always immersed in the 3D environment. This issue is similar to the classical Midas-touch problem. Gestures should therefore be designed to avoid accidental activation while remaining simple to perform and easy to detect.

Figure 5. The detection of pinching gestures above the projector (left) in our binarized camera image (right). Red ellipses mark the points where pinching was detected.

Since our projector-aligned camera images the entire dome, it is difficult to tell when a user is actively engaged with the system and when they are simply watching or interacting with others in the dome. We require a simple and reliable way to detect when interactions begin and end (the equivalent of a mouse click in a standard user interface). We therefore chose the pinching gesture (from [5]) as the basic unit of interaction: the camera sees two fingers of a hand coming together to form a small hole (Figure 5). This lets the user literally pinch the content and move it around to follow the hand, or introduce two or more pinches to zoom in and out, similar to standard multi-touch interactions on interactive surfaces; a sketch of such hole-based detection follows below.

Figure 6. Using a pinching gesture to interact with the projected content. The user is also wearing a headset microphone.
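In [5], pinches are found as holes in the binarized hand silhouette, which is also what Figure 5 depicts. A minimal sketch of that idea, applied to the binary image from the loop above using OpenCV contour hierarchies; the hole-area bounds are illustrative assumptions, not values from the paper:

```python
import cv2
import numpy as np

# Illustrative hole-size bounds in pixels^2, not values from the paper.
MIN_HOLE_AREA = 30
MAX_HOLE_AREA = 2000

def find_pinches(binary: np.ndarray) -> list[tuple[int, int]]:
    """Return centers of small holes in the hand silhouette: pinch candidates."""
    # RETR_CCOMP builds a two-level hierarchy: outer blobs and their holes
    # (OpenCV 4 return signature).
    contours, hierarchy = cv2.findContours(
        binary, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE)
    pinches = []
    if hierarchy is None:
        return pinches
    for contour, (_, _, _, parent) in zip(contours, hierarchy[0]):
        if parent == -1:
            continue  # an outer contour (the hand itself), not a hole
        area = cv2.contourArea(contour)
        if MIN_HOLE_AREA <= area <= MAX_HOLE_AREA:
            m = cv2.moments(contour)
            if m["m00"] > 0:  # centroid of the hole marks the pinch point
                pinches.append((int(m["m10"] / m["m00"]),
                                int(m["m01"] / m["m00"])))
    return pinches

def zoom_factor(prev: list[tuple[int, int]], curr: list[tuple[int, int]]) -> float:
    """With two tracked pinches, the ratio of inter-pinch distances gives a
    multi-touch-style zoom factor (1.0 means no change)."""
    d0 = float(np.hypot(*np.subtract(prev[0], prev[1])))
    d1 = float(np.hypot(*np.subtract(curr[0], curr[1])))
    return d1 / d0 if d0 > 0 else 1.0
```

Dragging follows a single pinch point from frame to frame; when a second pinch appears, the change in inter-pinch distance drives zoom, mirroring two-finger surface gestures.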

We extended this basic functionality with speech recognition in combination with a specific hand gesture: the user may put two hands together (making, in effect, a larger pinch or hole) and then speak a verbal command, which in turn switches visualization modes.

Conclusions and Future Work
Pinch-the-Sky Dome showcases how simple gestural interactions can greatly enhance an immersive experience, and how large wide-field-of-view displays provide an immersive perspective on standard, widely available data. The inspiration for our work comes from the early work of Wellner [4] and Pinhanez [3], who imagined many interactive surfaces in the environment adapting to users and their context. While Pinhanez [3] explored similar ideas with a steerable projector, that approach could not project on several surfaces in the environment simultaneously, which ours can. However, the limited brightness and resolution of today's projectors prevent us from fully realizing this vision without an enclosed and perfectly dark room. Ultimately, we would like to place our projector-camera setup in any room and use any surface (walls, tables, couches, etc.) for both projection and interaction, making on-demand ubiquitous interactive surfaces a reality.

Acknowledgements
We would like to thank Jonathan Fay and the World Wide Telescope team, and Mike Foody and Global Imagination, Inc.

References
[1] Benko, H. (2009). Beyond Flat Surface Computing: Challenges of Depth-Aware and Curved Interfaces. In Proceedings of ACM Multimedia '09, pp. 935-944.
[2] Benko, H., Wilson, A., and Balakrishnan, R. (2008). Sphere: Multi-Touch Interactions on a Spherical Display. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST), pp. 77-86.
[3] Pinhanez, C. S. (2001). The Everywhere Displays Projector: A Device to Create Ubiquitous Graphical Interfaces. In Proceedings of the International Conference on Ubiquitous Computing (UbiComp), pp. 315-331.
[4] Wellner, P. (1993). Interacting with Paper on the DigitalDesk. Communications of the ACM, 36(7), pp. 87-96.
[5] Wilson, A. (2006). Robust Computer Vision-Based Detection of Pinching for One and Two-Handed Gesture Input. In Proceedings of the ACM Symposium on User Interface Software and Technology (UIST), pp. 255-258.