Investigating Gestures on Elastic Tabletops


Dietrich Kammer, Thomas Gründer, Fabian Göbel, Rainer Groh
Chair of Media Design, Technische Universität Dresden
01062 Dresden, Germany
{dietrich.kammer, thomas.gruender, fabian.goebel, rainer.groh}@tu-dresden.de

Abstract
This work in progress investigates gestures on tabletops with elastic displays that allow temporary deformations of the surface. While tabletops with rigid interactive surfaces have long been on the research agenda of tangible, embedded, and embodied interaction, we review novel systems that exploit the third dimension offered by tabletops with elastic surfaces. In addition, we propose a tentative interaction syntax. In a user study, we compare push gestures on elastic tabletops with swipe gestures on a multi-touch display.

Author Keywords
Elastic displays; tabletops; haptic interaction; natural interaction; elastic gestures

ACM Classification Keywords
H.5.2 [Information interfaces and presentation (e.g., HCI)]: User Interfaces: Input Devices and Strategies, Interaction Styles.

General Terms
Human Factors; Interaction Design; Elastic Displays

Copyright is held by the author/owner(s). TEI '14, Feb 16-19, 2014, Munich, Germany.

Introduction
Elastic displays address a challenge that current tabletop systems are facing: the lack of differentiated haptic

feedback. While very direct and natural interaction with 2D content is possible on standard rigid multi-touch surfaces, they provide only uniform feedback to the users' hands. Elastic tabletops promise to remedy this lack of tactile and spatial experience. However, unlike displays that can be permanently deformed, they maintain a consistent shape and thus allow viewing an interface in a standard, well-established way. In contrast to gestures performed in mid-air, elastic displays add a third dimension to touch while maintaining haptic feedback.

First, we give a brief review of related work in the domain of elastic displays, distinguishing it from other research in the field of deformable displays (cp. [1]). Second, we propose a tentative interaction syntax for elastic gestures. Our pilot study investigates whether an elastic display holds advantages over a multi-touch tabletop and shows that a number of challenges remain to establish this novel type of interactive surface.

Related Work
Recently, researchers have started to focus on interactive surfaces other than flat and rigid ones [1, 12]. While there is a considerable body of work in the literature concerning malleable displays [5, 10] and actuated displays [9], knowledge about elastic displays that feature only temporary deformations is scarce [2, 6]. One of the first elastic displays presented is the Khronos projector by Cassinelli and Ishikawa [3], a vertical installation of a deformable tissue that is used to fast-forward to a certain position in a video when touched and depressed. The deformable workspace provides a comprehensive system for manipulating virtual 3D objects on vertical elastic displays [13]. However, in this paper we focus on horizontal tabletop systems with elastic displays. An elastic display that allows varied haptic feedback is MudPad [7]; although it is used as a horizontal display, its size is relatively small.
One of the first published systems exhibiting a tabletop with an elastic display is DepthTouch [11]. The Obake display is a prototype devised at the MIT Media Lab that demonstrates various interactions with a silicone-based screen [4]. Its proposed interaction language features various combinations of intruding and extruding the elastic display, which are addressed in the next section.

Interaction Syntax for Elastic Gestures
Elastic displays afford novel types of interaction using the hands. Although the gestures themselves are not elastic, we use the term elastic gestures. A systematic overview of interaction techniques with elastic displays is still lacking in the literature. For multi-touch gestures, various high-level abstractions exist, e.g., based on a semiotic analysis [8]. As a prerequisite for such future work in the domain of elastic displays, we propose a tentative interaction syntax for elastic gestures.

We identified three main categories: push, pull, and touch (see Table 1). Pushing the surface produces valleys and requires a certain amount of strength from the user, depending on the depth of the push (see Figure 1). Pulling an elastic surface requires not only strength but also a certain amount of training and dexterity (see Figure 1); in our experience, most new users have difficulties performing this interaction. The third category is touch interaction, which is comparable to standard multi-touch interaction. In theory, all multi-touch gestures possible on a rigid planar surface can also be performed on an elastic screen.

Figure 1: Interacting with an elastic display using push and pull gestures.

However, there is another parameter: pressure. A touch gesture can be performed with different degrees of force and thus at different depth levels. Difficulties in dragging on an elastic display have been described in [2].

Table 1: Classification of interaction syntax for elastic displays

                      | PUSH                   | PULL                      | TOUCH
Object manipulation   | Indirect & direct      | Indirect & direct         | Direct
Static                | Hit                    | Flip                      | Tap, Hold
Dynamic               | Multiple hands pushing | Multiple hands pulling    | Combination of techniques;
                      | the surface            | the surface; joining &    | multi-touch gestures with
                      |                        | splitting of pulled areas | different pressure

Another important issue is the way objects are manipulated. While touch interaction is commonly based on direct manipulation (i.e., touching) of virtual objects on the surface, pushing and pulling an elastic display can also be used to indirectly collect and disperse objects by exploiting natural physics.

All of the main categories of the proposed elastic gestures can be of either static or dynamic nature (cp. [14]). The main property of static gestures is that no continuous movement of the user is necessary. For pushing, a simple hit (or bump) on the elastic surface causes a slight vibration, similar to ripples on a water surface. In the case of pulling, letting the surface flip down by quickly releasing the cloth causes a similar effect. For standard touch interaction, tap and hold gestures are considered static (cp. [14]); in contrast to hitting and flipping an elastic display, these gestures can only receive visual feedback from the application. Static gestures are often used to perform selections, activate menus, or confirm actions. Dynamic gestures additionally employ the movement while pushing or pulling the cloth in order to manipulate objects. The depth of a push and the height of a pull are important parameters that an application can use for different purposes.
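The push/pull/touch distinction above can be operationalized as a threshold on the surface's displacement relative to its resting plane. The following sketch is purely illustrative: the class names, thresholds, and sensor interface are our assumptions, not part of the authors' system, and a real implementation would calibrate these values per device.

```java
// Illustrative sketch (not the authors' implementation): classifying an
// elastic-display gesture by the signed displacement of the surface at the
// contact point. Negative = pushed down (valley), positive = pulled up (peak).
public class ElasticGestureClassifier {

    public enum Category { PUSH, PULL, TOUCH }

    // Hypothetical dead zone absorbing sensor noise around the neutral plane.
    private static final double TOUCH_THRESHOLD_MM = 5.0;

    public static Category classify(double displacementMm) {
        if (displacementMm < -TOUCH_THRESHOLD_MM) return Category.PUSH; // valley
        if (displacementMm > TOUCH_THRESHOLD_MM)  return Category.PULL; // peak
        return Category.TOUCH; // near the resting plane: ordinary touch
    }

    // A gesture is "dynamic" if the contact point keeps moving; static
    // gestures such as hit, flip, tap, and hold stay in place (cf. Table 1).
    public static boolean isDynamic(double movedDistanceMm) {
        return movedDistanceMm > 10.0; // hypothetical movement threshold
    }

    public static void main(String[] args) {
        System.out.println(classify(-30.0)); // deep push -> PUSH
        System.out.println(classify(25.0));  // lifted cloth -> PULL
        System.out.println(classify(1.0));   // flat contact -> TOUCH
    }
}
```

The signed-displacement representation keeps the three categories mutually exclusive per contact point, which matches the columns of Table 1.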
A single user can employ two hands to push or pull the cloth at multiple locations (see Figure 1). With multiple users, more than two areas can be manipulated in parallel (see Figure 1). It is conceivable that multiple pulled areas can be brought together or pulled away from each other (cp. [4]). As stated before, all dynamic multi-touch gestures are also conceivable on an elastic tabletop. Moreover, a combination of pulling, pushing, and touch interaction is possible. While considerable knowledge exists on multi-touch gestures [8, 14], there is little or no evaluation of how they are performed on an elastic display. Furthermore, the diverse pushing and pulling gestures possible on an elastic tabletop have not been thoroughly investigated yet (cp. [2]).

Study on Memorability and Learnability
We compared interaction on a standard rigid multi-touch surface with a tabletop using an elastic display. In our pilot study we investigated the following hypotheses:

[H1] Interaction with an elastic display supports finding and memorizing different layers of information (memorability).

[H2] Interaction with metaphors based on depth levels is easier to learn and

understand using an elastic display (learnability).

Participants
There were 21 participants (6 female), aged 24 to 36 years (mean ≈ 29, SD = 2.97). All but one were familiar with multi-touch tabletops (7 or more hours of experience), and only 2 were similarly experienced with an elastic display. Most of the participants (14) knew how to interact with the elastic tabletop (1-2 hours of experience) but were not proficient with it.

Apparatus
We used a commercial multi-touch tabletop measuring 115 x 85 x 95 cm (width, length, height) and our custom-built elastic tabletop measuring 115 x 85 x 107 cm. Both test applications were implemented in Java, using the Processing libraries for visualization.

Figure 2: Study setup with elastic tabletop (including internal setup of projector, Kinect sensor, and mirror) and multi-touch tabletop system.

Procedure
The task in the study was to manipulate two stacks of images in order to find a given target image of a fruit, similar to the popular memory card game (see Figure 2). While one image showed the target, the other two stacks had to be manipulated until the target was found. Since stacks in reality also have a certain depth, they should provide an appropriate interaction metaphor to compare multi-touch and elastic tabletop interaction. Only push gestures from our interaction syntax were used on the elastic tabletop.

There were 8 images in each stack, and all 8 fruit cards had to be found in each of the 5 experiment blocks, with an intermediate break of 15 seconds between blocks. Hence, every participant had to solve a total of 40 trials on each system and fill out a survey afterwards. Before starting the experiment, every participant completed 5 training trials.
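On the elastic tabletop, the stack metaphor suggests mapping push depth onto a position within the image stack. The paper does not report the actual mapping; the following sketch shows one plausible linear mapping, with the depth range chosen purely for illustration.

```java
// Hypothetical mapping from measured push depth to an index in an image
// stack of 8 cards, as one might implement the study task. The maximum
// depth is an assumption; the paper does not report the actual values.
public class DepthToStackIndex {

    private static final int STACK_SIZE = 8;         // 8 fruit images per stack
    private static final double MAX_DEPTH_MM = 80.0; // assumed deepest push

    // Clamp the measured depth and map it linearly onto [0, STACK_SIZE - 1].
    public static int indexForDepth(double depthMm) {
        double clamped = Math.max(0.0, Math.min(MAX_DEPTH_MM, depthMm));
        int index = (int) (clamped / MAX_DEPTH_MM * STACK_SIZE);
        return Math.min(index, STACK_SIZE - 1); // deepest push = last card
    }

    public static void main(String[] args) {
        System.out.println(indexForDepth(0.0));  // resting surface -> 0
        System.out.println(indexForDepth(45.0)); // mid push -> 4
        System.out.println(indexForDepth(80.0)); // full push -> 7
    }
}
```

Such a continuous depth-to-index mapping is one way the "different layers of information" of hypothesis H1 could be exposed to the user; tracking jitter, as reported later, directly perturbs the computed index.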
On the multi-touch display, we also aimed for continuous manipulation gestures and hence implemented swipe gestures to the left and right to browse the image stacks. In both cases, the image stacks were thus explored with an interaction technique appropriate to the technology. Half of the participants started with the elastic tabletop and the other half with the multi-touch tabletop, avoiding a bias towards one of the tabletops in the data. The items in the stacks were ordered randomly on the left and on the right side, but the order was fixed across both systems.

Data Collection
We collected task completion times and administered a NASA-TLX questionnaire to identify task difficulties (see Figure 3). A custom survey elicited subjective opinions regarding the two systems.

Results
In general, the task was solved faster on the multi-touch surface (t-test, mean = 1.2 seconds faster, t(20) = 4.14, p < .001) and was judged more efficient in the survey. However, the elastic interaction was regarded as easier to understand and to learn (p = .012). In both cases, the error rate was zero. The NASA-TLX results supported these findings: the multi-touch system was rated easier to use (p = .005) and less physically demanding (p < .001). Furthermore, we found a tendency towards higher frustration with the elastic tabletop (p < .010). The data also showed that participants with equal experience in both systems were less frustrated with the elastic tabletop than with the multi-touch system. For this group, the task itself was regarded as easier to solve with the elastic tabletop, although it was rated as physically more demanding and slower than on the multi-touch tabletop.
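The completion-time comparison is a paired t-test over the 21 per-participant differences (elastic minus multi-touch), with t = mean(d) / (sd(d) / sqrt(n)) and df = n - 1 = 20. The sketch below shows the computation; the sample differences in main are invented for illustration and are not the study's data.

```java
// Sketch of the paired t-test behind the reported t(20) = 4.14:
// t = mean(d) / (sd(d) / sqrt(n)), where d are per-participant
// completion-time differences. The values in main are invented.
public class PairedTTest {

    public static double tStatistic(double[] diffs) {
        int n = diffs.length;
        double mean = 0;
        for (double d : diffs) mean += d;
        mean /= n;

        double ssq = 0; // sum of squared deviations from the mean
        for (double d : diffs) ssq += (d - mean) * (d - mean);
        double sd = Math.sqrt(ssq / (n - 1)); // sample standard deviation

        return mean / (sd / Math.sqrt(n)); // df = n - 1
    }

    public static void main(String[] args) {
        // Invented per-participant differences in seconds (elastic slower).
        double[] diffs = {1.5, 0.8, 2.0, 1.1, 0.9, 1.6, 1.3, 0.7, 1.8, 1.2,
                          1.0, 1.4, 0.6, 2.1, 1.7, 0.5, 1.9, 1.2, 0.8, 1.5, 1.0};
        System.out.printf("t(%d) = %.2f%n", diffs.length - 1, tStatistic(diffs));
    }
}
```

Because every participant used both systems, the paired form is the appropriate test here; an unpaired t-test would ignore the within-participant correlation of completion times.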

While our study showed first indications that an elastic tabletop is easier to learn and understand, it also showed that the multi-touch interaction technique was faster. Hence, we found support for our second hypothesis, but no evidence that our first hypothesis holds. One reason could be the study design: the multi-touch interaction may also have led to memorization of item positions, albeit horizontal positions rather than depth. The frustration with the elastic tabletop reported by the participants stems mainly from jitter in the tracking system, which resulted in a significantly higher number of card turns on the elastic tabletop. With more robust tracking hardware, we hope to remedy this source of frustration; improved precision of the depth interaction should then close the gap to multi-touch technology.

Figure 3: Mean completion times of the 21 study participants (error bars show standard deviations) and results of the NASA-TLX questionnaire (mean values across 21 participants, error bars show standard deviations).

Conclusions and Future Work
In this report on our work in progress, we proposed a tentative interaction syntax for elastic tabletops. A thorough and systematic evaluation of all possible gestures is the subject of future work. While our pilot study does not yet show significant benefits over standard rigid multi-touch surfaces, we believe that more robust hardware and suitable applications will show that a more natural and intuitive interaction is possible with elastic tabletops.

Acknowledgements
Thomas Gründer has been supported by the European Social Fund and the Free State of Saxony.

References
[1] Alexander, J., Brotman, R., Holman, D., Younkin, A., Vertegaal, R., Kildal, J., Lucero, A. A., Roudaut, A., and Subramanian, S. Organic experiences: (re)shaping interactions with deformable displays. In CHI '13 Extended Abstracts, ACM 2013, 3171-3174.
[2] Bacim, F., Sinclair, M., and Benko, H. Understanding touch selection accuracy on flat and hemispherical deformable surfaces. In Proc. of the 2013 Graphics Interface Conference, GI '13, Canadian Information Processing Society 2013, 197-204.
[3] Cassinelli, A., and Ishikawa, M. Khronos projector. In ACM SIGGRAPH 2005 Emerging Technologies, 2005.
[4] Dand, D. Obake: Interactions with a 2.5D elastic display, 2013.
[5] Follmer, S., Johnson, M., Adelson, E., and Ishii, H. deForm: an interactive malleable surface for capturing 2.5D arbitrary objects, tools and touch. In Proc. of the 24th annual UIST, ACM 2011, 527-536.
[6] Gruender, T., Kammer, D., Brade, M., and Groh, R. Towards a design space for elastic displays. In ACM SIGCHI CHI Workshop: Displays Take New Shape: An Agenda for Future Interactive Surfaces (Paris, France, 2013).
[7] Jansen, Y., Karrer, T., and Borchers, J. MudPad: tactile feedback for touch surfaces. In CHI '11 Extended Abstracts, ACM 2011, 323-328.
[8] Kammer, D., Wojdziak, J., Keck, M., Groh, R., and Taranko, S. Towards a formalization of multi-touch gestures. In ACM ITS, ACM 2010, 49-58.
[9] Leithinger, D., and Ishii, H. Relief: a scalable actuated shape display. In Proc. of the fourth TEI, ACM 2010, 221-222.
[10] Matoba, Y., Sato, T., Takahashi, N., and Koike, H. ClaytricSurface: an interactive surface with dynamic softness control capability. In ACM SIGGRAPH 2012 Emerging Technologies, ACM 2012, 6:1.
[11] Peschke, J., Göbel, F., Gründer, T., Keck, M., Kammer, D., and Groh, R. DepthTouch: an elastic surface for tangible computing. In Proc. of the International Working Conference on Advanced Visual Interfaces, ACM 2012, 770-771.
[12] Steimle, J., Benko, H., Cassinelli, A., Ishii, H., Leithinger, D., Maes, P., and Poupyrev, I. Displays take new shape: an agenda for future interactive surfaces. In CHI '13 Extended Abstracts, ACM 2013, 3283-3286.
[13] Watanabe, Y., Cassinelli, A., Komuro, T., and Ishikawa, M. The deformable workspace: A membrane between real and virtual space. In 3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems (TABLETOP 2008), 145-152.
[14] Wobbrock, J. O., Morris, M. R., and Wilson, A. D. User-defined gestures for surface computing. In Proc. of the SIGCHI Conference on Human Factors in Computing Systems, ACM 2009, 1083-1092.