Investigating Gestures on Elastic Tabletops
Dietrich Kammer, Thomas Gründer, Fabian Göbel, Rainer Groh
Chair of Media Design, Technische Universität Dresden, Dresden, Germany

Abstract
This work in progress investigates gestures on tabletops with elastic displays that allow temporary deformations of the surface. While tabletops with rigid interactive surfaces have been on the research agenda of tangible, embedded, and embodied interaction for a considerable amount of time, we review novel systems that exploit the third dimension offered by tabletops with elastic surfaces. In addition, we propose a tentative interaction syntax. In a user study, we compare push gestures on elastic tabletops with swipe gestures on a multi-touch display.

Author Keywords
Elastic displays; tabletops; haptic interaction; natural interaction; elastic gestures

ACM Classification Keywords
H.5.2 [Information interfaces and presentation (e.g., HCI)]: User Interfaces: Input Devices and Strategies, Interaction Styles.

General Terms
Human Factors; Interaction Design; Elastic Displays

Copyright is held by the author/owner(s). TEI '14, Feb 16-19, 2014, Munich, Germany.

Introduction
Elastic displays address a challenge that current tabletop systems face: the lack of differentiated haptic
feedback. While very direct and natural interaction with 2D content is possible on standard rigid multi-touch surfaces, they provide only uniform feedback to the users' hands. Elastic tabletops promise to remedy this lack of tactile and spatial experience. Unlike displays that can be permanently deformed, they also maintain consistency of shape and thus allow an interface to be viewed in a standard and well-established way. In contrast to gestures performed in mid-air, elastic displays add a third dimension to touch while maintaining haptic feedback. First, we give a brief review of related work in the domain of elastic displays, distinguishing the domain from other research in the field of deformable displays (cp. [1]). Second, we propose a tentative interaction syntax for elastic gestures. Our pilot study investigates whether an elastic display holds advantages over a multi-touch tabletop and shows that a number of challenges remain to establish this novel type of interactive surface.

Related Work
Recently, researchers have started to focus on interactive surfaces other than flat and rigid ones [1, 12]. While there is a considerable body of work in the literature concerning malleable displays [5, 10] and actuated displays [9], knowledge about elastic displays that feature only temporary deformations is scarce [2, 6]. One of the first elastic displays presented is the Khronos projector by Cassinelli and Ishikawa [3], a vertical installation of a deformable tissue that is used to fast-forward to a certain position in a video when touched and depressed. The deformable workspace provides a comprehensive system for manipulating virtual 3D objects on vertical elastic displays [13]. In this paper, however, we focus on horizontal tabletop systems with elastic displays. An elastic display that allows varied haptic feedback is MudPad [7]. Although it is used as a horizontal display, its size is relatively small.
One of the first published tabletop systems with an elastic display is DepthTouch [11]. The Obake display is a prototype devised at the MIT Media Lab that demonstrates various interactions with a silicone-based screen [4]. Its proposed interaction language features various combinations of intruding and extruding the elastic display, which are addressed in the next section.

Interaction Syntax for Elastic Gestures
Elastic displays afford novel types of interaction using the hands. Although the gestures themselves are not elastic, we use the term elastic gestures. A systematic overview of interaction techniques with elastic displays is still lacking in the literature. For multi-touch gestures, various high-level abstractions exist, e.g. based on a semiotic analysis [8]. As a prerequisite for such future work in the domain of elastic displays, we propose a tentative interaction syntax for elastic gestures. We identified three main categories: push, pull, and touch (see Table 1). Pushing the surface produces valleys and requires a certain amount of strength from the user, depending on the depth of the push (see Figure 1). Pulling an elastic surface requires not only strength but also a certain amount of training and dexterity (see Figure 1); in our experience, most new users have difficulties performing this interaction. The third category is touch interaction, which is comparable to standard multi-touch interaction. In theory, all multi-touch gestures possible on a rigid planar surface can also be performed on an elastic screen.
Figure 1: Interacting with an elastic display using push and pull gestures.

However, there is another parameter: pressure. A touch gesture can be performed with different degrees of force and thus at different depth levels. Difficulties in dragging on an elastic display have been described in [2].

Table 1: Classification of interaction syntax for elastic displays

Categories           PUSH                             PULL                                 TOUCH
Object manipulation  Indirect & direct                Indirect & direct                    Direct
Static               Hit                              Flip                                 Tap, hold
Dynamic              Multiple hands pushing surface   Multiple hands pulling surface;      Combination of techniques;
                                                      joining & splitting of pulled areas  multi-touch gestures with
                                                                                           different pressure

Another important issue is the way objects are manipulated. While touch interaction is commonly based on direct manipulation (i.e. touching) of virtual objects on the surface, pushing and pulling an elastic display can also be used to indirectly collect and disperse objects by exploiting natural physics. All of the main categories of the proposed elastic gestures can be of either static or dynamic nature (cp. [14]). The main property of static gestures is that no continuous movement of the user is necessary. For pushing, a simple hit (or bump) on the elastic surface causes a slight vibration, similar to ripples on a water surface. In the case of pulling, letting the surface flip down by quickly letting go of the cloth causes a similar effect. In the case of standard touch interaction, tap or hold gestures are considered static (cp. [14]); in contrast to hitting and flipping an elastic display, only visual feedback can be provided by the application. Static gestures are often used to perform selections, activate menus, or confirm actions. Dynamic gestures also address the movement while pushing or pulling the cloth in order to manipulate objects. The depth of a push or the height of a pull are important parameters that an application can use for different purposes.
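To make the syntax concrete, the classification of push, pull, and touch by deformation depth, together with the static/dynamic distinction, can be sketched as a simple classifier. The Python sketch below is our own illustration, not part of the prototype described in this paper; all names and thresholds are assumptions:

```python
from dataclasses import dataclass

# Signed deformation relative to the resting surface (all values in mm):
# positive = pushed below the plane, negative = pulled above it.
TOUCH_THRESHOLD_MM = 5.0   # assumed tolerance below which a contact counts as touch
MOTION_THRESHOLD_MM = 2.0  # assumed per-frame movement separating static from dynamic

@dataclass
class GestureSample:
    depth_mm: float     # signed deformation at the contact point
    movement_mm: float  # contact movement since the previous frame

def classify(sample: GestureSample) -> tuple:
    """Return (category, nature) following the push/pull/touch syntax."""
    if sample.depth_mm > TOUCH_THRESHOLD_MM:
        category = "push"
    elif sample.depth_mm < -TOUCH_THRESHOLD_MM:
        category = "pull"
    else:
        category = "touch"
    nature = "dynamic" if sample.movement_mm > MOTION_THRESHOLD_MM else "static"
    return category, nature

print(classify(GestureSample(depth_mm=40.0, movement_mm=0.5)))    # ('push', 'static'), i.e. a hit
print(classify(GestureSample(depth_mm=-30.0, movement_mm=12.0)))  # ('pull', 'dynamic')
```

In this reading, the pressure parameter of touch gestures is simply a small depth value, so the same signed-depth axis covers all three categories.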
A single user can employ two hands to push or pull the cloth at multiple locations (see Figure 1). With multiple users, more than two areas can be manipulated in parallel (see Figure 1). It is conceivable that multiple pulled areas can be brought together or pulled away from each other (cp. [4]). As stated before, all dynamic multi-touch gestures are also conceivable on an elastic tabletop. Moreover, a combination of pulling, pushing, and touch interaction is possible. While considerable knowledge exists on multi-touch gestures [8, 14], there is little or no evaluation of how they are performed on an elastic display. Furthermore, the diverse pushing and pulling gestures possible on an elastic tabletop have not been thoroughly investigated yet (cp. [2]).

Study on Memorability and Learnability
We compared interaction on a standard rigid multi-touch surface with a tabletop using an elastic display. In our pilot study we investigated the following hypotheses: [H1] Interaction with an elastic display supports finding and memorizing different layers of information (memorability). [H2] Interaction with metaphors based on depth levels is easier to learn and
understand using an elastic display (learnability).

Participants
There were 21 participants (6 female), aged 24 to 36 years with a mean of about 29 (SD = 2.97). All but one were familiar with multi-touch tabletops (7 or more hours of experience) and only 2 with interacting on an elastic display. Most of the participants (14) knew how to interact with the elastic tabletop (1-2 hours of experience) but were not professionals in dealing with it.

Apparatus
We used a commercial multi-touch tabletop with a physical size of 115 x 85 x 95 cm (width, length, height) and our custom-built elastic tabletop with a physical size of 115 x 85 x 107 cm (width, length, height). Both test applications were implemented using Java and the Processing libraries for visualization.

Figure 2: Study setup with elastic tabletop (including internal setup of projector, Kinect sensor, and mirror) and multi-touch tabletop system.

Procedure
Participants were presented with the task of finding and matching images of fruits (see Figure 2). While one image showed the target, the other two had to be manipulated until the target was found. There were 8 images in each stack, and all of the 8 fruit cards had to be found in each of the 5 experiment blocks, with a 15-second break between blocks. Hence, every participant had to solve a total of 40 trials on each system and fill out a survey afterwards. Before starting the experiment, every participant completed 5 training trials. The task was to manipulate two stacks of images in order to find a given target image, similar to the popular memory card game. Since stacks in reality also have a certain depth, stacks should provide an appropriate interaction metaphor to compare multi-touch and elastic tabletop interaction. Only push gestures from our interaction syntax were used on the elastic tabletop.
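Since the elastic tabletop tracks the cloth with a Kinect depth sensor, multiple simultaneously pushed or pulled areas can in principle be segmented directly from the depth image. The following sketch of such a segmentation step is our own illustration under assumed thresholds, not the authors' implementation:

```python
import numpy as np
from scipy import ndimage

def find_deformed_regions(depth_map, rest_depth, min_deform_mm=20.0):
    """Label connected areas where the cloth deviates from its resting shape.

    depth_map and rest_depth are 2D arrays of sensor distances in mm.
    Returns (labels, count): labels assigns a region id to each deformed pixel.
    """
    deformation = depth_map - rest_depth        # positive = pushed away from the sensor
    mask = np.abs(deformation) > min_deform_mm  # covers both pushed and pulled areas
    labels, count = ndimage.label(mask)         # connected-component labeling
    return labels, count

# Two hands pushing at separate spots on a 10 x 10 patch of cloth:
rest = np.full((10, 10), 1000.0)   # flat cloth, 1 m from the sensor
frame = rest.copy()
frame[1:3, 1:3] += 50.0            # first hand pushes 5 cm deep
frame[6:9, 6:9] += 80.0            # second hand pushes 8 cm deep
labels, count = find_deformed_regions(frame, rest)
print(count)  # 2
```

Per-region depth (e.g. the maximum deformation inside each labeled area) would then feed the depth parameter of the interaction syntax.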
On the multi-touch display, we also aimed for continuous manipulation gestures and hence implemented swipe gestures to the left and right to browse the image stacks. In both cases the image stacks were explored with an interaction technique appropriate to the technology. Half of the participants started with the elastic tabletop and the other half with the multi-touch tabletop, avoiding a bias towards one of the tabletops in the data. The items in the stacks were ordered randomly on the left and on the right side, but were fixed across both systems.

Data Collection
We collected task completion times and administered a NASA-TLX questionnaire to identify task difficulties (see Figure 3). A custom survey elicited subjective opinions regarding the two systems.

Results
In general, the task was solved faster on the multi-touch surface (t-test, mean = 1.2 seconds faster, t(20) = 4.14, p < .001) and was judged more efficient in the survey. However, the elastic interaction was regarded as easier to understand and to learn (p = .012). In both cases, the error rate was the same (zero). The NASA-TLX survey supported the findings in the data. The multi-touch system was rated easier to use (p = .005) and less physically demanding (p < .001). Furthermore, we found a tendency towards higher frustration with the elastic tabletop (p < .010). The data also showed that participants with equal experience in both systems were less frustrated with the elastic tabletop than with the multi-touch system. For this group, the task itself was regarded as easier to solve with the elastic tabletop. However, it was rated as physically more demanding and slower to solve than on the multi-touch tabletop.
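The reported comparison corresponds to a paired t-test over per-participant completion times. The sketch below uses fabricated numbers purely for illustration; only the test procedure, not the data, mirrors the study:

```python
import numpy as np
from scipy import stats

# All completion times below are FABRICATED for illustration; the study
# itself reports multi-touch about 1.2 s faster, t(20) = 4.14, p < .001.
rng = np.random.default_rng(7)
n_participants = 21
elastic = rng.normal(60.0, 5.0, n_participants)              # per-participant times (s)
multitouch = elastic - rng.normal(1.2, 1.0, n_participants)  # ~1.2 s faster on average

t, p = stats.ttest_rel(elastic, multitouch)  # paired t-test, df = n - 1 = 20
print(f"t({n_participants - 1}) = {t:.2f}, p = {p:.4f}")
```

A paired (rather than independent) test is appropriate here because every participant completed the task on both systems.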
While our study showed first indications that learning and understanding an elastic tabletop is easier, it also showed that the multi-touch interaction technique was faster. Hence, we found support for our second hypothesis, but no evidence that our first hypothesis holds true. One reason could be the study design: the multi-touch interaction may have led to a memorization of item positions as well, not in depth but in horizontal position. The frustration with the elastic tabletop reported by the participants stems mainly from jitter in the tracking system, which resulted in a significantly higher number of card turns on the elastic tabletop. With more robust tracking hardware, we hope to remedy this source of frustration. Likewise, the precision of the depth interaction should then close the gap to multi-touch technology.

Figure 3: Mean completion times for the 40 trials in seconds per study participant (error bars show standard deviations) and results of the NASA-TLX questionnaire (mean values across the 21 participants, error bars show standard deviations; subscales: mental demand, physical demand, temporal demand, performance, effort, frustration; rating 0 = lowest, 20 = highest).
Conclusions and Future Work
In this report on our work in progress, we proposed a tentative interaction syntax for elastic tabletops. A thorough and systematic evaluation of all possible gestures is the subject of future work. While our pilot study does not yet show significant benefits over standard rigid multi-touch surfaces, we believe that more robust hardware and suitable applications will show that a more natural and intuitive interaction is possible with elastic tabletops.

Acknowledgements
Thomas Gründer has been supported by the European Social Fund and the Free State of Saxony.

References
[1] Alexander, J., Brotman, R., Holman, D., Younkin, A., Vertegaal, R., Kildal, J., Lucero, A. A., Roudaut, A., and Subramanian, S. Organic experiences: (re)shaping interactions with deformable displays. In CHI '13 Extended Abstracts, ACM, 2013.
[2] Bacim, F., Sinclair, M., and Benko, H. Understanding touch selection accuracy on flat and hemispherical deformable surfaces. In Proc. of the 2013 Graphics Interface Conference (GI '13), Canadian Information Processing Society, 2013.
[3] Cassinelli, A., and Ishikawa, M. Khronos projector. In ACM SIGGRAPH 2005 Emerging Technologies.
[4] Dand, D. Obake: Interactions with a 2.5D elastic display.
[5] Follmer, S., Johnson, M., Adelson, E., and Ishii, H. deForm: an interactive malleable surface for capturing 2.5D arbitrary objects, tools and touch. In Proc. of the 24th annual UIST, ACM, 2011.
[6] Gruender, T., Kammer, D., Brade, M., and Groh, R. Towards a design space for elastic displays. In ACM SIGCHI CHI Workshop: Displays Take New Shape: An Agenda for Future Interactive Surfaces (Paris, France, 2013).
[7] Jansen, Y., Karrer, T., and Borchers, J. MudPad: tactile feedback for touch surfaces. In CHI '11 Extended Abstracts, ACM, 2011.
[8] Kammer, D., Wojdziak, J., Keck, M., Groh, R., and Taranko, S. Towards a formalization of multi-touch gestures. In ACM ITS, ACM, 2010.
[9] Leithinger, D., and Ishii, H.
Relief: a scalable actuated shape display. In Proc. of the fourth TEI, ACM, 2010.
[10] Matoba, Y., Sato, T., Takahashi, N., and Koike, H. ClaytricSurface: an interactive surface with dynamic softness control capability. In ACM SIGGRAPH 2012 Emerging Technologies, ACM, 2012, 6:1.
[11] Peschke, J., Göbel, F., Gründer, T., Keck, M., Kammer, D., and Groh, R. DepthTouch: an elastic surface for tangible computing. In Proc. of the International Working Conference on Advanced Visual Interfaces (AVI), ACM, 2012.
[12] Steimle, J., Benko, H., Cassinelli, A., Ishii, H., Leithinger, D., Maes, P., and Poupyrev, I. Displays take new shape: an agenda for future interactive surfaces. In CHI '13 Extended Abstracts, ACM, 2013.
[13] Watanabe, Y., Cassinelli, A., Komuro, T., and Ishikawa, M. The deformable workspace: a membrane between real and virtual space. In 3rd IEEE International Workshop on Horizontal Interactive Human Computer Systems (TABLETOP 2008), 2008.
[14] Wobbrock, J. O., Morris, M. R., and Wilson, A. D. User-defined gestures for surface computing. In Proc. of the SIGCHI Conference on Human Factors in Computing Systems (CHI), ACM, 2009.
More informationApplication of 3D Terrain Representation System for Highway Landscape Design
Application of 3D Terrain Representation System for Highway Landscape Design Koji Makanae Miyagi University, Japan Nashwan Dawood Teesside University, UK Abstract In recent years, mixed or/and augmented
More informationProject Multimodal FooBilliard
Project Multimodal FooBilliard adding two multimodal user interfaces to an existing 3d billiard game Dominic Sina, Paul Frischknecht, Marian Briceag, Ulzhan Kakenova March May 2015, for Future User Interfaces
More informationQS Spiral: Visualizing Periodic Quantified Self Data
Downloaded from orbit.dtu.dk on: May 12, 2018 QS Spiral: Visualizing Periodic Quantified Self Data Larsen, Jakob Eg; Cuttone, Andrea; Jørgensen, Sune Lehmann Published in: Proceedings of CHI 2013 Workshop
More informationA Gestural Interaction Design Model for Multi-touch Displays
Songyang Lao laosongyang@ vip.sina.com A Gestural Interaction Design Model for Multi-touch Displays Xiangan Heng xianganh@ hotmail ABSTRACT Media platforms and devices that allow an input from a user s
More informationTwisting Touch: Combining Deformation and Touch as Input within the Same Interaction Cycle on Handheld Devices
Twisting Touch: Combining Deformation and Touch as Input within the Same Interaction Cycle on Handheld Devices Johan Kildal¹, Andrés Lucero², Marion Boberg² Nokia Research Center ¹ P.O. Box 226, FI-00045
More informationSensing Human Activities With Resonant Tuning
Sensing Human Activities With Resonant Tuning Ivan Poupyrev 1 ivan.poupyrev@disneyresearch.com Zhiquan Yeo 1, 2 zhiquan@disneyresearch.com Josh Griffin 1 joshdgriffin@disneyresearch.com Scott Hudson 2
More informationHandMark Menus: Rapid Command Selection and Large Command Sets on Multi-Touch Displays
HandMark Menus: Rapid Command Selection and Large Command Sets on Multi-Touch Displays Md. Sami Uddin 1, Carl Gutwin 1, and Benjamin Lafreniere 2 1 Computer Science, University of Saskatchewan 2 Autodesk
More informationUser Interface Software Projects
User Interface Software Projects Assoc. Professor Donald J. Patterson INF 134 Winter 2012 The author of this work license copyright to it according to the Creative Commons Attribution-Noncommercial-Share
More informationSimplifying Remote Collaboration through Spatial Mirroring
Simplifying Remote Collaboration through Spatial Mirroring Fabian Hennecke 1, Simon Voelker 2, Maximilian Schenk 1, Hauke Schaper 2, Jan Borchers 2, and Andreas Butz 1 1 University of Munich (LMU), HCI
More informationInteraction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application
Interaction Techniques for Immersive Virtual Environments: Design, Evaluation, and Application Doug A. Bowman Graphics, Visualization, and Usability Center College of Computing Georgia Institute of Technology
More informationEvaluation of Spatial Abilities through Tabletop AR
Evaluation of Spatial Abilities through Tabletop AR Moffat Mathews, Madan Challa, Cheng-Tse Chu, Gu Jian, Hartmut Seichter, Raphael Grasset Computer Science & Software Engineering Dept, University of Canterbury
More informationAPPEAL DECISION. Appeal No USA. Tokyo, Japan. Tokyo, Japan. Tokyo, Japan. Tokyo, Japan
APPEAL DECISION Appeal No. 2013-6730 USA Appellant IMMERSION CORPORATION Tokyo, Japan Patent Attorney OKABE, Yuzuru Tokyo, Japan Patent Attorney OCHI, Takao Tokyo, Japan Patent Attorney TAKAHASHI, Seiichiro
More informationACTUI: Using Commodity Mobile Devices to Build Active Tangible User Interfaces
Demonstrations ACTUI: Using Commodity Mobile Devices to Build Active Tangible User Interfaces Ming Li Computer Graphics & Multimedia Group RWTH Aachen, AhornStr. 55 52074 Aachen, Germany mingli@cs.rwth-aachen.de
More informationExploring Surround Haptics Displays
Exploring Surround Haptics Displays Ali Israr Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh, PA 15213 USA israr@disneyresearch.com Ivan Poupyrev Disney Research 4615 Forbes Ave. Suite 420, Pittsburgh,
More informationTouch Feedback in a Head-Mounted Display Virtual Reality through a Kinesthetic Haptic Device
Touch Feedback in a Head-Mounted Display Virtual Reality through a Kinesthetic Haptic Device Andrew A. Stanley Stanford University Department of Mechanical Engineering astan@stanford.edu Alice X. Wu Stanford
More informationCapWidgets: Tangible Widgets versus Multi-Touch Controls on Mobile Devices
CapWidgets: Tangible Widgets versus Multi-Touch Controls on Mobile Devices Sven Kratz Mobile Interaction Lab University of Munich Amalienstr. 17, 80333 Munich Germany sven.kratz@ifi.lmu.de Michael Rohs
More information3D and Sequential Representations of Spatial Relationships among Photos
3D and Sequential Representations of Spatial Relationships among Photos Mahoro Anabuki Canon Development Americas, Inc. E15-349, 20 Ames Street Cambridge, MA 02139 USA mahoro@media.mit.edu Hiroshi Ishii
More informationWelcome to this course on «Natural Interactive Walking on Virtual Grounds»!
Welcome to this course on «Natural Interactive Walking on Virtual Grounds»! The speaker is Anatole Lécuyer, senior researcher at Inria, Rennes, France; More information about him at : http://people.rennes.inria.fr/anatole.lecuyer/
More informationOne Display for a Cockpit Interactive Solution: The Technology Challenges
One Display for a Cockpit Interactive Solution: The Technology Challenges A. Xalas, N. Sgouros, P. Kouros, J. Ellinas Department of Electronic Computer Systems, Technological Educational Institute of Piraeus,
More informationRunning an HCI Experiment in Multiple Parallel Universes
Running an HCI Experiment in Multiple Parallel Universes,, To cite this version:,,. Running an HCI Experiment in Multiple Parallel Universes. CHI 14 Extended Abstracts on Human Factors in Computing Systems.
More informationDiscrimination of Virtual Haptic Textures Rendered with Different Update Rates
Discrimination of Virtual Haptic Textures Rendered with Different Update Rates Seungmoon Choi and Hong Z. Tan Haptic Interface Research Laboratory Purdue University 465 Northwestern Avenue West Lafayette,
More informationDevelopment of Video Chat System Based on Space Sharing and Haptic Communication
Sensors and Materials, Vol. 30, No. 7 (2018) 1427 1435 MYU Tokyo 1427 S & M 1597 Development of Video Chat System Based on Space Sharing and Haptic Communication Takahiro Hayashi 1* and Keisuke Suzuki
More informationZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field
ZeroTouch: A Zero-Thickness Optical Multi-Touch Force Field Figure 1 Zero-thickness visual hull sensing with ZeroTouch. Copyright is held by the author/owner(s). CHI 2011, May 7 12, 2011, Vancouver, BC,
More informationmixed reality mixed reality & (tactile and) tangible interaction (tactile and) tangible interaction class housekeeping about me
Mixed Reality Tangible Interaction mixed reality (tactile and) mixed reality (tactile and) Jean-Marc Vezien Jean-Marc Vezien about me Assistant prof in Paris-Sud and co-head of masters contact: anastasia.bezerianos@lri.fr
More informationA Kinect-based 3D hand-gesture interface for 3D databases
A Kinect-based 3D hand-gesture interface for 3D databases Abstract. The use of natural interfaces improves significantly aspects related to human-computer interaction and consequently the productivity
More information3D User Interfaces. Using the Kinect and Beyond. John Murray. John Murray
Using the Kinect and Beyond // Center for Games and Playable Media // http://games.soe.ucsc.edu John Murray John Murray Expressive Title Here (Arial) Intelligence Studio Introduction to Interfaces User
More informationA Multi-Touch Enabled Steering Wheel Exploring the Design Space
A Multi-Touch Enabled Steering Wheel Exploring the Design Space Max Pfeiffer Tanja Döring Pervasive Computing and User Pervasive Computing and User Interface Engineering Group Interface Engineering Group
More informationJamming User Interfaces: Programmable Particle Stiffness and Sensing for Malleable and Shape-Changing Devices
Jamming User Interfaces: Programmable Particle Stiffness and Sensing for Malleable and Shape-Changing Devices Sean Follmer 1, Daniel Leithinger 1, Alex Olwal 1, Nadia Cheng 2, and Hiroshi Ishii 1 1 MIT
More informationExploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity
Exploring Passive Ambient Static Electric Field Sensing to Enhance Interaction Modalities Based on Body Motion and Activity Adiyan Mujibiya The University of Tokyo adiyan@acm.org http://lab.rekimoto.org/projects/mirage-exploring-interactionmodalities-using-off-body-static-electric-field-sensing/
More informationYu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp
Yu, W. and Brewster, S.A. (2003) Evaluation of multimodal graphs for blind people. Universal Access in the Information Society 2(2):pp. 105-124. http://eprints.gla.ac.uk/3273/ Glasgow eprints Service http://eprints.gla.ac.uk
More informationCollaborative Pseudo-Haptics: Two-User Stiffness Discrimination Based on Visual Feedback
Collaborative Pseudo-Haptics: Two-User Stiffness Discrimination Based on Visual Feedback Ferran Argelaguet Sanz, Takuya Sato, Thierry Duval, Yoshifumi Kitamura, Anatole Lécuyer To cite this version: Ferran
More informationTangible Sketching in 3D with Posey
Tangible Sketching in 3D with Posey Michael Philetus Weller CoDe Lab Carnegie Mellon University Pittsburgh, PA 15213 USA philetus@cmu.edu Mark D Gross COmputational DEsign Lab Carnegie Mellon University
More informationNon-Visual Menu Navigation: the Effect of an Audio-Tactile Display
http://dx.doi.org/10.14236/ewic/hci2014.25 Non-Visual Menu Navigation: the Effect of an Audio-Tactile Display Oussama Metatla, Fiore Martin, Tony Stockman, Nick Bryan-Kinns School of Electronic Engineering
More informationDESIGN FOR INTERACTION IN INSTRUMENTED ENVIRONMENTS. Lucia Terrenghi*
DESIGN FOR INTERACTION IN INSTRUMENTED ENVIRONMENTS Lucia Terrenghi* Abstract Embedding technologies into everyday life generates new contexts of mixed-reality. My research focuses on interaction techniques
More informationWhat was the first gestural interface?
stanford hci group / cs247 Human-Computer Interaction Design Studio What was the first gestural interface? 15 January 2013 http://cs247.stanford.edu Theremin Myron Krueger 1 Myron Krueger There were things
More informationAir-filled type Immersive Projection Display
Air-filled type Immersive Projection Display Wataru HASHIMOTO Faculty of Information Science and Technology, Osaka Institute of Technology, 1-79-1, Kitayama, Hirakata, Osaka 573-0196, Japan whashimo@is.oit.ac.jp
More informationHELPING THE DESIGN OF MIXED SYSTEMS
HELPING THE DESIGN OF MIXED SYSTEMS Céline Coutrix Grenoble Informatics Laboratory (LIG) University of Grenoble 1, France Abstract Several interaction paradigms are considered in pervasive computing environments.
More informationLightBeam: Nomadic Pico Projector Interaction with Real World Objects
LightBeam: Nomadic Pico Projector Interaction with Real World Objects Jochen Huber Technische Universität Darmstadt Hochschulstraße 10 64289 Darmstadt, Germany jhuber@tk.informatik.tudarmstadt.de Jürgen
More information