Feelable User Interfaces: An Exploration of Non-Visual Tangible User Interfaces

Katrin Wolf, Telekom Innovation Laboratories, TU Berlin, Germany (katrin.wolf@acm.org)
Peter Bennett, Interaction and Graphics Group, University of Bristol, UK (peter.bennett@bristol.ac.uk)

Copyright is held by the author/owner(s). TEI 2013, February 10-13, 2013, Barcelona, Spain. ACM.

Abstract
Tangible User Interfaces (TUIs) allow the representation of digital information via a number of sensory modalities, including the haptic, visual and auditory senses. In this paper we suggest that the visual component of many TUIs dominates the physical, to the detriment of the quality of the physical interaction. To investigate the possibilities of interacting with a less visually biased TUI, we explore the extreme case of an almost entirely non-visual interface. We present an exploratory design of a Feelable User Interface (FUI) that allows the physical manipulation of an object over a textured surface while visually hiding both object and texture. This initial test investigates basic interaction with a FUI; the further aim is to guide the design of a FUI that allows digital control of physical surface texture. In developing FUIs, our aim is to open up a novel design space for new TUIs, based on nuanced haptic interactions and a decreased reliance on the visual representation of information.

Figure 1. Modified TUI interaction model with visual and physical (tangible) representation.

Keywords
Tangible User Interface; Haptic; Non-Visual; Texture.

ACM Classification Keywords
H5.2 [Information interfaces and presentation]: User Interfaces - Input devices and strategies.

Introduction
There have been many explorations of interfaces [3, 6, 11] that initiated or followed the paradigm of tangible interaction [6], but so far none have been designed to deliberately hide the tangible object from the user's view. Three potential advantages of non-visual, or blinded, tangible interfaces motivate this work:

1. Modality coupling. Many tangible interfaces involve grasping and rearranging objects [3, 7, 11]. From cognitive science we know that grasping an object involves the visual and tactile modalities [2, 4], and as long as vision is available it is the dominant modality [9]. As soon as vision is removed, however, proprioception and haptic perception gain more weight in human perception [8]. Unlike vision, which may be absent while grasping an object, the sense of touch is always engaged. This is a strong argument that haptics is the modality most naturally coupled to tangible interaction. Touch [5] and gesture [10] performance on differently shaped surfaces has been investigated, but surprisingly we found no research on the perception of differently structured surfaces during interaction, nor on interaction with non-visual tangible interfaces, apart from work addressing the needs of visually impaired users. We see a research gap in the tactile perception of different surface types, which we aim to address in this project.

2. In-body perception. Tangible user interfaces do not commonly combine active (modulated) haptic feedback with the passive haptic feedback gained from object manipulation. TeslaTouch [1] provides haptic output through electric stimulation while a finger slides over a touch screen: the actuation varies in frequency and strength, stimulating the skin's receptors to simulate touching different surface structures as known from everyday experience (a sketch of such a drive signal follows this list). Our approach aims to explore human sensitivity to surfaces without any visual support; we want to understand, from the human perspective, the richness of in-body perception while moving an object across a surface.

3. Design space. This investigation of non-visual tangible interfaces aims to map out and expand a lesser-explored area of the overall tangible user interface design space. In doing so we hope to gain insights that can be applied to the design of tangible user interfaces in general. We conclude this paper with some interface ideas that take advantage of our findings and inspire further investigation.
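
To make the kind of actuation described in point 2 concrete, here is a minimal sketch of a texture-to-signal mapping. It is our illustration, not code from [1]; the texture table and the render_drive_signal function are invented for this example. It models one property such displays exploit: sliding faster over the same spatial texture raises the temporal frequency of the stimulus, just as with real surfaces.

```python
import numpy as np

# Illustrative virtual textures: (spatial period in mm, drive amplitude 0..1).
# A shorter period means a finer texture; amplitude 0 means a smooth surface.
TEXTURES = {
    "smooth": (1.0, 0.0),
    "fine":   (0.5, 0.4),
    "coarse": (4.0, 0.8),
}

def render_drive_signal(texture, speed_mm_s, duration_s=0.05, sample_rate=8000):
    """Return one buffer of an electrovibration-style drive waveform."""
    period_mm, amplitude = TEXTURES[texture]
    temporal_freq = speed_mm_s / period_mm          # texture cycles per second
    t = np.arange(0, duration_s, 1.0 / sample_rate)
    return amplitude * np.sin(2.0 * np.pi * temporal_freq * t)

# Example: a finger sliding at 50 mm/s over the coarse texture (12.5 Hz ripple).
buffer = render_drive_signal("coarse", speed_mm_s=50.0)
```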

Tangible User Interfaces
Our project is inspired by Ullmer and Ishii's frameworks for Tangible User Interfaces (TUIs) [6]. The TUI model is meant to shift the modality weight away from the culturally predominant visual paradigm of human-computer interaction onto physicality, by using physical instead of graphical objects to represent and control digital content (see Fig. 1). Yet even though the physical objects of a TUI are controlled by physically readjusting and moving them, the perception of these objects or tokens is still dominated by their visual representation rather than their physical characteristics. For instance, the reactable tokens [3] that represent the same class of objects are shaped identically, and even objects of different classes differ only slightly in their physical design; to distinguish the objects, icons and colors are placed on top of them to allow visual recognition. In the Urp interface [7] the physical objects carry no labels or colors and are simply model buildings of different shapes, but these forms are also perceived visually and so can be read without hands-on tangible exploration. In the Slurp project [11] it is not an object but a function (moving content) that is represented physically: a pipette stands for a physical container that stores digital content while moving it from one computer to another. In all these projects, despite physicality being the investigated aspect, vision still plays the dominant role: it displays the information linked to the objects, distinguishes the objects, reveals an object's position and its relation to other objects, and creates the appearance and aesthetics of the interface. In general, a tangible interface is first perceived and explored visually before tangible exploration takes place.

Blindfolded TUIs: Feelable User Interfaces

Figure 2. FUI interaction model with physical (feelable) but no visual representation.

We believe that the characteristics of TUIs can be investigated more precisely if the physicality of the interface and its components, such as objects and system feedback, is not represented visually at all and therefore cannot be seen. In this project we develop a Feelable User Interface (FUI), which we understand as a subset of Tangible User Interfaces (see Fig. 2), to investigate the isolated tangible aspects of tangible interfaces in a more controlled setting by avoiding visual representation altogether. Instead of using the visual appearance of physical objects, we explore attributes that physicality provides for free, that is, attributes naturally embodied in physical objects, such as weight and friction, but also the embodied interaction rules that come with physicality, such as gravity and the fact that one position can only ever be occupied by one physical object.

Moreover, we want to take advantage of natural physical feedback. For instance, a moving object generates a specific sound corresponding to its material and the material of the surface it is moved across, and characteristics of the movement, such as speed, direction, collision or bouncing, can be distinguished by sound alone (see the sketch below). This information, a free by-product of natural object movement, may help users control digital objects by providing the incidental feedback known from handling real physical objects.
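
As an aside on how much movement information such incidental sound carries, the following minimal sketch flags collision-like bursts in a recording of an object being moved. It is our illustration only, not part of the paper's apparatus, and the frame size and threshold are arbitrary assumptions: collisions stand out as short energy spikes, while sustained sliding noise raises the energy floor roughly with movement speed.

```python
import numpy as np

def detect_collisions(audio, sample_rate=44100, frame_ms=10, threshold=4.0):
    """Return onset times (ms) of frames whose short-time energy jumps
    well above the running sliding-noise level."""
    frame = int(sample_rate * frame_ms / 1000)
    n = len(audio) // frame
    energy = np.array([np.sum(audio[i * frame:(i + 1) * frame] ** 2)
                       for i in range(n)])
    floor = np.median(energy) + 1e-12   # typical sliding/background level
    return np.where(energy > threshold * floor)[0] * frame_ms
```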
In contrast to this conceptual matching of feedback between digital and physical interaction, there are also conceptual mismatches between digital and physical object control. For instance, digital object control offers the option to cut an object in one place and paste it in another without any movement in between, and an object can be reproduced from a single existing example, usually through copy and paste actions. Moreover, deleting a digital object leaves no waste behind, which is impossible with physical objects. Furthermore, scaling and editing objects is, from our perspective, a design challenge for tangible and feelable user interfaces.

From our perspective, tangible user interfaces have so far exploited physical advantages to design digital interfaces that benefit from the embodied knowledge of handling objects gained from interacting with real things. To the authors' knowledge, no tangible interfaces have been developed that explore how to apply digital control actions that do not exist in the physical world, such as cut, copy, paste and delete, to physical objects. In this project we want to find ways to represent these digital actions physically and tangibly, keeping as close as possible to the TUI concept by avoiding the use of visual material.

Figure 3. Surfaces for exploring haptic experience while dragging an object across them: (1) no structure, (2) linear structure, (3) small dot pattern, (4) large dot pattern.

Figure 4. Experimental set-up showing (a) magnet, (b) ball bearing and (c) textures.

Exploration
Our goal is to explore the tactile perception of textured surfaces across which tangible objects are moved. In this initial exploration we want to gain a fundamental understanding of how users perceive different surface structures, and we ask two research questions:

1. How are different surface structures (smooth, rough, linear, with cut-out elements; see Fig. 3) perceived when a tangible object is moved across them?

2. Is the structure attributed to the object or to the surface the object is moved across?

We chose these four surface structures (ref. question 1 and see Fig. 3) because they are widespread in physical object design, and tangible interaction usually refers back to everyday physical experience. Question 2 is inspired by the opportunity to fake perception and simulate physical stimuli, as has been achieved previously [1].

Set-up
Our apparatus allows a single object to be dragged across different surface structures without any visual cue, by adding a physical layer between the touch and the object. The object, a ball bearing, stands in for a physical mouse pointer. The actual ball movement is hidden inside a black-box set-up (see Fig. 4), and the surface changes structure where it touches the ball. All surfaces are made of the same material (Plexiglas), both so that material differences cannot affect surface perception and because Plexiglas is easy to restructure by laser cutting. We chose a magnet to transfer the kinetic energy of the user's hand to the object because this decoupled layout allows perception to be faked in further work: driving an electromagnet with a varying force may create the illusion of changing friction or object weight, and rapidly changing forces might fake the illusion of surface structures such as those we produced physically (see Fig. 3), allowing digitally mediated analogue interactions (the sketch below illustrates this idea). Our approach is motivated by the belief that mixing the computer-controlled analogue world with the digital one offers the possibility of creating physical illusions that are difficult to achieve physically, such as changing the surface or size of an object.
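
The electromagnet is proposed only as future work; as a sketch of what "rapidly changing forces" could look like in software, the following toy mapping modulates an electromagnet's PWM duty cycle with the ball's position over a virtual texture. This is our illustration under stated assumptions: the texture table, BASE_DUTY and the position-to-duty mapping are all invented for this example.

```python
import math

# Hypothetical virtual textures: (spatial period in mm, modulation depth 0..1)
# of the attraction-force ripple the electromagnet superimposes.
VIRTUAL_TEXTURES = {
    "no_structure": (None, 0.0),
    "linear":       (4.0,  0.5),
    "small_dots":   (2.0,  0.3),
    "large_dots":   (8.0,  0.7),
}

BASE_DUTY = 0.6  # holding force that keeps the ball coupled to the magnet

def magnet_duty(texture, position_mm):
    """PWM duty cycle for the electromagnet at the given ball position.

    A spatially periodic force ripple should feel like bumps or friction
    changes as the hand drags the hidden ball across the virtual texture.
    """
    period, depth = VIRTUAL_TEXTURES[texture]
    if period is None:
        return BASE_DUTY
    ripple = math.sin(2.0 * math.pi * position_mm / period)
    return min(max(BASE_DUTY * (1.0 + depth * ripple), 0.0), 1.0)

# Example: sample the drive profile while dragging 0..20 mm over "large_dots".
profile = [magnet_duty("large_dots", x) for x in range(21)]
```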

Conclusion and Future Work
This paper presents a first step toward investigating whether changing surface texture affects perception when dragging an object across it, and where users locate the perceived change: at the ball or at the Plexiglas surface the ball moves across. We plan a user study to investigate these two questions in a controlled set-up, in which participants will be asked to solve dragging tasks and afterwards fill in questionnaires about their perception. Our hypothesis is that users can distinguish all four surface types. Furthermore, we ask whether participants experience the texture as coming from underneath the magnet or from the unseen ball bearing below. Future work after this initial study will involve actuating the textured surfaces, allowing computational control of texture positions, spacing and patterns.

References
[1] Bau, O., Poupyrev, I., Israr, A., and Harrison, C. 2010. TeslaTouch: Electrovibration for touch surfaces. In Proc. UIST 2010, 283-292.
[2] Held, R. 2009. Visual-haptic mapping and the origin of cross-modal identity. Optometry and Vision Science 86(6), 595-598.
[3] Jordà, S., Kaltenbrunner, M., Geiger, G., and Bencina, R. 2005. The reactable. In Proc. ICMC 2005.
[4] Newell, F.N., Ernst, M.O., Tjan, B.S., and Bülthoff, H.H. 2001. Viewpoint dependence in visual and haptic object recognition. Psychological Science 12(1), 37-42.
[5] Roudaut, A., Pohl, H., and Baudisch, P. 2011. Touch input on curved surfaces. In Proc. CHI 2011, 1011-1020.
[6] Ullmer, B. and Ishii, H. 2001. Emerging frameworks for tangible user interfaces. In Human-Computer Interaction in the New Millennium, 579-601.
[7] Underkoffler, J. and Ishii, H. 1999. Urp: A luminous-tangible workbench for urban planning and design. In Proc. CHI 1999, 386-393.
[8] van Beers, R.J., Wolpert, D.M., and Haggard, P. 2002. When feeling is more important than seeing in sensorimotor adaptation. Current Biology 12, 834-837.
[9] Welch, R.B. and Warren, D.H. 1986. Intersensory interactions. In Handbook of Perception and Human Performance, Vol. 1, 25.1-25.36.
[10] Wolf, K., Schleicher, R., Kratz, S., and Rohs, M. 2013. Tickle: A surface-independent interaction technique for grasp interfaces. In Proc. TEI 2013.
[11] Zigelbaum, J., Kumpf, A., Vazquez, A., and Ishii, H. 2008. Slurp: Tangibility, spatiality, and an eyedropper. In Proc. CHI 2008, 2565-2574.