The Hapticon Editor: A Tool in Support of Haptic Communication Research

Mario J. Enriquez and Karon E. MacLean
Department of Computer Science, University of British Columbia
enriquez@cs.ubc.ca, maclean@cs.ubc.ca

Abstract

We define haptic icons, or hapticons, as brief programmed forces applied to a user through a haptic interface, with the role of communicating a simple idea in a manner similar to visual or auditory icons. In this paper we present the design and implementation of an innovative software tool and graphical interface for the creation and editing of hapticons. The tool's features include various methods for creating new icons, including direct recording of manual trajectories and creation from a choice of basis waveforms; novel direct-manipulation icon editing mechanisms; integrated playback; and convenient storage of icons to file. We discuss some ways in which the tool has aided our research in the area of haptic iconography, and present an innovative approach for generating and rendering simple textures on a low degree of freedom haptic device using what we call terrain display.

1. Introduction

Visual and auditory icons have long been integral to computer interfaces, as a means of indicating functionality, location and other low-dimensional information more efficiently than displayed text can [1,2]. Graphic icons, for example, are small and concise graphic representations of real or abstract objects. These icons should be easily identifiable by the user and can represent a spectrum of information, ranging from specific functions to abstract controls.

In everyday interaction with manual controls such as those found in a car, on a workbench or throughout a building, we use parameters such as shape, texture and muscle memory to identify and locate different functions and states of handles ranging from doorknobs to pencils and radio controls. With the introduction of active haptic interfaces, a single handle - e.g. a knob or a joystick - can control several different and perhaps unrelated functions. These multi-function controllers can no longer be differentiated from one another by position, shape or texture differences, and it becomes a design challenge to make both the existence of available functions and their identity apparent to the user. Active haptic icons, or hapticons, may be able to solve this problem by rendering haptically distinct and meaningful sensations for the different functions.

A systematic approach to hapticon design requires tools that allow people without an engineering background to participate closely in the creative process, thus broadening and enriching the area. The Hapticon Editor, with its simple, efficient approach, is such a tool.

2. Related Work

2.1. Icon Design

There has been a great deal of work relating to the design of auditory and visual icons. The auditory and haptic iconic design spaces share many key attributes: both are temporally sequential, and human perception has narrow limits for amplitude and period discrimination in each. Thus, in our hapticon research program, we have found it most productive to follow the example of auditory icon design. There have been two principal approaches to using sound to iconify information.

Gaver et al. [3,4] studied Auditory Icons. These are essentially representations of objects or notions that embody a literal, direct meaning: for example, using the sound of paper being crushed to indicate deleting a computer file.
Most of us are familiar with both the sound of crumpling paper and the action of deleting a file, and can easily make the association. While this is an intuitive approach, it does not address whether users can differentiate the icons or how many icons can be distinguished.

Brewster et al. [5,6] took a different approach. Earcons are sounds and rhythms with no innate meaning: their target or meaning must be learned. Brewster's studies have focused on understanding and quantifying the different Earcons that can be perceptually differentiated by users, which sounds are most perceptually salient, and whether certain sounds are appropriate for a given application. For example, a quiet sound might literally represent a very urgent or dangerous event because that event does not generate much sound in the real world; in a different application, however, this might be an inappropriate representation.

As an example of a compromise between the two general approaches, the Microsoft Office standard toolbar uses graphical icons in a way that approximates the literal design criterion: some of the icons in the toolbar are easily identifiable by most people, yet others must be learned.

Our chosen approach to hapticon design is conceptually similar to Brewster's in its first stage: we experimentally determine where icons should lie in a perceptual sense. For this work, we need to be able to systematically create and test haptic icons; this need gave rise to the work presented in this paper.

2.2. Haptic Trajectory Acquisition

There has also been prior work in the area of recording haptic trajectories, a key method of input for the tool described here. The purpose of MacLean's Haptic Camera [8] was to systematically obtain input haptic trajectories for later reproduction. Her system could obtain an approximate model of a real object's haptic response and play it back. However, the Haptic Camera collected input from passive devices rather than from a human hand, and the force model obtained for the device could be edited only parametrically. Frei [7] designed a mechanical device that could record trajectories and play them back. His goal was to create motions that are entertaining and pleasing when combined and repeated, not to facilitate editing of the created trajectories; however, his means of input provided some of the inspiration for our work.

3. The Hardware Setup

For our ongoing study of haptic icons, we are using a single degree of freedom (DOF) haptic display, configured as a knob (Figure 1). The low-DOF interface is appropriate since we anticipate that haptic icons will be most useful in simple, embedded interfaces rather than in high-end desktop systems. The haptic forces are displayed on the knob by a direct-drive DC motor with an optical encoder for feedback. This research setup employs a closed-loop controller situated on a PC and communicating with the hardware through an I/O board.

Figure 1. The Haptic Display.

4. Basics of Operation

The Hapticon Editor works by managing and storing representations of haptic forces in files, which allows us to treat haptic sensations like any other digital media. All operations performed with the Hapticon Editor affect the haptic icon file currently being edited. Each icon is stored in a separate file, and an icon is activated by selecting it from the file list in the top left part of the main screen (Figure 2). When a new file is created, by either direct recording or superposition of basic waveforms, it is added to the file list. The user can perform any of the functions represented at the bottom of the main screen on the active icon, including Play in Time, Play in Space, Record New, Edit, Create New and Add Icons. The area of the screen showing the sine waveform in Figure 2 presents a graphical representation of the haptic icon being edited. The buttons (graphical icons) at the bottom of the screen represent the available functions for creating and editing an opened haptic icon file, and are explained in detail in the following sections.
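The paper does not document the icon file format itself. As a minimal sketch of this one-icon-per-file scheme, an icon can be modeled as a sampled trajectory with a fixed sample rate; the class and function names (HapticIcon, save_icon, load_icon), the Python realization and the plain-text format are all hypothetical. The later sketches in this paper reuse this HapticIcon type.

from dataclasses import dataclass, field
from typing import List

@dataclass
class HapticIcon:
    sample_rate_hz: float                               # playback samples per second (assumed)
    samples: List[float] = field(default_factory=list)  # recorded positions or waveform amplitudes

    def duration_ms(self) -> float:
        return 1000.0 * len(self.samples) / self.sample_rate_hz

def save_icon(icon: HapticIcon, path: str) -> None:
    """Store one icon per file, as the editor does (on-disk format hypothetical)."""
    with open(path, "w") as f:
        f.write(f"{icon.sample_rate_hz}\n")
        f.writelines(f"{s}\n" for s in icon.samples)

def load_icon(path: str) -> HapticIcon:
    with open(path) as f:
        rate = float(f.readline())
        samples = [float(line) for line in f]
    return HapticIcon(rate, samples)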
5. Hapticon Editor Functions

5.1. Creating a New Icon

The Hapticon Editor allows you to create a new haptic icon in two ways:

- Direct recording of the user's motions of the haptic knob (5.1.1)
- Creation by addition of simple waveforms (5.1.2)

5.1.1. Direct Motion Recording (Record New button)

This function allows the user to directly store the knob's motions. It records the movements for a specified duration and stores them as positional information in a file for later reproduction or editing.
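As a rough sketch of such a recording loop, reusing the hypothetical HapticIcon above: sample the knob's encoder at a fixed rate for the requested duration. Here read_knob_position stands in for the encoder read through the I/O board, and the 1 kHz rate is an assumption; neither detail is published in the paper.

import time

def record_icon(read_knob_position, duration_ms: float,
                sample_rate_hz: float = 1000.0) -> HapticIcon:
    """Record the user's knob motions as positional samples.
    read_knob_position is a placeholder for the encoder read."""
    icon = HapticIcon(sample_rate_hz)
    n_samples = int(duration_ms / 1000.0 * sample_rate_hz)
    period = 1.0 / sample_rate_hz
    for _ in range(n_samples):
        icon.samples.append(read_knob_position())
        time.sleep(period)  # crude pacing; a real controller would use a hardware timer
    return icon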

Figure 2. The Hapticon Editor Main Screen.

5.1.2. Creation of Icons from Simple Waveforms (Create New button)

This function allows the user to create a new haptic icon from scratch. The process begins by choosing and appending simple waveforms to create a haptic icon file that can later be displayed through the haptic knob; the icon data is stored in a new file when complete. When activated, the New Icon Screen is displayed (Figure 3).

Figure 3 shows the haptic icon creator screen. The functions in this screen allow the user to generate a new haptic icon by building it from simple waveforms. The total duration of the haptic icon can be specified in milliseconds in the space provided. The user can create the haptic icon using one or more of the given basis functions; each function is appended one after another, and the length, frequency and amplitude of each waveform to be appended can be specified. The graph shown in Figure 3 was created by concatenating seven simple waveforms of varying amplitudes, frequencies and durations. This file has a total duration of 10 seconds and will be stored with the name <NewFile>.

Figure 3. The Hapticon Creator Screen.
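A minimal sketch of this concatenation, again using the hypothetical HapticIcon: each segment contributes its own length, frequency and amplitude and is appended in order. The paper does not list the available basis waveforms, so sine and sawtooth are assumptions, and the seven-segment example below only loosely mirrors Figure 3.

import math

def append_waveform(icon: HapticIcon, shape: str, length_ms: float,
                    freq_hz: float, amplitude: float) -> None:
    """Append one basis waveform to an icon, mirroring the Create New
    screen's length/frequency/amplitude fields (waveform set assumed)."""
    n = int(length_ms / 1000.0 * icon.sample_rate_hz)
    for i in range(n):
        t = i / icon.sample_rate_hz
        if shape == "sine":
            value = amplitude * math.sin(2.0 * math.pi * freq_hz * t)
        elif shape == "sawtooth":
            value = amplitude * (2.0 * ((t * freq_hz) % 1.0) - 1.0)
        else:
            raise ValueError(f"unknown waveform: {shape}")
        icon.samples.append(value)

# Seven segments totalling 10 s, loosely like the example in Figure 3:
icon = HapticIcon(sample_rate_hz=1000.0)
for amp, freq, ms in [(1.0, 2, 2000), (0.5, 8, 1000), (0.8, 4, 1500),
                      (0.3, 16, 1000), (1.0, 1, 2000), (0.6, 6, 1500),
                      (0.4, 10, 1000)]:
    append_waveform(icon, "sine", ms, freq, amp)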

5.2. Editing Functions

The Hapticon Editor allows you to edit haptic icons in several different ways:

- Adjust Amplitude function (5.2.1)
- Graphic editing of the hapticon (5.2.2)
- Hapticon superposing (5.2.3)

5.2.1. Adjust Amplitude Function

This utility allows the user to increase or decrease the overall amplitude of a haptic icon by specifying a multiplicative scale factor. This is useful when the overall feel of the haptic icon is too weak or too strong but you wish to maintain its overall character.

5.2.2. Haptic Icon Graphic Editor Function (Edit button)

Once a haptic information file has been created, the user can graphically edit the icon using simple mouse commands. Figure 4 shows the graphic editor screen, which presents the haptic icon function as a series of connected dots. Using the mouse, the user can select one or more of these dots and, by moving them, modify the shape of the haptic icon. The selected dots can be moved up or down, or set to center, using the mouse and the editing functions on the lower right of the editor screen. When more than two dots are selected, moving the mouse up or down moves the selected dots in a parabolic shape, with the center dot being moved the most.

Figure 4. The Hapticon Editor Screen.

5.2.3. Add Icons Function

This utility allows the user to generate new icons by superposing existing icons; combining several simple icons can generate a more complex icon. Figure 6 shows a capture of the haptic icon adder screen, where an icon is being generated by superimposing a low frequency sine waveform on a high frequency sine waveform. The resulting waveform can be stored under a name specified by the user. This functionality gives the user a richer palette for creating more complex functions to be used as haptic icons.

Figure 6. Icon Adder Screen.
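A minimal sketch of these three editing functions, operating on the hypothetical HapticIcon from Section 4. The parabolic weighting used by the editor is not published; a curve that is 1 at the centre of the selection and 0 at its edges is assumed, as is zero-padding when superposed icons differ in length.

def scale_amplitude(icon: HapticIcon, factor: float) -> None:
    """Adjust Amplitude (5.2.1): multiply every sample by a scale factor."""
    icon.samples = [s * factor for s in icon.samples]

def move_selection(icon: HapticIcon, first: int, last: int, delta: float) -> None:
    """Graphic editing (5.2.2): displace the selected dots parabolically,
    moving the center-most dot the most (weighting curve assumed)."""
    n = last - first + 1
    for i in range(n):
        u = (2.0 * i / (n - 1) - 1.0) if n > 1 else 0.0   # -1..1 across the selection
        icon.samples[first + i] += delta * (1.0 - u * u)  # parabola: 1 at center, 0 at edges

def add_icons(a: HapticIcon, b: HapticIcon) -> HapticIcon:
    """Add Icons (5.2.3): superpose two icons sample by sample,
    zero-padding the shorter one (padding behavior assumed)."""
    assert a.sample_rate_hz == b.sample_rate_hz
    out = HapticIcon(a.sample_rate_hz)
    for i in range(max(len(a.samples), len(b.samples))):
        sa = a.samples[i] if i < len(a.samples) else 0.0
        sb = b.samples[i] if i < len(b.samples) else 0.0
        out.samples.append(sa + sb)
    return out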

5.3. Playback Functions

Creation of haptic icons is a highly iterative process, so it was critical for our tool to have integrated, easy-to-use playback functionality. Once a haptic information file has been created, there are two modes for displaying it:

- Playback of the haptic icon as a function of time (5.3.1)
- Playback of the haptic icon as a function of knob position (5.3.2)

5.3.1. Play in Time Function

This utility displays the previously created or edited icon through the haptic display as a function of time. The icon is displayed by moving the knob to follow the positions indicated by the stored function over a specific time: Play in Time renders the data in the file as forces that vary through time, generating motions on the knob that follow the displayed graph from left to right. When a haptic icon was created through direct motion recording, the knob mimics the previously stored motions; when the icon was created from simple waveforms, the knob follows the motions specified by the contours of the waveform as a function of time.

This playback method produces what we call passive haptic icons: the user merely holds the knob and feels the forces expressed through it, with no need for any exploratory motion to perceive the icon. As the haptic icon is being displayed, a small red dot superimposed on the graph shows which part of it is being displayed on the knob at that moment.

The Playback Speed slider control, located on the main screen of the Hapticon Editor (Figure 2), allows adjustment of the playback speed for the haptic icons. This lets the user record a haptic icon at a slow pace and then play it back at a rate faster than it could be manually input. It can also be used to slow down the reproduction of an icon to obtain a different sensation than the original recording provided.

5.3.2. Play in Space Function

This utility presents the previously created or edited icon through the haptic display as a function of knob position. The user can actively explore the haptic icon, receiving feedback forces that depend on the function being displayed and the position of the knob.

Figure 5. The Play in Space Function Operation.

The icon being presented can be explored by rotating the haptic knob, producing force feedback proportional to the slope of the function at the current knob position. This gives the user the sensation of exploring a one-dimensional topographic map: the knob reproduces the forces that would be felt when pushing a rolling object over the terrain of the displayed function. This allows testing of simple haptic textures that can be easily generated with this program. The main screen displays a superimposed red dot on the graphic representation of the hapticon to let the user see which part of the function is being displayed at that specific knob position.
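As a minimal sketch of the two playback modes, again with hypothetical names: Play in Time steps through the samples at a rate scaled by the speed slider, while Play in Space treats the samples as terrain and returns a force opposing the local slope. Here set_knob_target stands in for the position-control command, and the gain k and the mapping of knob travel onto the samples are assumptions; in a real servo loop the force function would be evaluated at every encoder update.

import time

def play_in_time(icon: HapticIcon, set_knob_target, speed: float = 1.0) -> None:
    """Play in Time (5.3.1): drive the knob through the stored positions.
    speed > 1 replays faster than recorded (the Playback Speed slider)."""
    period = 1.0 / (icon.sample_rate_hz * speed)
    for target in icon.samples:
        set_knob_target(target)
        time.sleep(period)

def play_in_space_force(icon: HapticIcon, knob_angle: float,
                        angle_span: float, k: float = 1.0) -> float:
    """Play in Space, or 'terrain display' (5.3.2): return a force
    proportional to the local slope at the current knob angle, like
    pushing a rolling object over the terrain (gain and mapping assumed)."""
    x = knob_angle / angle_span * (len(icon.samples) - 1)  # fractional sample index
    i = max(0, min(int(x), len(icon.samples) - 2))
    slope = icon.samples[i + 1] - icon.samples[i]          # rise per sample
    return -k * slope                                      # oppose uphill motion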

6. Using the Hapticon Editor

The Hapticon Editor has been an invaluable aid in designing and testing a collection of abstract haptic icons that are currently being used to conduct several psychophysical experiments in our lab. Amongst these projects is one that uses an interesting new approach to testing for perceptual differences amongst hapticons [9]: we study the perception of simple haptic icons using an exploratory mathematical procedure called multidimensional scaling (MDS).

7. Conclusions and Future Work

We have presented a simple tool for creating, editing, storing and displaying haptic icons. This tool has been used to aid our ongoing research on haptic icons for low degree of freedom haptic displays, and has proven an invaluable aid for the design and testing of haptic icons. Our future goals include extending the tool to collect initial trajectory bases from sources other than a human hand - e.g. recording existing controls, switches and rotational textures and later modifying them - thus integrating the benefits of MacLean's Haptic Camera with the current tool. This approach will provide a greater diversity and range of motions and forces than is available from manual input alone.

8. References

[1] Huggins, W. H. & Entwisle, D. R. (1974). Iconic Communication: An Annotated Bibliography. The Johns Hopkins University Press.
[2] Yazdani, M. & Goring, D. (1990). Iconic Communication. Dept. of Computer Science, Exeter University.
[3] Gaver, W. W. (1988). "Everyday Listening and Auditory Icons." Ph.D. Dissertation, University of California, San Diego.
[4] Gaver, W. W. (1989). "The SonicFinder: An Interface that Uses Auditory Icons." Human-Computer Interaction, 4(1).
[5] Brewster, S. A., Wright, P. C. & Edwards, A. D. N. (1992). A detailed investigation into the effectiveness of earcons. In G. Kramer (Ed.), Auditory Display, Sonification, Audification and Auditory Interfaces: Proceedings of the First International Conference on Auditory Display, Santa Fe Institute, Santa Fe: Addison-Wesley, pp. 471-498.
[6] Brewster, S. A., Wright, P. C. & Edwards, A. D. N. (1993). An evaluation of earcons for use in auditory human-computer interfaces. In S. Ashlund, K. Mullet, A. Henderson, E. Hollnagel & T. White (Eds.), Proceedings of InterCHI '93, Amsterdam: ACM Press, Addison-Wesley, pp. 222-227.
[7] Frei, P., Su, V., et al. (2000). Curlybot: Designing a New Class of Computational Toys. Conference on Human Factors in Computing Systems (CHI 2000).
[8] MacLean, K. E. (1996). The Haptic Camera: A Technique for Characterizing and Playing Back Haptic Environments. 5th Annual Symposium on Haptic Interfaces for Virtual Environments and Teleoperator Systems, ASME/IMECE, Atlanta, GA.
[9] MacLean, K., Enriquez, M. & Di Lollo, V. The Perceptual Design of Haptic Icons. In review.