CheekTouch: An Affective Interaction Technique while Speaking on the Mobile Phone


Young-Woo Park, Department of Industrial Design, KAIST, Daejeon, Korea, pyw@kaist.ac.kr
Chang-Young Lim, Graduate School of Culture Technology, KAIST, Daejeon, Korea, cylim@kaist.ac.kr
Tek-Jin Nam, Department of Industrial Design, KAIST, Daejeon, Korea, tjnam@kaist.ac.kr

Abstract
We present a new affective interaction technique, called CheekTouch, that combines tactile feedback delivered through the cheek with multi-finger input while speaking on the mobile phone. We built a prototype using a multi-touch mobile device and a 4x3 vibrotactile display. We identified six affective touch behaviors (pinching, stroking, patting, slapping, kissing and tickling) that can be exchanged through one another's cheeks while speaking on the phone, and mapped them onto tactile feedback expressions of the vibrotactile display. Results of a preliminary user study suggest that participants evaluated the technique positively and that it is applicable to intimate and emotional communication.

Keywords
Affective interaction, emotion and affective user interface, cheek-based interaction, vibrotactile feedback, mediated touch, multi-touch, mobile phone interface.

ACM Classification Keywords
H5.2. [Information interfaces and presentation (e.g., HCI)]: User Interfaces; Input devices and strategies; Haptic I/O.

General Terms
Design

Copyright is held by the author/owner(s). CHI 2010, April 10-15, 2010, Atlanta, Georgia, USA. ACM 978-1-60558-930-5/10/04.

Introduction
In designing mobile phones, it has become critical to consider affective interaction [9]. The mobile phone has become an emotional and social medium: people use mobile phones more for informal and intimate purposes than for work or formal communication. Touch, meanwhile, is a powerful enabler of affective interaction between humans and is considered the most fundamental and primitive form of non-verbal communication. There is growing interest in incorporating touch into communication devices. However, as our daily social interaction is increasingly mediated by mobile communication devices, it has become difficult to support affective interaction through touch when people are physically separated.

It is important to consider an affective interaction method that maintains natural voice communication. Voice is still the most frequently used communication channel, ahead of video and text. Yet little has been studied on ways to integrate touch as a means of affective interaction during voice communication. It is challenging because people cannot easily change modes of interaction or use other channels in parallel while focusing on voice communication.

The goal of this research is to propose and evaluate a new affective interaction technique that supports emotional and intimate communication while speaking on the mobile phone. The new technique, called CheekTouch, uses the cheek as a medium of interaction: tactile feedback is delivered on the cheek while people use multiple fingers as input, all while holding the mobile phone naturally.

Related Work
Mediated touch for mobile phone communication
Haans et al. describe mediated social touch, which allows one actor to touch another over a distance by means of tactile feedback technology [10]. ComTouch is a vibrotactile sleeve that fits over the back of the mobile phone and transfers finger pressure into vibration [7]. This approach can be effective for presenting tactile information in mobile communication, but the pressure of a single finger is quite limited for expressing dynamic affective information. Brown's research used Tactons [4] for presenting multi-dimensional information in mobile phone alerts [5]. However, delivering information through the tactile channel alone can result in an abstract and incomprehensible language.

Mediated touch for emotional interaction
DiSalvo et al. presented the Hug [8], a soft cushion: by stroking or squeezing the device, a user can physically deliver emotional states to another user. LumiTouch [6] enables emotional communication through a digital picture frame; when one user touches her picture frame, the other frame lights up. These studies show designs that express emotions with various touch behaviors, and they also highlight that everyday objects are useful media for emotional communication.

Technologies of tactile stimulation
The most widespread tactile technology used in HCI is the offset motor used to generate vibrotactile stimuli. There have also been attempts to deliver tactile stimulation similar to human-human touch. Li et al. presented tapping and rubbing: by moving actuators perpendicular and parallel to the user's skin, their prototype can tap and rub the user [12].

Bau et al. presented BubbleWrap, which uses a matrix of electromagnetic actuators to provide different types of haptic sensations [3]. However, these approaches require considerable weight and high voltages to actuate.

Many studies on tactile communication using mediated devices have shown that touch is important for delivering non-verbal affective information in human-to-human communication. Yet little work has explored enriching affective communication while speaking on the mobile phone, and there has been limited research on intuitive detection and expression of tactile information for mobile communication. These issues need attention in order to prevent arbitrary and abstract tactile communication.

Understanding Social Meanings of Cheek and Touch in Communication
Touching one another's cheek in communication can have various social meanings depending on the touch gesture and the level of intimacy, and its uses are culturally diverse. In Asian cultures, a cheek touch from a stranger or a person of lower social status can be seen as aggressive and impolite; in Western cultures, a cheek touch between intimate friends or lovers can convey positive and playful affection. Different types of touches also carry different symbolic meanings depending on the situation and the body part touched. Haans et al. suggested that even a short touch by another person can elicit strong emotional experiences [10]. Touch behavior plays a part in giving encouragement, expressing tenderness, and showing emotional support in non-verbal communication.

Proposed Technique: CheekTouch
Based on the review of related work and of the social meanings of cheek and touch, we propose a new interaction technique, CheekTouch. It combines tactile feedback delivered via the cheek with multi-finger input while speaking on the mobile phone. It is natural to use because it maintains the posture of speaking on the phone, and adding a tactile interface on the cheek can compensate for the lack of non-verbal cues in voice communication [14]. If an additional audio signal were used as a non-verbal cue while speaking on the mobile phone, it might disturb communication and overlap with the verbal signal; if video signals were added, users would have to move the device away from the cheek to see the visual messages, which could lead to missing verbal signals. A tactile interface on the cheek can therefore enrich communication while speaking on the mobile phone.

Mapping Touch Behaviors onto Vibrotactile Patterns
To develop CheekTouch and its interface language of touch behaviors and vibrotactile patterns, we first selected six touch behaviors (patting, slapping, pinching, stroking, tickling and kissing) that can be expressed with multi-finger gestures and also felt on the cheek while speaking on the mobile phone. The rationale for mapping input and output behaviors was to capture one's intuitive expression of touches and to deliver intuitive feedback to the other person, and the two must stay synchronized during the call. The power, direction and position of touches are the elements that determine the pattern of vibrotactile stimulation delivered as feedback.
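The paper does not describe how these three elements are derived from the raw multi-touch events. Purely as an illustration, the Python sketch below summarizes a short burst of touch samples into position, direction and an estimated power; the data structures and names are invented here, and power is approximated by finger count over duration, since a capacitive screen senses no true pressure.

```python
# Illustrative sketch only (not from the paper): one way to summarize a short
# burst of multi-touch samples into the elements named above: position,
# direction and power. "Power" is a crude proxy (fingers per second of contact),
# because the touchscreen reports no real pressure.
from dataclasses import dataclass
from math import hypot

@dataclass
class TouchSample:
    t: float        # timestamp in seconds
    points: list    # (x, y) positions of the touching fingers, normalized 0..1

def touch_features(samples):
    """Summarize a burst of TouchSample objects into position, direction, power."""
    first, last = samples[0], samples[-1]
    cx = sum(x for x, _ in last.points) / len(last.points)
    cy = sum(y for _, y in last.points) / len(last.points)
    px = sum(x for x, _ in first.points) / len(first.points)
    py = sum(y for _, y in first.points) / len(first.points)
    duration = max(last.t - first.t, 0.05)        # avoid division by zero
    return {
        "position": (cx, cy),                     # where the touch ends up
        "direction": (cx - px, cy - py),          # net movement, e.g. a stroke
        "travel": hypot(cx - px, cy - py),        # distance covered
        "power": len(last.points) / duration,     # crude intensity proxy
    }
```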

Type of touch | Meanings in social interaction
Patting       | Comfort, Love, Farewell, Want for concentration
Slapping      | Congratulate, Hard joke, Attention, Emphasize
Pinching      | Tease, Playful, Intimacy
Stroking      | Encouragement, Farewell, Love
Kissing       | Love, Friendship, Appreciation, Farewell
Tickling      | Tease, Intimacy, Love

Table 1. Meanings of the six touch behaviors in social interaction.

Figure 2. Example of expressing a touch behavior with the index and middle finger while speaking on the mobile phone.

The six touch behaviors were selected from Argyle's 16 kinds of touch behaviors, which are considered the most common in human communication [2]. First, we identified the touches that can be expressed with multi-finger gestures (patting, slapping, pinching, stroking, shaking, holding, grooming and tickling), because various touch behaviors can be expressed intuitively on the phone's screen using the same fingers that hold the phone during a call. Second, we identified the touches that can be rendered on the cheek (patting, slapping, pinching, kissing, stroking and tickling); since the cheek is naturally used in mobile voice communication, its rich receptors are well suited to sensing various affective finger gestures. The intersection of these two sets gives five touches: patting, slapping, pinching, stroking and tickling. We added kissing because people in many cultures use it to convey strong positive affection and to indicate friendship. The social meanings of the six touch behaviors are summarized in Table 1, based on the typical examples and enactment types of touches that Jones et al. identified in their six categories of meanings for individual touches (positive affection, control, playful, ritualistic, task-related, and accidental touches) [11].

Fig. 1 shows the mapping between touch behaviors and vibrotactile patterns. We used the index and middle finger (Fig. 2) for input, since those fingers are free to gesture while holding the mobile phone during a call. For example, kissing was expressed by slightly gathering the two fingers together (Fig. 1e).

Figure 1. Mapping between touch input with the fingers and the vibrotactile feedback pattern on the cheek.
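The concrete patterns are defined graphically in Fig. 1 and are not reproduced in the text. Purely to illustrate the kind of representation such a mapping might use, the sketch below encodes each touch behavior as a sequence of frames over the 4x3 actuator grid; the specific frame sequences are assumptions for illustration, not the patterns used in the prototype.

```python
# Illustrative sketch only: the real mapping lives in Fig. 1. Each frame is the
# set of (row, col) actuators to drive on the 4x3 display; frames are played
# back at a fixed interval to form a pattern.
ROWS, COLS = 4, 3
FULL_GRID = frozenset((r, c) for r in range(ROWS) for c in range(COLS))
CENTER = frozenset({(1, 1), (2, 1)})
OFF = frozenset()

PATTERNS = {
    "patting":  [FULL_GRID, OFF, FULL_GRID],                 # two broad taps
    "slapping": [FULL_GRID],                                  # one strong burst
    "stroking": [frozenset((r, 0) for r in range(ROWS)),      # sweep across columns
                 frozenset((r, 1) for r in range(ROWS)),
                 frozenset((r, 2) for r in range(ROWS))],
    "pinching": [frozenset({(0, 0), (3, 2)}), CENTER],        # corners pull inward
    "kissing":  [CENTER],                                     # small central pulse
    "tickling": [frozenset({(0, 0), (2, 2)}),                 # scattered flicker
                 frozenset({(1, 1), (3, 0)}),
                 frozenset({(0, 2), (2, 0)})],
}

def frames_for(gesture):
    """Return the vibrotactile frame sequence for a recognized touch behavior."""
    return PATTERNS.get(gesture, [])
```

Representing patterns as frame sequences keeps position and timing explicit, which matches the elements the authors identify as determining the vibrotactile stimulation.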

CheekTouch Implementation
Fig. 3 shows the structure of the prototype. We used a portable multi-touch mobile device (Apple's iPod Touch [1]) to express the six affective touch behaviors while speaking on the mobile phone. The device's screen served as the means of expressing touch behaviors, and the coordinates of the touched points were sent to a PC as OSC (Open Sound Control) messages over Wi-Fi using the OSCemote application [13]. The received values were then sent, via OSC message analyzing software and an Arduino board, to a 4x3 vibrotactile display made of 12 coin-type actuators. Each actuator was packaged with sponge and soft clay to minimize the spread of vibrations. The prototype therefore enables one to express a touch behavior, such as patting, and to feel the corresponding vibrotactile pattern at the same time while holding the mobile device.

Figure 3. Structure of the CheekTouch prototype.
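The paper does not document the OSC message format or the PC-to-Arduino protocol. The sketch below shows one way the PC-side bridge could look, assuming a hypothetical OSC address (/cheektouch/frame) carrying twelve on/off actuator values that are packed into a two-byte bitmask and written to the Arduino over serial; the port numbers are likewise assumptions.

```python
# Minimal sketch of a PC-side bridge under stated assumptions: receive OSC
# frames from the handset and relay a 12-bit actuator mask to the Arduino.
# The OSC address, ports and serial protocol are hypothetical.
import serial                                    # pip install pyserial
from pythonosc.dispatcher import Dispatcher      # pip install python-osc
from pythonosc.osc_server import BlockingOSCUDPServer

arduino = serial.Serial("/dev/ttyUSB0", 115200)  # adjust the port for your setup

def on_frame(address, *actuators):
    """Pack 12 on/off values (one per coin motor) into a 12-bit mask and send it."""
    bits = 0
    for i, value in enumerate(actuators[:12]):
        if value:
            bits |= 1 << i
    arduino.write(bytes([bits & 0xFF, (bits >> 8) & 0x0F]))

dispatcher = Dispatcher()
dispatcher.map("/cheektouch/frame", on_frame)

if __name__ == "__main__":
    # Listen for OSC packets from the handset on UDP port 8000 and relay them.
    BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```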

Preliminary User Study
Purpose
A preliminary study of CheekTouch was divided into two parts. The first part evaluated the usability of the six touch behaviors while speaking on the phone. The second part examined how appropriately the six vibrotactile feedback patterns on the cheek mapped to each touch behavior.

Method and Procedure
Twelve university students (6 male, 6 female), aged 24 to 29, participated in the study. We used the CheekTouch device (Fig. 3). In the first part, the experimenter showed participants a picture of the proposed six touch-behavior inputs (Fig. 1). Participants expressed each touch behavior three times while imagining a scenario given by the experimenter (e.g., patting: congratulating another person for getting a job). Afterwards, participants answered a 13-question questionnaire covering five items (usefulness, ease of use, ease of learning, satisfaction and intention of future usage) on a 7-point scale (-3 to 3). In the second part, the experimenter expressed the six touch behaviors three times each in random order (18 touches in total); participants felt the vibrotactile pattern on the cheek and identified on a multiple-choice sheet which touch behavior the experimenter had performed.

                         Usefulness  Ease of Use  Ease of Learning  Satisfaction  Intention of Future Usage
Avg (all participants)   1.67        1.50         2.17              1.62          1.88
M (male average)         1.46        1.25         2.17              1.50          1.50
F (female average)       1.88        1.75         2.17              1.73          2.25

Table 2. Average usability ratings for expressing touch behaviors while speaking on the mobile phone, by gender (N = 12; values on a 7-point scale from -3 to 3).

Result and Discussion
Results of the first part were generally positive: the averages of all five items were 1.5 or higher (Table 2). Most participants reported that the method was easy to learn (mean = 2.17), and female participants reported the highest intention of future usage (mean = 2.25) for expressing touch behaviors while speaking on the phone. However, three participants reported that expressing the pinching behavior was quite difficult (mean = 1.25), and nine participants said that more touches, such as expressing shapes or unconscious touches, should be included rather than limiting the set to six behaviors. These needs are reflected in the lower scores in the satisfaction category. An interesting result was that female participants rated all five items more positively than male participants, which suggests that satisfying the need for affective touches could be more important for females than for males.

In the second part, participants recognized pinching (55.56%) and stroking (50%) less accurately than kissing (100%) and tickling (91.77%) (Fig. 4). They reported that the vibrotactile patterns for pinching and stroking were hard to distinguish. Recognition of tickling, however, was high: participants felt that the sound and vibrating behavior of the actuators resembled tickling with fingers.

Figure 4. Percentage of correct recognition for each vibrotactile pattern, by touch behavior.

Conclusion and Future Work
We made three contributions. First, we presented a new affective interaction technique, CheekTouch, whose strength lies in enabling intimate tactile communication while speaking naturally on the mobile phone; CheekTouch allows users to communicate more emotionally by combining the audio and tactile channels. Second, our preliminary user study pointed to promising directions for CheekTouch and to needed improvements in the vibrotactile rendering of touches. Third, we broadened the field of affective mobile interaction by examining the social meanings of cheek and touch in face-to-face human communication.

Future work will improve the vibrotactile stimulation to provide more accurate and more varied feedback patterns. We could also extend the technique to support more intimate and emotional social interaction in mobile communication between lovers or close friends: we limited ourselves to six touches, but users could also express unconscious touches or particular emotional shapes, such as a heart or a smile, while speaking on the mobile phone.

References
[1] Apple iPod Touch. www.apple.com/ipodtouch/.
[2] Argyle, M. Bodily Communication. International Universities Press, New York (1975).
[3] Bau, O., Petrevski, U., Mackay, W. BubbleWrap: a textile-based electromagnetic haptic display. Ext. Abstracts CHI 2009, ACM Press (2009), 3607-3612.
[4] Brewster, S., Brown, L. M. Tactons: structured tactile messages for non-visual information display. Proc. AUIC 2004, Volume 28 (2004), 15-23.
[5] Brown, L. M., Kaaresoja, T. Feel who's talking: using tactons for mobile phone alerts. Ext. Abstracts CHI 2006, ACM Press (2006), 604-609.
[6] Chang, A., Koerner, B., Resner, B., Wang, X. LumiTouch: An Emotional Communication Device. Ext. Abstracts CHI 2002, ACM Press (2002), 313-314.
[7] Chang, A., O'Modhrain, S., Jacob, R., Gunther, E., Ishii, H. ComTouch: Design of a vibrotactile communication device. Proc. DIS 2002 (2002), 312-320.
[8] DiSalvo, C., Gemperle, F., Forlizzi, J., Montgomery, E. The Hug: an exploration of robotic form for intimate communication. Proc. RO-MAN 2003 (2003), 1-4.
[9] Norman, D. A. Emotional Design (2004).
[10] Haans, A., IJsselsteijn, W. Mediated Social Touch: A Review of Current Research and Future Directions. Virtual Reality (2006), 149-159.
[11] Jones, S. E., Yarbrough, A. E. A naturalistic study of the meanings of touch. Communication Monographs (1985), 19-56.
[12] Li, K. A., Baudisch, P., Griswold, W. G., Hollan, J. D. Tapping and rubbing: exploring new dimensions of tactile feedback with voice coil motors. Proc. UIST 2008, ACM Press (2008), 181-190.
[13] OSCemote iPhone application. http://www.appstorehq.com/oscemote-iphone/.
[14] Rovers, A. F., van Essen, H. A. HIM: A Framework for Haptic Instant Messaging. Proc. CHI 2004, ACM Press (2004), 1313-1316.