AFFECTIVE COMPUTING FOR HCI

Rosalind W. Picard
MIT Media Laboratory

1 Introduction

Not all computers need to pay attention to emotions, or to have emotional abilities. Some machines are useful as rigid tools, and it is fine to keep them that way. However, there are situations where human-machine interaction could be improved by having machines naturally adapt to their users, and where communication about when, where, how, and how important it is to adapt involves emotional information, possibly including expressions of frustration, confusion, disliking, interest, and more. Affective computing expands human-computer interaction by including emotional communication together with appropriate means of handling affective information.

This paper highlights recent and ongoing work at the MIT Media Lab in affective computing: computing that relates to, arises from, or deliberately influences emotion. The work currently targets four broad areas related to HCI: (1) reducing user frustration; (2) enabling comfortable communication of user emotion; (3) developing infrastructure and applications to handle affective information; and (4) building tools that help develop social-emotional skills.

2 Reducing User Frustration

Not only do many people feel frustration with technology, but they show it. A widely publicized 1999 study by Concord Communications in the U.S. found that 84% of help-desk managers surveyed said that users admitted to engaging in violent and abusive behavior toward computers. It seems that no matter how hard we researchers work on perfecting machine and interface design, frustration can still occur in the interaction. Most HCI research has aimed to prevent frustration, which continues to be an important goal; however, there is also a need to address frustration at run-time. Affective computing can be used to address both: (1) design-time and run-time identification of frustrating situations, and (2) helping reduce user frustration during an interaction.

We have developed a system that gathers and analyzes two physiological signals together with mouse clicks in an effort to characterize episodes of user behavior when the user experiences problems (Fernandez and Picard 1997). Initial results were significantly better than random at detecting and recognizing such episodes in 21 out of 24 users. We are also adapting mice with pressure sensors to make it easy for people to deliberately express frustration at an application, and to have these moments of expression associated with software events. Even if the system is not smart enough to fix the problem that irritates you, it could (perhaps anonymously) begin to let designers know what those things are, providing a kind of continuous human factors analysis.
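The detection algorithm itself is not described here, but the general idea of flagging candidate frustration episodes from sensor streams can be illustrated. The following minimal Python sketch is purely hypothetical (the signal choice, thresholds, and windowing are invented, not those of Fernandez and Picard): it flags windows in which a burst of mouse clicks coincides with a rise in skin conductance.

```python
from dataclasses import dataclass

@dataclass
class Episode:
    start_s: float  # window start time, seconds
    end_s: float    # window end time, seconds

def flag_frustration_episodes(skin_conductance, fs, click_times,
                              window_s=5.0, click_burst=3, rise_uS=0.5):
    """Flag windows where a burst of clicks coincides with a rise in skin
    conductance. All thresholds here are illustrative, not from the paper.

    skin_conductance: samples in microsiemens
    fs: sampling rate in Hz
    click_times: sorted click timestamps in seconds
    """
    episodes = []
    n_win = int(window_s * fs)
    for start in range(0, len(skin_conductance) - n_win + 1, n_win):
        t0, t1 = start / fs, (start + n_win) / fs
        window = skin_conductance[start:start + n_win]
        clicks = sum(1 for t in click_times if t0 <= t < t1)
        rise = max(window) - window[0]  # crude measure of an SC increase
        if clicks >= click_burst and rise >= rise_uS:
            episodes.append(Episode(t0, t1))
    return episodes
```

A real detector would need per-user calibration and richer features; the point is only that the behavioral and physiological streams are time-aligned and scanned jointly.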

"It looks like things didn't go very well" and "We apologize to you for this inconvenience" are example statements that people use to help one another manage negative emotions once they have occurred. Such statements are known to help alleviate strong negative emotions such as frustration or rage. But can a computer, which doesn't have feelings of caring, use such techniques effectively to help a user who is having a hard time? To investigate, we built an agent that practices some active listening, empathy, and sympathy, and tested it with 70 users who experienced various levels of frustration (Klein et al. 1999). The agent assesses frustration and interacts with the user through a text dialogue box (with no face, voice, fancy animation, or use of the pronoun "I"). Compared to two control conditions, interactions with the emotion-savvy agent led to behavior indicative of a significant decrease in frustration. These results suggest that today's machines can begin to help reduce frustration, even when they are not yet smart enough to identify or fix the cause of the frustration.
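Klein et al. describe the agent only at this level of detail. As a rough illustration of text-only active listening, here is a hypothetical Python sketch (the rules and wording are invented, not the study's actual dialogue): the agent paraphrases the user's self-reported frustration back in sympathetic language, with no face, voice, or first-person pronoun.

```python
def active_listening_reply(reported_level: int, task: str) -> str:
    """Return a sympathetic, non-anthropomorphic reply.

    reported_level: user's self-rated frustration, 0 (none) to 10 (extreme)
    task: short description of what the user was doing
    Wording is illustrative only; note the absence of the pronoun "I".
    """
    if reported_level <= 2:
        ack = "It sounds like things went fairly smoothly"
    elif reported_level <= 6:
        ack = "It sounds like that was somewhat frustrating"
    else:
        ack = "It sounds like that was very frustrating"
    # Paraphrase back (active listening), then apologize on the system's behalf.
    return (f"{ack} while you were {task}. "
            "Apologies for the inconvenience this caused; "
            "your feedback will be used to improve the software.")

print(active_listening_reply(8, "filling in the form"))
```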

3 Enabling Communication of User Emotion

People naturally express emotion to machines, but machines do not naturally recognize it. Emotion communication requires that a message be both sent and received. In addition to the efforts above aimed at user frustration, we are building tools to facilitate deliberate emotional expression by people, and to enable machines to recognize meaningful patterns of such expression.

Emotion can be sensed in an ongoing way, or by interrupting the user for feedback. Consider a focus group where participants are asked to indicate the clarity of packaging labels. If, while reading line 3, a subject furrows his brow in confusion, then he has communicated in parallel with the task at hand, which has many advantages. Alternatively, he could stop at the end of the task and rate the label as mildly confusing on a questionnaire: non-parallel affective communication, occurring by interrupting the primary task or at its completion. We are working to enable both kinds of communication, e.g., via eyeglasses that sense changes in facial muscles, such as furrowing the brow in confusion or interest (Scheirer et al. 1999). One advantage of these expression glasses is that they can be used in parallel with concentrating on a task or not, and can be activated either unconsciously or consciously. People remain free to keep a poker face to mask true confusion if they do not want to communicate their true feelings, and we think this is good.

We are also exploring multi-modal means of emotion communication. Current recognition rates are up to 81% in automatically detecting and recognizing which of eight emotions an actress expressed through four physiological channels (Vyzas and Picard 1999), a level comparable to machine recognition of facial and vocal expressions (a schematic sketch of this feature-and-classifier recipe appears at the end of this section). We are also beginning to analyze affect in speech jointly with other natural modes of expression. However, all of these efforts push the limits of traditional pattern recognition and signal processing algorithms, which have difficulty handling the day-to-day and interpersonal variations of emotional expression; consequently, we are conducting basic research in machine learning theory and in pattern recognition to develop better methods.

It is important to keep in mind that some people do not feel comfortable with parallel communication of affect, especially with methods involving signals that people do not usually see. Users may prefer no sensing at all, or non-parallel means such as dialogue boxes that they control, or tangible or non-tangible icons that they can hit, kick, or otherwise interact with to directly communicate affective feedback. People have strong feelings about if, when, where, and how they want to communicate their emotions, and it would be absurd if affective computing technology did not respect these feelings. It is important to develop a variety of means and to give users choices.
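The Vyzas and Picard features and classifier are not reproduced here; the following Python sketch only illustrates the general recipe under stated assumptions: a few summary statistics per physiological channel, concatenated into a feature vector and fed to a simple classifier. The data shapes, feature choices, and nearest-mean classifier are illustrative, not the published method.

```python
import numpy as np

def channel_features(x: np.ndarray) -> np.ndarray:
    """Per-channel statistics: mean, standard deviation, and mean absolute
    first difference. Illustrative features, in the spirit of (but not
    copied from) physiological pattern recognition work."""
    return np.array([x.mean(), x.std(), np.abs(np.diff(x)).mean()])

def featurize(recording: np.ndarray) -> np.ndarray:
    """recording: shape (4, n_samples), one row per physiological channel
    (e.g., facial EMG, skin conductance, blood volume pulse, respiration)."""
    return np.concatenate([channel_features(ch) for ch in recording])

class NearestMeanClassifier:
    """Assign a feature vector to the emotion whose class-mean is closest."""

    def fit(self, X: np.ndarray, y: list):
        self.classes_ = sorted(set(y))
        y = np.array(y)
        self.means_ = {c: X[y == c].mean(axis=0) for c in self.classes_}
        return self

    def predict(self, X: np.ndarray) -> list:
        return [min(self.classes_,
                    key=lambda c: np.linalg.norm(x - self.means_[c]))
                for x in X]
```

The day-to-day variation mentioned above shows up here as drift in the class means, which is one reason such a simple recipe degrades across sessions and across people.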

4 Developing Infrastructure and Applications

Most people think it should be easy to gather data on frustration expression: just sit a subject down in front of a computer running a certain operating system, and voilà. Alternatively, hire an actor or actress to express emotions, and record them. If the actor uses method acting or another technique to self-induce true emotional feelings, the results may closely approximate emotions that arise in natural situations. However, these approaches are not as straightforward as they may seem: they are complicated by the artificiality of bringing people into laboratory settings, the mood and skill of an actor, whether or not an audience is present, the expectations of a subject who thinks you are trying to frustrate them, the unreliability of a given stimulus for inducing emotion, the fact that some emotions can be induced simply by a subject's thoughts (over which experimenters have little or no control), and the sheer difficulty of accurately sensing, synchronizing, and understanding the ground truth of emotional data. We have developed lab-based experimental methodologies for gathering such data (Riseberg et al. 1998).

However, the best way to get realistic data may be to catch people expressing emotions to technology in everyday situations, and wearable and ubiquitous computing both offer new possibilities toward this goal. We have built affective wearables that sense information from a willing wearer going about daily activities (Picard and Healey 1997). Some of these wearables have been adapted to control devices for the user, such as a camera that saves video based on your arousal response (Healey and Picard 1998; sketched below), and a wearable DJ that tries to select not only music you like, but music that suits a feature of your mood (Healey et al. 1998). We are sensing data from drivers in situ to learn about natural driving behaviors under stress (Healey et al. 1999). We have also designed and built a wearable system to measure features of expression from professional conductors (Marrin and Picard 1998). Marrin is now adapting this conductor's jacket so the wearer can control the playback of MIDI music in real time while making expressive conducting gestures.
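StartleCam's save-on-arousal behavior can be sketched as follows. This hypothetical Python fragment (buffer sizes and thresholds invented; the published detector is not reproduced) keeps a rolling buffer of recent frames and flushes it to storage when skin conductance rises sharply over a short interval, a crude stand-in for a startle response.

```python
from collections import deque

class ArousalTriggeredRecorder:
    """Keep the last buffer_len frames and 'save' them when a rapid rise
    in skin conductance suggests a startle. Thresholds are illustrative."""

    def __init__(self, buffer_len=150, rise_uS=0.3, rise_window=8):
        self.frames = deque(maxlen=buffer_len)  # rolling frame buffer
        self.sc = deque(maxlen=rise_window)     # recent SC samples (microsiemens)
        self.rise_uS = rise_uS
        self.saved_clips = []

    def update(self, frame, sc_sample):
        self.frames.append(frame)
        self.sc.append(sc_sample)
        # Startle heuristic: a sharp rise across the recent window.
        if len(self.sc) == self.sc.maxlen and self.sc[-1] - self.sc[0] >= self.rise_uS:
            self.saved_clips.append(list(self.frames))  # flush buffer to storage
            self.sc.clear()  # reset so one startle does not trigger repeatedly
```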

5 Building Tools to Develop Social-Emotional Skills

Autistic people, whose social-emotional skills tend to be severely impaired, have sometimes expressed that they love communicating by computer: computers allow for little transmission of non-verbal affective information, and so help level the playing field when communicating with non-autistic people. Current intervention techniques for autistic children suggest that many of them can make progress in recognizing and understanding the emotional expressions of people if given lots of examples to learn from, and extensive training with those examples. We have developed a system aimed at helping young autistic children learn to associate emotions with expressions and with situations. The system plays videos of both natural and animated situations giving rise to emotions, and the child interacts with the system by picking up one or more stuffed dwarfs that represent the set of emotions under study and that communicate wirelessly with the computer. This effort, led by Kathi Blocher, is being tested with autistic kids aged 3-7 this month. We are also developing a stuffed animal, Tigger, that exhibits expressive behaviors in response to how a child plays with it, discriminating potentially abusive actions, like poking the eyes, from potentially playful actions, like bouncing it and lightly pulling its tail. This work, led by Dana Kirsch, is also undergoing trials with young children.

Over the years, scientists have aimed to make machines that are intelligent and that help people be intelligent, yet they have almost completely ignored the role of emotion in intelligence, leaving emotion out of the picture almost entirely. We do not wish to see the scale tilted out of balance the other way, where machines twitch at every emotional expression or become overly emotional and utterly intolerable. Rather, we think research is needed to learn how affect can be used in a balanced, respectful, and intelligent way; this should be the practical aim of affective computing in HCI.

6 References

These references support this brief overview of HCI-related work in affective computing at the MIT Media Lab; for references to related research conducted elsewhere, please see the lists in these articles.

Fernandez, R. and Picard, R. W. (1997). Signal Processing for Recognition of Human Frustration. Proc. IEEE ICASSP '98, Seattle, WA.

Healey, J., Dabek, F., and Picard, R. W. (1998). A New Affect-Perceiving Interface and its Application to Personalized Music Selection. Proc. 1998 Workshop on Perceptual User Interfaces, San Francisco, CA.

Healey, J. and Picard, R. W. (1998). StartleCam: A Cybernetic Wearable Camera. Proc. Intl. Symp. on Wearable Computing, Pittsburgh, PA.

Healey, J., Seger, J., and Picard, R. W. (1999). Quantifying Driver Stress: Developing a System for Collecting and Processing Bio-Metric Signals in Natural Situations. Proc. Rocky Mt. Bio-Eng. Symp., Boulder, CO.

Klein, J., Moon, Y., and Picard, R. W. (1999). This Computer Responds to User Frustration. CHI '99, Pittsburgh, PA.

Marrin, T. and Picard, R. W. (1998). Analysis of Affective Musical Expression with the Conductor's Jacket. Proc. XII Colloquium on Musical Informatics, Gorizia, Italy.

Picard, R. W. and Healey, J. (1997). Affective Wearables. Personal Technologies, Vol. 1, No. 4, 231-240.

Riseberg, J., Klein, J., Fernandez, R., and Picard, R. W. (1998). Frustrating the User on Purpose: Using Biosignals in a Pilot Study to Detect the User's Emotional State. CHI '98, Los Angeles, CA.

Scheirer, J., Fernandez, R., and Picard, R. W. (1999). Expression Glasses: A Wearable Device for Facial Expression Recognition. CHI '99, Pittsburgh, PA.

Vyzas, E. and Picard, R. W. (1999). Online and Offline Recognition of Emotion Expression from Physiological Data. Submitted to the Workshop on Emotion-Based Agent Architectures, Int. Conf. on Autonomous Agents, Seattle, WA.