CS 350 COMPUTER/HUMAN INTERACTION


CS 350 COMPUTER/HUMAN INTERACTION, Lecture 23. Includes selected slides from the companion website for Hartson & Pyla, The UX Book, Morgan Kaufmann, 2012. All rights reserved. Used with permission.

Notes: Swapping project work days and class days for the rest of the term, i.e., work days on Tuesdays and class days on Thursdays. Mid-project progress report due date extended to Thursday next week (April 12).

Outline. Chapter 12, UX Evaluation Introduction: formative vs. summative evaluation; rigorous vs. rapid UX evaluation methods; empirical vs. analytic methods; data collection techniques. Chapter 13, Rapid Evaluation Methods: design walkthroughs and reviews; UX inspection; heuristic evaluation; quasi-empirical methods.

Introduction: Evaluation

Introduction. User testing? No! Users don't like to be tested. Instead: user-based design (or UX) evaluation.

Formative vs. summative evaluation. Formative evaluation helps you form the design; summative evaluation helps you sum up the design. When the cook tastes the soup, that's formative; when the guests taste the soup, that's summative.

Formative evaluation. Diagnostic in nature; uses qualitative data. Immediate goal: to identify UX problems and their causes in the design. Ultimate goal: to fix the problems.

Summative evaluation. Collects quantitative data to assess the level of user experience quality due to a design, especially for assessing improvement in user experience due to iterations of formative evaluation and re-design.

Formal summative evaluation. A comparative benchmark study based on rigorous experimental design aimed at comparing designs: a controlled experiment with hypothesis testing, for example an m × n factorial design with two independent variables. Results are subjected to statistical tests for significance.
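
This kind of study is not covered in the course, but as a hedged illustration of "statistical tests for significance": a minimal sketch comparing task-completion times for two designs with an independent-samples t-test. The data values and the 0.05 threshold are assumptions for the example, not from the slides.

```python
# Minimal sketch: comparing two designs' task-completion times (seconds).
# Hypothetical data; a real formal summative study would also control
# sampling, sample size, and experimental conditions.
from scipy import stats

design_a_times = [38.2, 41.5, 36.9, 44.0, 39.7, 42.3]  # hypothetical
design_b_times = [31.4, 35.0, 29.8, 33.6, 32.1, 34.4]  # hypothetical

t_stat, p_value = stats.ttest_ind(design_a_times, design_b_times)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:  # conventional significance threshold
    print("Difference in mean task time is statistically significant.")
else:
    print("No statistically significant difference detected.")
```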

Formal summative evaluation. Contributes to our science base; the only way you can make public claims based on your results. An important HCI skill, but not covered in this course.

Informal summative evaluation. A partner of formative evaluation; for example, measuring time on task. Used for engineering: summing up or assessing UX levels. Done without experimental controls.

Informal summative evaluation. Usually without validity concerns, such as sampling or degree of confidence; usually with a small number of participants; only summary statistics (e.g., mean and variance).

Informal summative evaluation. Uses metrics of user performance as indicators of design quality, compared against pre-established UX target levels (Chapter 10).
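
A minimal sketch (an illustration, not from the slides) of this kind of engineering summing up: computing summary statistics for measured time-on-task and comparing the mean against a pre-established UX target. The measurements and the 40-second target are hypothetical.

```python
# Minimal sketch: summary statistics for an informal summative metric.
# Times and the 40-second target are hypothetical values for illustration.
import statistics

time_on_task = [36.2, 44.8, 39.5, 41.0, 37.7]  # seconds, few participants
target_mean = 40.0  # pre-established UX target level (Chapter 10 style)

mean = statistics.mean(time_on_task)
variance = statistics.variance(time_on_task)  # sample variance
print(f"mean = {mean:.1f} s, variance = {variance:.1f}")
print("UX target met" if mean <= target_mean else "UX target not met")
```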

Informal summative evaluation. Results are not validated and can be used only to guide the engineering development process. You cannot make claims based on your results to your organization or to the public: an important ethical constraint.

Engineering evaluation of UX: formative plus informal summative.

Types of UX evaluation methods. Orthogonal dimensions for classifying types: rigorous vs. rapid methods, and empirical vs. analytic methods.

Rigorous UX evaluation methods. Use the full process of preparation, data collection, data analysis, and reporting (Chapters 12 and 14 through 18), with no shortcuts or abridgements. Certainly not perfect, but the yardstick by which other evaluation methods are compared.

Choose a rigorous empirical method when you need maximum effectiveness and thoroughness (but expect it to be more expensive and time-consuming), when you need to manage risk carefully, or to assess quantitative UX measures and metrics, e.g., time-on-task and error rates, as indications of how well the user does in a performance-oriented context.

Rapid UX evaluation methods. Choose a rapid evaluation method for speed and cost savings, but expect it to be (possibly acceptably) less effective. Suited to early stages of progress, when things are changing a lot anyway and investing in detailed evaluation is not warranted. Choose a rapid method for initial reactions and early feedback, e.g., a design walkthrough or an informal demonstration of design concepts.

Empirical method vs. analytic method: another dimension for classifying types. Empirical methods employ data observed in the performance of real user participants, usually collected in lab-based testing.

Analytic methods are based on looking at inherent attributes of the design rather than seeing the design in use. Many rapid UX evaluation methods are analytic, e.g., design walkthroughs and UX inspection methods.

Hybrid methods: analytic and empirical. In practice, methods are often a mix. For example, expert UX inspection can involve simulated empirical aspects: the expert plays the role of a user, simultaneously performing tasks and observing UX problems, but much of the work is analytic.

Where the dimensions intersect.

Formative data collection techniques: critical incident identification and the think-aloud technique. Both are used in rigorous and rapid methods.

Critical incident identification. A critical incident is an event observed within task performance that is a significant indicator of a UX problem, due to the effects of design flaws on users. Arguably the single most important source of qualitative data in formative evaluation. Can be difficult until you learn to do it.

Critical incident identification. Critical incident data are detailed and perishable: they must be captured immediately and precisely as they arise during usage. Essential for isolating specific UX problems; that is why alpha and beta testing might not be as effective for formative evaluation.

Think-aloud technique. Participants let us in on their thinking: their intentions, rationale, and perceptions of UX problems. User participants verbally express their thoughts during the interaction experience. Also called think-aloud protocol or verbal protocol.

Think-aloud technique. A very effective qualitative data collection technique, simple to use for both analyst and participant. Useful for a walkthrough of a prototype, effective when the participant helps with an inspection, and good for assessing internally felt emotional impact.

Think-aloud technique. Needed when the user hesitates or when a real UX problem is hidden from observation. Sometimes you have to remind participants to verbalize.

Questionnaires. A self-reporting data collection technique and the primary instrument for collecting quantitative subjective data. Used to supplement objective data, or as an evaluation method on its own.

Questionnaires. In the past, used primarily to assess user satisfaction, but can contain probing questions about the total user experience, especially emotional impact and perceived usefulness. Inexpensive and easy to administer, but require skill to produce so that the data are valid and reliable.

Semantic differential scales (also called Likert scales). Each question is posed on a range of values describing an attribute. The most extreme value in each direction on the scale is an anchor; the scale is divided with points between the anchors that divide up the difference between the anchor meanings.

Semantic differential scales. Granularity of the scale: the number of discrete points (choices), including anchors, we allow users. Typical labeling of a point on a scale is verbal, often with an associated numeric value. Labels can also be pictorial, e.g., smiley faces, which helps make the scale language-independent.

Example: semantic differential scale. To assess participant agreement with the statement "The checkout process on this Website was easy to use," the scale might have the anchors "strongly agree" and "strongly disagree." In between, the scale might include "agree," "neutral," and "disagree," with associated values of +2, +1, 0, -1, and -2.
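
A minimal sketch (illustrative, not from the slides) of how such labeled responses map to numeric values for summarizing. The label-to-value mapping mirrors the example above; the response data are hypothetical.

```python
# Minimal sketch: mapping verbal scale points to numeric values
# and summarizing. The responses below are hypothetical.
import statistics

SCALE = {"strongly agree": 2, "agree": 1, "neutral": 0,
         "disagree": -1, "strongly disagree": -2}

responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]
values = [SCALE[r] for r in responses]
print(f"mean agreement = {statistics.mean(values):+.2f} on a -2..+2 scale")
```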

System Usability Scale (SUS). Just 10 questions, alternating positive and negative wording, which prevents answering without really considering the questions. Uses a five-point Likert scale.

Example: SUS questions. 1. I think that I would like to use this system frequently. 2. I found the system unnecessarily complex. 3. I thought the system was easy to use. 4. I would need technical support to be able to use this system. 5. I found the various functions in this system were well integrated.

Example: SUS questions. 6. I think there is too much inconsistency in this system. 7. I would imagine that most people would learn to use this system very quickly. 8. I found the system very cumbersome to use. 9. I felt very confident using the system. 10. I needed to learn a lot of things before I could get going.

System Usability Scale (SUS). Robust, extensively used, and widely adapted; in the public domain; technology-independent.
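
The slides do not show scoring, but the standard SUS scoring procedure (Brooke, 1996) makes a useful worked example: odd-numbered (positively worded) items contribute response − 1, even-numbered (negatively worded) items contribute 5 − response, and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch, with hypothetical responses:

```python
# Minimal sketch: standard SUS scoring. Responses are on a 1..5 scale,
# in question order 1..10; the example responses are hypothetical.
def sus_score(responses):
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)  # odd items are positive
    return total * 2.5  # scales the 0..40 sum to 0..100

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # prints 85.0
```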

Adapting questionnaires. You can modify an existing questionnaire by choosing a subset of questions, changing the wording in some questions, adding questions to address specific areas of concern, or using different scale values. Warning: modifying a questionnaire can damage its validity.

Evaluating emotional impact. Data collection techniques especially for emotional impact: it can be measured indirectly in terms of its indicators. Emotion is a multifaceted phenomenon, expressed through feelings, verbal and non-verbal language, facial expressions, and other behaviors.

Evaluating emotional impact. Emotional impact indicators can be self-reported via verbal techniques, observed as physiological responses, or measured as physiological responses.

Self-reporting of emotional impact. Most emotional impact, involving aesthetics, emotional values, and the simple joy of use, is felt by the user but not necessarily observed by the evaluator. Self-reporting can tap into these feelings.

Self-reporting of emotional impact. Concurrent self-reporting: participants comment via think-aloud techniques on feelings and their causes in the user experience. Retrospective self-reporting: questionnaires (see AttrakDiff in the textbook).

Observing physiological responses. Self-reporting can be biased, and human users cannot always access their own emotions, so observe physiological responses to emotional impact encounters.

Observing physiological responses. Emotional tells in facial and bodily expressions can be fleeting, subliminal, and easily missed in real-time observation. To capture them reliably, you might make video recordings and do frame-by-frame analysis.

Biometrics. Instruments to detect and measure physiological responses: autonomic or involuntary bodily changes triggered by nervous system responses to emotional impact within interaction events.

Biometrics. Changes in perspiration are measured by galvanic skin response (GSR) measurements, which detect changes in electrical conductivity. Pupillary dilation is an autonomic indication of interest, engagement, and excitement. The downside of biometrics is the need for specialized monitoring equipment.
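
As a hedged illustration (not from the slides): once GSR equipment produces a conductivity time series, flagging candidate responses can be as simple as marking samples that rise well above a baseline. The sample data and the two-standard-deviation cutoff are assumptions for the sketch; real signal processing would be considerably more involved.

```python
# Minimal sketch: flagging possible GSR responses in a conductivity series.
# Samples (in microsiemens) are hypothetical.
import statistics

samples = [2.1, 2.0, 2.2, 2.1, 3.4, 3.6, 2.3, 2.2, 2.1, 2.0]
baseline = statistics.mean(samples)
threshold = baseline + 2 * statistics.stdev(samples)  # assumed cutoff

flagged = [i for i, s in enumerate(samples) if s > threshold]
print(f"possible response at sample indices: {flagged}")
```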

Evaluating phenomenological aspects of interaction. Phenomenological aspects of interaction involve emotional impact over time, not snapshots of usage; they are not about tasks but about human activities. Users invite a product into their lives and give it a presence in daily activities, e.g., how someone uses a smartphone in their life.

Evaluating phenomenological aspects of interaction. Users build perceptions and judgments through exploration and learning as usage expands and emerges. Data collection techniques for phenomenological aspects therefore have to be longitudinal.

Need for self-reporting. Self-reporting techniques are often necessary: not as objective as direct observation, but a practical solution.

Introduction: Rapid UX Evaluation

Rapid evaluation techniques. Aimed almost exclusively at collecting qualitative data: finding UX problems to fix. Seldom, if ever, include quantitative measurements; heavy dependence on practical techniques.

Rapid evaluation techniques. Everything is less formal: less protocol and fewer rules, much more variability in process. Almost every evaluation session is different, tailored to prevailing conditions. This flexibility means more spontaneous ingenuity, something experienced practitioners do best.

Design walkthroughs and reviews. In the early stages of a project you have only your conceptual design, scenarios, storyboards, and maybe some screen sketches or wireframes: not enough for interacting with customers or users.

Design walkthrough. An easy and quick evaluation method that can be used at almost any stage, and especially effective early, before a prototype exists. The audience can include the design team, UX analysts, subject-matter experts, customer representatives, and potential users.

Design walkthrough. The goal is to explore the design on behalf of users. There is no interaction, so you (evaluators on the design team) do the driving: the leader tells stories about users and usage, intentions and actions, and expected outcomes.

Rapid evaluation beyond early stages. Uses an interactive prototype, including paper prototypes. Most rapid evaluation techniques are variations of inspection techniques or quasi-empirical testing.

UX inspection. Especially good for early stages and early design iterations; appropriate for an existing system that has not undergone previous evaluation; for when you cannot afford or cannot do lab-based testing.

UX inspection. Also called expert evaluation, expert inspection, or heuristic evaluation (HE), but heuristic evaluation is actually one specific kind of inspection (Nielsen).

UX inspection. Reminder: you cannot inspect the user experience itself; you inspect the design for user experience issues. An analytic evaluation method and the primary rapid evaluation technique.

Heuristic evaluation. One kind of UX inspection method. A heuristic is a simplified, abstracted design guideline; the inspection is driven by a small number (about 10) of heuristics.

Heuristic evaluation. Example heuristic, Visibility of System Status: the system should always keep users informed about what is going on, through appropriate feedback within reasonable time.

Heuristic evaluation. Another example heuristic, Match Between System and the Real World: the system should speak the users' language, with words, phrases, and concepts familiar to the user rather than system-oriented terms, and follow real-world conventions, making information appear in a natural and logical order. A full listing of heuristics is in the book, with a link on the course webpage.

Emotional impact inspection. Look for fun, aesthetics, and innovation; include packaging and the out-of-the-box experience; try to envision the long-term experience.

The RITE UX evaluation method. Rapid Iterative Testing and Evaluation (Wixon et al.): a quasi-empirical method, a kind of abridged version of user-based testing. A fast, collaborative test-and-fix cycle that picks low-hanging fruit at relatively low cost.

Quasi-empirical methods. No formal predefined benchmark tasks; for tasks, draw on usage scenarios, essential use cases, and step-by-step task interaction models.

Quasi-empirical methods. Cut corners as much as possible: no quantitative data collected; the single paramount mission is to identify UX problems that can be fixed efficiently. Forget controlled conditions: interrupt and intervene at opportune moments, elicit thinking aloud, and ask for explanations and specifics.

Quasi-empirical methods. Defined by the freedom given to practitioners: to innovate and make it up as they go, to be flexible about goals and approaches, and to make impromptu changes of pace, direction, and focus.