Introducing Evaluation


Project: Improving the Quality of the Informatics and Software Systems Study Programmes (VP1-2.2-ŠMM-07-K-02-039). Introducing Evaluation. Lecture 13. Dr Kristina Lapin

Outline
- The types of evaluation
- Evaluation case studies
- Evaluation framework DECIDE
- Language of evaluation

The aims: Explain the key concepts used in evaluation. Introduce different evaluation methods. Show how different methods are used for different purposes at different stages of the design process and in different contexts. Show how evaluators mix and modify methods. Discuss the practical challenges. Illustrate how methods discussed in Chapters 7 and 8 are used in evaluation, and describe some methods that are specific to evaluation.

Why, what, where and when to evaluate. Iterative design & evaluation is a continuous process that examines: Why: to check users' requirements and that users can use the product and like it. What: a conceptual model, early prototypes of a new system and, later, more complete prototypes. Where: in natural and laboratory settings. When: throughout design; finished products can be evaluated to collect information to inform new products.

Bruce Tognazzini tells you why you need to evaluate: "Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don't have user-testing as an integral part of your design process you are going to throw buckets of money down the drain." See AskTog.com for topical discussions about design and evaluation: http://www.asktog.com/columns/037testorelse.html

The language of evaluation: Analytical evaluation, Controlled experiment, Field study, Formative evaluation, Heuristic evaluation, Predictive evaluation, Summative evaluation, Usability laboratory, User studies, Usability studies, Usability testing, User testing

The types of evaluation: Controlled settings involving users. Examples: laboratories and living labs. Methods: usability testing, experiments. Users' activities are controlled in order to test hypotheses and measure or observe certain behaviors. + Good at revealing usability problems. - Poor at capturing context of use.

Usability lab: a combination of methods (experiments, observation, interviews, questionnaires) in a controlled environment. http://iat.ubalt.edu/?page_id=13

Living labs: people's use of technology in their everyday lives can be evaluated in living labs. Such evaluations are too difficult to do in a usability lab. E.g. the Aware Home was embedded with a complex network of sensors and audio/video recording devices (Abowd et al., 2000). MIT Living Labs have been developed to evaluate people's everyday lives: livinglabs.mit.edu.

The types of evaluation: Natural settings involving users. E.g. online communities and public places. Little or no control of users' activities, in order to determine how the product would be used in the real world. Method: field studies to see how the product is used in the real world. + Good at demonstrating how people use technologies. - Expensive and difficult to conduct.

Natural Settings Involving Users: Help identify opportunities for new technologies. Help establish requirements for a new design. Facilitate the introduction of technology, or inform deployment of existing technology in a new context. Methods: observation and logging. In-the-wild studies: real and virtual environments.

Studies in the wild: ethnographic participant observation for two years (2007-2009) of an academic conference held in World of Warcraft (Bainbridge, 2010).

The types of evaluation: Any settings not involving users. Consultants critique, to predict, analyze & model aspects of the interface; analytics. Methods: inspections, heuristics, walkthroughs, models and analytics. + Cheap and quick to perform. - Can miss unpredictable usability problems and subtle aspects of the user experience.

Any Settings Not Involving Users: Inspection methods or modelling predict user behavior and identify usability problems. Heuristic evaluation (Nielsen, Tahir, 2002). Cognitive walkthrough (Wharton, Rieman, Lewis, Polson, 1994). Analytics: logging data analysis (Arikan, 2008). Models for comparing efficacy: the Keystroke-Level Model, Fitts' Law, Hick's Law.
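
To make the predictive models concrete, here is a minimal sketch in Python of how Fitts' Law and Hick's Law produce time predictions without involving any users. The coefficients a and b are illustrative placeholders, not values from the literature; in practice they must be calibrated empirically for a given input device and user population.

```python
import math

# Illustrative coefficients only: real a (intercept) and b (slope)
# must be fitted empirically for a given device and user population.

def fitts_time(distance: float, width: float, a: float = 0.1, b: float = 0.15) -> float:
    """Fitts' Law (Shannon formulation): predicted movement time, in
    seconds, to acquire a target of the given width at the given distance."""
    return a + b * math.log2(distance / width + 1)

def hicks_time(n_choices: int, a: float = 0.2, b: float = 0.15) -> float:
    """Hick's Law: predicted decision time among n equally likely choices."""
    return a + b * math.log2(n_choices + 1)

# Compare two candidate layouts analytically, before any user test:
print(fitts_time(distance=400, width=40))   # far, small target
print(fitts_time(distance=100, width=80))   # near, large target
print(hicks_time(8))                        # menu with 8 items
```

Such models are useful for quick comparisons of design alternatives (e.g. which button placement should be faster to reach), which is exactly why they fall under cheap evaluation methods that need no participants.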

Characteristics of approaches

         | Controlled environment with users | Natural environment with users | Any setting without users
Users    | do task                           | natural                        | not involved
Location | controlled                        | natural                        | anywhere
When     | prototype                         | early                          | prototype
Data     | quantitative                      | qualitative                    | problems
Feedback | measures & errors                 | descriptions                   | problems
Type     | applied                           | naturalistic                   | expert

Usability testing & field studies can complement each other.

Opportunistic evaluations are done early in the design process to provide designers with quick feedback about a design idea. Early evaluations are informal and cheap. They help developers decide whether an idea needs to be modified or abandoned.

Outline
- The types of evaluation
- Evaluation case studies
- Evaluation framework DECIDE
- Language of evaluation

Evaluation case studies: An experiment to investigate a computer game. An in-the-wild field study of skiers. Crowdsourcing.

Challenge & engagement in a collaborative immersive game: physiological measures were used. Players were more engaged when playing against another person than when playing against a computer (Mandryk, Inkpen, 2004). What precautionary measures did the evaluators take?


Challenge & engagement in a collaborative immersive game: What kind of setting was used in this experiment? How much control did the evaluators exert? Which measures were recorded, and when?

Why study skiers in the wild? Jambon et al. (2009) User experience in the wild. In: Proceedings of CHI '09, ACM Press, New York, pp. 4070-4071.

e-skiing system components. Jambon et al. (2009) User experience in the wild. In: Proceedings of CHI '09, ACM Press, New York, p. 4072.

Crowdsourcing: when might you use it?

Evaluating an ambient system: the Hello.Wall is a new kind of system, designed to explore how people react to its presence. What are the challenges of evaluating systems like this?

Evaluation methods

Method         | Controlled settings | Natural settings | Without users
Observing      | x                   | x                |
Asking users   | x                   | x                |
Asking experts |                     | x                | x
Testing        | x                   |                  |
Modeling       |                     |                  | x

Outline
- The types of evaluation
- Evaluation case studies
- Evaluation framework DECIDE
- Language of evaluation

The aims are: Introduce and explain the DECIDE framework. Discuss the conceptual, practical, and ethical issues involved in evaluation.

DECIDE: a framework to guide evaluation
- Determine the goals.
- Explore the questions.
- Choose the evaluation methods.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, analyze, interpret and present the data.
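
As an illustration only, the DECIDE checklist can be captured as a simple structure that an evaluation team fills in before running a study. This is a sketch, not part of the framework as published; all field names and the example values are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DecidePlan:
    """One field per DECIDE step; all names are illustrative."""
    goals: list[str] = field(default_factory=list)             # Determine the goals
    questions: list[str] = field(default_factory=list)         # Explore the questions
    methods: list[str] = field(default_factory=list)           # Choose the evaluation methods
    practical_issues: list[str] = field(default_factory=list)  # Identify the practical issues
    ethical_issues: list[str] = field(default_factory=list)    # Decide how to deal with ethics
    analysis_plan: str = ""                                    # Evaluate, analyze, interpret, present

# Example plan for the e-ticket study discussed on a later slide:
plan = DecidePlan(
    goals=["Find out why some customers prefer paper tickets to e-tickets"],
    questions=["What are customers' attitudes to e-tickets?",
               "Are they concerned about security?"],
    methods=["field study", "interviews"],
    practical_issues=["recruit representative customers", "stay on budget"],
    ethical_issues=["informed consent form", "anonymize personal data"],
    analysis_plan="qualitative coding of interview transcripts",
)
print(plan)
```

Writing the plan down in one place makes gaps visible: an empty field is a DECIDE step the team has not yet thought through.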

Determine the goals. What are the high-level goals of the evaluation? Who wants it and why? The goals influence the methods used for the study. Goals vary and could be to: identify the best metaphor for the design; check that user requirements are met; check for consistency; investigate how technology affects working practices; improve the usability of an existing product.

1. Determine the goals. The HutchWorld patient support system: a distributed virtual community for the Fred Hutchinson Cancer Research Center in Seattle, WA. Which metaphor?

Explore the questions. Questions help to guide the evaluation. The goal of finding out why some customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions: What are customers' attitudes to e-tickets? Are they concerned about security? Is the interface for obtaining them poor? What questions might you ask about the design of a cell phone?

What goals and questions would you set for Hello.Wall? Ambient display Hello.Wall: http://www.youtube.com/watch?v=qhna_9i8i9i&feature=playlist&p=c9f2937C5CF2DD51&index=2

Choose the evaluation approach & methods. The evaluation method influences how data is collected, analyzed and presented. E.g. field studies typically: involve observation and interviews; involve users in natural settings; do not involve controlled tests; produce qualitative data.

Identify practical issues. For example, how to:
- Select users
- Find evaluators
- Select equipment
- Stay on budget
- Stay on schedule

Decide about ethical issues. Develop an informed consent form. Participants have a right to:
- Know the goals of the study;
- Know what will happen to the findings;
- Privacy of personal information;
- Leave when they wish;
- Be treated politely.

Evaluate, interpret & present data. The methods used influence how data is evaluated, interpreted and presented. The following need to be considered:
- Reliability: can the study be replicated?
- Validity: is it measuring what you expected?
- Biases: is the process creating biases?
- Scope: can the findings be generalized?
- Ecological validity: is the environment influencing the findings? E.g. the Hawthorne effect.

Key points. Many issues need to be considered before conducting an evaluation study. These include: the goals of the study; involvement or not of users; the methods to use; practical & ethical issues; and how data will be collected, analyzed & presented. The DECIDE framework provides a useful checklist for planning an evaluation study.

Outline
- The types of evaluation
- Evaluation case studies
- Evaluation framework DECIDE
- Language of evaluation

The language of evaluation: Analytics, Analytical evaluation, Controlled experiment, Expert review or crit, Field study, Formative evaluation, Heuristic evaluation, In-the-wild evaluation, Living laboratory, Predictive evaluation, Summative evaluation, Usability laboratory, User studies, Usability testing, Users or participants

Key points. Evaluation & design are closely integrated in user-centered design. Some of the same techniques are used in evaluation as for establishing requirements, but they are used differently (e.g. observation, interviews & questionnaires). Three types of evaluation: laboratory-based with users, in the field with users, and studies that do not involve users. The main methods are: observing, asking users, asking experts, user testing, inspection, modeling users' task performance, and analytics. Dealing with constraints is an important skill for evaluators to develop.

References
Rogers, Y., Sharp, H., Preece, J. (2011) Interaction Design: Beyond Human-Computer Interaction. Wiley.
Abowd, G.D., Atkeson, C.G., Bobick, A.E., Essa, I.A., MacIntyre, B., Mynatt, E.D., Starner, T.E. (2000) Living Laboratories: The Future Computing Environments Group at the Georgia Institute of Technology. In: CHI '00 Extended Abstracts on Human Factors in Computing Systems, ACM, pp. 215-216.
Nielsen, J., Tahir, M. (2002) Homepage Usability: 50 Websites Deconstructed. New Riders Press. (available in the MIF library)
Heer, J., Bostock, M. (2010) Crowdsourcing Graphical Perception: Using Mechanical Turk to Assess Visualization Design. In: Proceedings of CHI 2010, ACM, pp. 203-212.
Bainbridge, W.S. (2010) The Warcraft Civilization: Social Science in a Virtual World. MIT Press, Cambridge, MA.

References (continued)
Cockton, G., Woolrych, A. (2008) Inspection-Based Evaluations. In: Sears, A., Jacko, J.A. (eds) The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications. CRC Press. (available in the MIF library)
Wharton, C., Rieman, J., Lewis, C., Polson, P. (1994) The Cognitive Walkthrough Method: A Practitioner's Guide. In: Usability Inspection Methods. John Wiley & Sons, New York, NY, USA.
Arikan, A. (2008) Multichannel Marketing: Metrics and Methods for On- and Offline Success. Sybex.
Mandryk, R.L., Inkpen, K.M. (2004) Physiological Indicators for the Evaluation of Co-located Collaborative Play. In: CSCW 2004, ACM Press, pp. 102-111.
Hollingsed, T., Novick, D.G. (2007) Usability Inspection Methods after 15 Years of Research and Practice. In: SIGDOC '07, ACM.

A project for you. "The Butterfly Ballot: Anatomy of a Disaster" was written by Bruce Tognazzini, and you can find it by going to AskTog.com and looking through the 2001 columns. Alternatively, go directly to: http://www.asktog.com/columns/042ButterflyBallot.html

A project for you, continued. Read Tog's account and look at the picture of the ballot card. Make a similar ballot card for a class election and ask 10 of your friends to vote using the card. After each person has voted, ask who they intended to vote for and whether the card was confusing. Note down their comments. Redesign the card and perform the same test with 10 different people. Report your findings.

A project for you. Find an evaluation study from the list of URLs on this site, or one of your own choice. Use the DECIDE framework to analyze it. Describe the aspects of DECIDE that are explicitly addressed in the report and those that are not. On a scale of 1-5, where 1 = poor and 5 = excellent, how would you rate this study?