The aims. An evaluation framework. Evaluation paradigm. User studies


An evaluation framework

The aims
- Explain key evaluation concepts & terms.
- Describe the evaluation paradigms & techniques used in interaction design.
- Discuss the conceptual, practical and ethical issues that must be considered when planning evaluations.
- Introduce the DECIDE framework.

Evaluation paradigm
Any kind of evaluation is guided, explicitly or implicitly, by a set of beliefs, which are often underpinned by theory. These beliefs, and the methods associated with them, are known as an evaluation paradigm.

User studies
User studies involve looking at how people behave in their natural environments, or in the laboratory, both with old technologies and with new ones.

Four evaluation paradigms
- quick & dirty
- usability testing
- field studies
- predictive evaluation

Quick & dirty
Quick & dirty evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked. Quick & dirty evaluations can be done at any time. The emphasis is on fast input to the design process rather than carefully documented findings.

Usability testing
Usability testing involves recording typical users' performance on typical tasks in controlled settings. Field observations may also be used. As the users perform these tasks they are watched & recorded on video and their key presses are logged. This data is used to calculate performance times, identify errors, and help explain why the users did what they did (see the sketch after this slide). User satisfaction questionnaires & interviews are used to elicit users' opinions.

Field studies
Field studies are done in natural settings. The aim is to understand what users do naturally and how technology impacts them. In product design, field studies can be used to:
- identify opportunities for new technology
- determine design requirements
- decide how best to introduce new technology
- evaluate technology in use
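To make the usability-testing data concrete, here is a minimal Python sketch (not part of the original slides) of how a time-ordered log of key presses might be reduced to the performance measures mentioned above: task completion time and error count. The event format and field names are assumptions for illustration only.

# Illustrative sketch: reducing one participant's logged task to simple measures.
# The event format (timestamp, key, is_error) is an assumption, not a standard.
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float   # seconds since the session started
    key: str           # key or action the participant pressed
    is_error: bool     # flagged by the logger or in a later coding pass

def summarize_task(events: list[Event]) -> dict:
    """Compute completion time, error count and keystroke count for one task.

    Assumes the events are in time order, as a logger would record them.
    """
    if not events:
        return {"completion_time_s": 0.0, "errors": 0, "keystrokes": 0}
    completion_time = events[-1].timestamp - events[0].timestamp
    errors = sum(1 for e in events if e.is_error)
    return {
        "completion_time_s": round(completion_time, 2),
        "errors": errors,
        "keystrokes": len(events),
    }

# Hypothetical log for a single task
log = [
    Event(0.0, "Enter", False),
    Event(4.2, "Backspace", True),   # coded as an error
    Event(9.8, "Enter", False),
]
print(summarize_task(log))  # {'completion_time_s': 9.8, 'errors': 1, 'keystrokes': 3}

In practice the coding of what counts as an error, and the interviews and satisfaction questionnaires, explain why the numbers look the way they do; the sketch only shows the mechanical reduction step.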

Predictive evaluation
Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems. Another approach involves theoretically based models. A key feature of predictive evaluation is that users need not be present, which makes it relatively quick & inexpensive.

Overview of techniques
- observing users
- asking users their opinions
- asking experts their opinions
- testing users' performance
- modeling users' task performance

DECIDE: a framework to guide evaluation
- Determine the goals the evaluation addresses.
- Explore the specific questions to be answered.
- Choose the evaluation paradigm and techniques to answer the questions.
- Identify the practical issues.
- Decide how to deal with the ethical issues.
- Evaluate, interpret and present the data.

Determine the goals
What are the high-level goals of the evaluation? Who wants it and why? The goals influence the paradigm for the study. Some examples of goals:
- Identify the best metaphor on which to base the design.
- Check to ensure that the final interface is consistent.
- Investigate how technology affects working practices.
- Improve the usability of an existing product.

Explore the questions
All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies. For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into subquestions:
- What are customers' attitudes to these new tickets?
- Are they concerned about security?
- Is the interface for obtaining them poor?

Choose the evaluation paradigm & techniques
The evaluation paradigm strongly influences the techniques used and how data is analyzed and presented. For example, field studies do not involve testing or modeling. What questions might you ask about the design of a cell phone?

Identify practical issues
For example, how to:
- select users
- stay on budget
- stay on schedule
- find evaluators
- select equipment

Decide on ethical issues
Develop an informed consent form. Participants have a right to:
- know the goals of the study
- know what will happen to the findings
- privacy of personal information
- not be quoted without their agreement
- leave when they wish
- be treated politely

Evaluate, interpret & present data
How data is analyzed & presented depends on the paradigm and techniques used. The following also need to be considered:
- Reliability: can the study be replicated?
- Validity: is it measuring what you thought?
- Biases: is the process creating biases?
- Scope: can the findings be generalized?
- Ecological validity: is the environment of the study influencing it? (e.g. the Hawthorne effect)

Pilot studies
A pilot study is a small trial run of the main study. The aim is to make sure your plan is viable. Pilot studies check:
- that you can conduct the procedure
- that interview scripts, questionnaires, experiments, etc. work appropriately
It's worth doing several to iron out problems before doing the main study. Ask colleagues if you can't spare real users.

Key points
An evaluation paradigm is an approach that is influenced by particular theories and philosophies. Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users. The DECIDE framework has six parts:
- Determine the overall goals
- Explore the questions that satisfy the goals
- Choose the paradigm and techniques
- Identify the practical issues
- Decide on the ethical issues
- Evaluate ways to analyze & present data

A project for you
Find an evaluation study from the list of URLs on this site or one of your own choice. Use the DECIDE framework to analyze it. Which paradigms are involved? Does the study report address each aspect of DECIDE? Is triangulation used? If so, which techniques? On a scale of 1-5, where 1 = poor and 5 = excellent, how would you rate this study? Do a pilot study. (A minimal sketch of one way to record such an analysis follows.)
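For the project above, a simple structured record can keep the review honest about which DECIDE parts the study report actually covers. The Python sketch below is illustrative only and not part of the slides; the field names, the example study and the example values are assumptions.

# Illustrative sketch: recording a DECIDE-based review of an evaluation study.
from dataclasses import dataclass

DECIDE_STEPS = [
    "Determine the goals",
    "Explore the questions",
    "Choose the paradigm and techniques",
    "Identify the practical issues",
    "Decide on the ethical issues",
    "Evaluate, interpret and present the data",
]

@dataclass
class DecideReview:
    study: str
    paradigms: list[str]                 # e.g. ["usability testing", "field studies"]
    steps_addressed: dict[str, bool]     # which DECIDE parts the report covers
    triangulation_techniques: list[str]  # empty if no triangulation is used
    rating: int                          # 1 = poor ... 5 = excellent

    def coverage(self) -> float:
        """Fraction of the six DECIDE parts the study report addresses."""
        return sum(self.steps_addressed.values()) / len(DECIDE_STEPS)

# Hypothetical example review
review = DecideReview(
    study="Example e-ticket kiosk evaluation",
    paradigms=["usability testing"],
    steps_addressed={step: step != "Decide on the ethical issues" for step in DECIDE_STEPS},
    triangulation_techniques=["user testing", "satisfaction questionnaire"],
    rating=4,
)
print(f"{review.study}: rating {review.rating}/5, DECIDE coverage {review.coverage():.0%}")

Recording the review this way makes the question "does the study report address each aspect of DECIDE?" explicit, and the coverage figure gives a quick way to compare several studies before assigning the 1-5 rating.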