An Evaluation Framework. Based on the slides available at book.com


The aims
- Explain key evaluation concepts & terms
- Describe the evaluation paradigms & techniques used in interaction design
- Discuss the conceptual, practical and ethical issues that must be considered when designing evaluations
- Introduce the DECIDE framework

Evaluation paradigm
Any kind of evaluation is guided, explicitly or implicitly, by a set of beliefs, which are often underpinned by theory. These beliefs, and the methods associated with them, are known as an evaluation paradigm.

User studies
User studies involve looking at how people behave in their natural environments or in the laboratory, both with old technologies and with new ones.

Evaluation paradigms
- Quick and dirty
- Usability testing
- Field studies
- Predictive evaluation

Quick and dirty
- "Quick and dirty" evaluation describes the common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked.
- Quick and dirty evaluations can be done at any time.
- The emphasis is on fast input to the design process rather than carefully documented findings.

Usability testing
- Usability testing involves recording typical users' performance on typical tasks in controlled settings.
- As the users perform these tasks they are watched & recorded on video, and their key presses and mouse clicks are logged.
- This data is used to calculate performance times, identify errors & help explain why users did what they did.
- User satisfaction questionnaires & interviews are used to elicit users' opinions.
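
Turning a raw interaction log into performance times and error counts can be sketched in a few lines. This is a minimal illustration, not a real logging tool: the `Event` record and its `kind` values ("start", "error", "done") are hypothetical, chosen only to show how completion time and error counts per user and task might be derived.

```python
from dataclasses import dataclass

@dataclass
class Event:
    user: str
    task: str
    timestamp: float  # seconds since session start (hypothetical format)
    kind: str         # "start", "key", "click", "error", or "done"

def task_metrics(events):
    """Compute completion time and error count per (user, task)."""
    metrics = {}
    for e in events:
        key = (e.user, e.task)
        m = metrics.setdefault(key, {"start": None, "done": None, "errors": 0})
        if e.kind == "start":
            m["start"] = e.timestamp
        elif e.kind == "done":
            m["done"] = e.timestamp
        elif e.kind == "error":
            m["errors"] += 1
    # Only tasks that were both started and completed yield a time
    return {
        key: {"time": m["done"] - m["start"], "errors": m["errors"]}
        for key, m in metrics.items()
        if m["start"] is not None and m["done"] is not None
    }
```

A session where a user starts a task, makes one error, and finishes at the 10-second mark would yield a completion time of 10 seconds and an error count of 1.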

Field studies
- Field studies are done in natural settings.
- The aim is to understand what users do naturally and how technology impacts them.
- In design, field studies can be used to:
  - Identify opportunities for new technology
  - Determine design requirements
  - Decide how best to introduce new technology
  - Evaluate technology in use

Predictive evaluation
- Experts apply their knowledge of typical users, often guided by heuristics, to predict usability problems.
- Another approach involves theoretically based models.
- A key feature of predictive evaluation is that users need not be present.
- Relatively quick and inexpensive.

Overview of techniques
- Observing users
- Asking users their opinions
- Asking experts their opinions
- Testing users' performance
- Modeling users' task performance to predict the efficacy of a user interface
IMPORTANT: some techniques are used in different ways in different evaluation paradigms.

DECIDE: a framework to guide evaluation
- Determine the goals the evaluation addresses
- Explore the specific questions to be answered
- Choose the evaluation paradigm and techniques to answer the questions
- Identify the practical issues
- Decide how to deal with the ethical issues
- Evaluate, interpret and present the data
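
Because DECIDE is essentially a checklist, an evaluation plan can be audited mechanically. The sketch below is a hypothetical illustration (the `plan_checklist` helper and the plan dictionary are not part of the framework itself): it flags which of the six steps an evaluation plan has not yet addressed.

```python
# The six DECIDE steps, keyed by their leading verb
DECIDE_STEPS = [
    ("Determine", "the goals the evaluation addresses"),
    ("Explore", "the specific questions to be answered"),
    ("Choose", "the evaluation paradigm and techniques"),
    ("Identify", "the practical issues"),
    ("Decide", "how to deal with the ethical issues"),
    ("Evaluate", "interpret and present the data"),
]

def plan_checklist(plan):
    """Return the DECIDE steps that still lack an answer in the plan."""
    return [step for step, _ in DECIDE_STEPS if not plan.get(step)]
```

For example, a plan that only states its goals and questions would still have the Choose, Identify, Decide, and Evaluate steps flagged as open.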

Determine the goals
- What are the high-level goals of the evaluation? Who wants it and why?
- The goals influence the paradigm for the study.
- Some examples of goals:
  - Identify the best metaphor on which to base the design
  - Check to ensure that the final interface is consistent
  - Investigate how technology affects working practices

Explore the questions
- All evaluations need goals & questions to guide them so time is not wasted on ill-defined studies.
- For example, the goal of finding out why many customers prefer to purchase paper airline tickets rather than e-tickets can be broken down into sub-questions:
  - What are the customers' attitudes to these new tickets?
  - Are they concerned about security?
  - Is the interface for obtaining them poor?
- What questions might you ask about the design of a cell phone?

Choose the evaluation paradigm & techniques
- The evaluation paradigm strongly influences the techniques used, and how data is analyzed and presented.
- E.g., field studies do not involve testing and modeling.

Identify practical issues
For example, how to:
- Select users
- Stay on budget
- Stay on schedule
- Find evaluators
- Select equipment

Decide on ethical issues
- Develop an informed consent form.
- Participants have a right to:
  - Know the goals of the study
  - Know what will happen with the findings
  - Privacy of personal information
  - Not be quoted without their agreement
  - Leave when they wish
  - Be treated politely
- The UVic human ethics board reviews and approves study documentation.

Evaluate, interpret and present data
- How data is analyzed and presented depends on the paradigm and techniques used.
- The following also need to be considered:
  - Reliability: can the study be replicated?
  - Validity: is it measuring what you thought?
  - Biases: is the process creating biases?
  - Scope: can the findings be generalized?
  - Ecological validity: is the environment of the study influencing it? E.g., the Hawthorne effect (or "easy midterm" effect)
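
When presenting quantitative results such as task-completion times, simple descriptive statistics are usually the starting point. A minimal sketch using only the Python standard library (the `summarize_times` helper is hypothetical; real studies would add confidence intervals or significance tests as the paradigm demands):

```python
import statistics

def summarize_times(times):
    """Descriptive statistics for a list of task-completion times (seconds)."""
    return {
        "n": len(times),
        "mean": statistics.mean(times),
        "stdev": statistics.stdev(times) if len(times) > 1 else 0.0,
        "min": min(times),
        "max": max(times),
    }
```

Reporting the spread alongside the mean matters for reliability: two designs with the same mean time but very different standard deviations would not support the same conclusions.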

Pilot studies
- A small trial run of the main study.
- The aim is to make sure your plan is viable.
- Pilot studies check:
  - That you can conduct the procedure
  - That the interview scripts, questionnaires, experiments, etc. work appropriately
- It's worth doing several to iron out problems before the main study.
- Ask colleagues if you can't spare real users.

Key points
- An evaluation paradigm is an approach that is influenced by particular theories and philosophies.
- Five categories of techniques were identified: observing users, asking users, asking experts, user testing, modeling users.
- The DECIDE framework has six parts:
  - Determine the overall goals
  - Explore the questions that satisfy the goals
  - Choose the paradigm and techniques
  - Identify the practical issues
  - Decide on the ethical issues
  - Evaluate ways to analyze and present data
- DO A PILOT STUDY