City Research Online
City, University of London Institutional Repository

Citation: Randell, R., Mamykina, L., Fitzpatrick, G., Tang, C. & Wilson, S. (2009). Evaluating New Interactions in Healthcare: Challenges and Approaches. Paper presented at the CHI 2009 Conference on Human Factors in Computing Systems, 3-9 April 2009, Boston, MA, USA.

This is the accepted version of the paper. This version of the publication may differ from the final published version.

Permanent repository link: http://openaccess.city.ac.uk/4740/

Link to published version:

Copyright and reuse: City Research Online aims to make research outputs of City, University of London available to a wider audience. Copyright and Moral Rights remain with the author(s) and/or copyright holders. URLs from City Research Online may be freely distributed and linked to.

City Research Online: http://openaccess.city.ac.uk/
publications@city.ac.uk

Evaluating new interactions in healthcare: challenges and approaches

Rebecca Randell
Centre for HCI Design, City University London, EC1V 0HB, UK
rebecca.randell.1@city.ac.uk

Geraldine Fitzpatrick
Interact Lab, University of Sussex, Brighton, BN1 9QH, UK
g.a.fitzpatrick@sussex.ac.uk

Stephanie Wilson
Centre for HCI Design, City University London, EC1V 0HB, UK
steph@soi.city.ac.uk

Lena Mamykina
GVU Center, Georgia Institute of Technology, Atlanta, GA 30332, USA
mamykina@cc.gatech.edu

Charlotte Tang
Department of Computer Science, University of Calgary, Calgary, AB T2N 1N4, Canada
char.tang@ucalgary.ca

Abstract
New technologies for supporting the provision of healthcare are increasingly pervasive. While healthcare computing previously referred to a desktop computer within the consulting room, we are now seeing an ever broader range of software, hardware and settings. This workshop is concerned with how to conduct evaluations which allow assessment of the overall impact of technology. The workshop will explore challenges and approaches for evaluating new interactions in healthcare. In this paper we outline the goals for this workshop and summarize the issues and questions it intends to explore.

Keywords
Healthcare, evaluation

ACM Classification Keywords
H5.m. Information interfaces and presentation (e.g., HCI): Miscellaneous.

Copyright is held by the author/owner(s). CHI 2009, April 3-9, 2009, Boston, MA, USA. ACM 978-1-60558-247-4/08/04.

Introduction
New technologies for supporting the provision of healthcare are increasingly pervasive and we are now seeing an ever broader range of software, hardware and settings. While previously IT was predominantly based in the consulting room, clinicians now have

access to an increasing amount of information, including electronic patient records (EPRs), via devices such as PDAs, computers-on-wheels (COWs), and tablet PCs (e.g. [6, 10]). The mobility of these devices means that they can be accessed on wards, by the patient bedside and during ward rounds. Healthcare technologies are also making their way into patients' homes, e.g. as telecare and assistive technology packages, to enable patients to take greater control of their health, including in the management of chronic diseases such as diabetes [5].

Previous evaluations of healthcare technologies have focused on particular aspects of the technology (such as specific user interface features), on the impact on aspects of individual or group behaviour (such as the time taken to complete a task, or communication within a clinical team), or on clinical outcomes. In contrast, this workshop is concerned with how to conduct evaluations that allow assessment of the overall impact of a technology in its context of use. User acceptance is an enduring problem for the introduction of healthcare technologies, suggesting a need for evaluation techniques that allow us to demonstrate a clear benefit to potential users. The introduction of healthcare computing applications involves a number of components: technological, social and organisational. If the results of an evaluation are to inform wider implementation, it is necessary not only to know whether an application brings benefit but also to know the nature of these components and the specific context in which the application was introduced [8].

As healthcare computing increasingly moves away from the desktop, into hospital wards and patients' homes via mobile technologies, additional challenges to evaluation arise. For example, current evaluation reports on homecare technologies focus largely on clinical outcomes [3] but ignore the lived experience of the technology, its social acceptability and fit with domestic life, and its impact on community care processes [2]. While progress has been made in HCI in developing evaluation methods for such challenging settings (e.g. [7]), we are interested in how these can be interpreted for healthcare settings and incorporated into coherent evaluation methodologies that allow assessment of the overall impact of healthcare technologies. The lack of recent discussion of evaluation methodology within CHI has been noted [1]; we hope this workshop will reignite debate on this topic within the specific context of new healthcare technologies.

Workshop goals
The goals for this workshop are as follows:
- To provide an opportunity for HCI researchers to share and learn from each other's experiences of evaluating new healthcare technologies.
- To elaborate the challenges in the evaluation of new healthcare technologies.
- To understand how these issues play out in different settings, e.g., hospital and home.
- To explore how existing methods of HCI evaluation could be adapted and expanded.
- To work towards an agenda for the evaluation of new technologies in healthcare, identifying key components of the intervention to be studied, appropriate processes and outcomes to be reported, and methods for doing so.

- To develop a community of HCI practitioners to take the agenda forward.
- To draw together the discussions that emerge from the workshop and disseminate them to the HCI community through a special journal issue.

While the specific issues to be addressed in the workshop will be determined by the paper submissions, we outline below some potential issues to explore, arranged according to the broader questions of what, how, who and where.

Workshop questions

What?
The components of a healthcare technology intervention include the type of hardware, the functionality provided by the software, particular interface features, the physical configuration of the hardware, the aesthetic design of the device, the training provided, and the organisational culture. Is it necessary to explore the impact of all of these components? If not, which should take priority? What other components should we consider?

How can we understand the impact of these different components? Should we be testing multiple designs in order to, for example, understand the benefits of different interface features [11]?

Looking at the impact of the technology, how do we determine appropriate process measures and patient outcomes for systems, such as EPRs, that do not have an easily visible and quantifiable relation to patient care? Or for systems that have a clear quantifiable relation to patient care but also more subtle experiential aspects that are critical to their acceptability and success?

How?
The choice of evaluation methodology must arise from and be appropriate for the problem or research question under consideration [4]. Should we use quantitative or qualitative approaches, or a combination of the two? CHI is currently dominated by quantitative empirical evaluations [1], but new healthcare technologies may have consequences that are more subtle than expected and difficult to capture quantitatively [12].

What is the relevance of expert evaluation, and who constitutes an expert in this context?

Are there methods from other domains that could be usefully adapted for the evaluation of new healthcare technologies, e.g. from health services research?

Who?
New healthcare technologies may be designed for clinicians, for patients, or for clinicians and patients to use together. Where technologies designed for clinicians are used when interacting with patients, to what extent should we pay attention to the experience of the patient and the impact on interaction and communication?

Are there other groups of users that we also want to consider? For example, if a technology is being used in the home, the extended family is likely to be involved.

What role do we want to give users in the evaluation? Where we have multiple users and multiple interpretations of the system, how do we draw these together to provide an overall assessment [9]?

A review of CHI evaluations highlights a decrease over time in the number of subjects in quantitative empirical studies [1]. What are the challenges in recruiting participants to evaluation studies of new healthcare technologies, and how could these challenges be overcome? How do we determine an appropriate sample size for such evaluations?

Where?
Evaluating new healthcare technologies in the context of use can be difficult, but evaluation strategies that fail to do this may not succeed in gauging the true impact of the technology [13]. What are the challenges of evaluating technologies in home settings, and what approaches can we use to overcome these challenges?

What is the potential of lab-based studies for evaluating new healthcare technologies? Traditional HCI evaluation is appropriate for settings with well-known tasks and outcomes [4]; how do we develop appropriate tasks and how can we judge their success?

References
[1] Barkhuus, L. and Rode, J.A. From Mice to Men - 24 Years of Evaluation in CHI. ACM CHI '07 - AltCHI (2007).
[2] Blythe, M., Monk, A.F. and Doughty, K. Socially dependable design: The challenge of ageing populations for HCI. Interacting with Computers 17, 6 (2005), 672-689.
[3] Department of Health. Supporting self care - a practical option: Diagnostic, monitoring and assistive tools, devices, technologies and equipment to support self care, 2006.
[4] Greenberg, S. and Buxton, B. Usability evaluation considered harmful (some of the time). Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM (2008).
[5] Mamykina, L., Mynatt, E., Davidson, P. and Greenblatt, D. MAHI: investigation of social scaffolding for reflective thinking in diabetes management. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM (2008).
[6] McLoughlin, E., O'Sullivan, D., Bertolotto, M. and Wilson, D.C. MEDIC: MobilE Diagnosis for Improved Care. Proceedings of the 2006 ACM Symposium on Applied Computing, ACM (2006).
[7] Palen, L. and Salzman, M. Voice-mail diary studies for naturalistic data capture under mobile conditions. Proceedings of the 2002 ACM Conference on Computer Supported Cooperative Work, ACM (2002).
[8] Randell, R., Mitchell, N., Dowding, D., Cullum, N. and Thompson, C. Effects of computerized decision support systems on nursing performance and patient outcomes. Journal of Health Services Research and Policy 12, 4 (2007), 242-249.
[9] Sengers, P. and Gaver, B. Staying open to interpretation: engaging multiple meanings in design and evaluation. Proceedings of the 6th Conference on Designing Interactive Systems, ACM (2006).
[10] Tang, C. and Carpendale, S. Support for Informal Information Use and its Formalization in Medical Work. 21st IEEE International Symposium on Computer-Based Medical Systems (2008).
[11] Tohidi, M., Buxton, W., Baecker, R. and Sellen, A. Getting the right design and the design right. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, ACM (2006).
[12] Wilson, S., Galliers, J. and Fone, J. Not All Sharing Is Equal: The Impact of a Large Display on Small Group Collaborative Work. Proceedings of the 2006 20th Anniversary Conference on Computer Supported Cooperative Work, ACM (2006).
[13] Wilson, S., Galliers, J. and Fone, J. Cognitive Artifacts in Support of Medical Shift Handover: An In Use, In Situ Evaluation. International Journal of Human-Computer Interaction 22, 1 (2007), 59-80.