The Evolution of User Research Methodologies in Industry


Jon Innes
Augmentum, Inc.
Suite 400, 1065 E. Hillsdale Blvd., Foster City, CA 94404, USA
jinnes@acm.org

Abstract

User research methodologies continue to evolve to meet the needs of industrial settings. Traditional summative methods continue to be refined in terms of efficiency, and formative or generative methodologies of some form have become an accepted practice in most industrial settings. More recently, an emphasis on measuring the emotional impact of designs has been in vogue. One of the challenges facing those entering industry from academic settings is that the methods appropriate for academic inquiry often fail to meet the needs of industrial settings in terms of efficiency or effectiveness. This paper explores some of the current challenges faced by product design teams and examines some of the methods that show promise in overcoming those challenges.

Copyright is held by the author/owner(s). CHI 2008, April 5-10, 2008, Florence, Italy. ACM 978-1-60558-012-8/08/04.

Keywords

Human-computer interaction; user experience; usability; evaluation methods; evolution; methodology

ACM Classification Keywords

H.5 Information interfaces and presentation; H.5.2 User interfaces; user-centered design

Introduction

The field of human-computer interaction is relatively new and rapidly evolving. ACM SIGCHI recently celebrated its 25th anniversary. While 25 years may seem like a significant span to some in the relatively young technology profession, two factors limit the maturity of the methods used in our field. First, the technology industry itself is rapidly evolving in lockstep with the nature of the products and services it produces. Traditional hardware and software development lifecycles have compressed, and new types of products and services, such as web sites and cell phones, have emerged with even shorter development lifecycles.
Second, the nature of human-computer interaction work has shifted. The majority of practitioners today work in product (or service) development rather than corporate research settings. Researchers in such settings quickly realize that the demands of these environments often render historical methods impractical, and they must adapt or create new methods that meet the demands of their situations. This paper examines some of the current trends in user research and attempts to categorize each as having increased the effectiveness of the research, the efficiency of the data collection process, or both.

Trends in usability testing

Usability testing of software and hardware is one of the oldest methods for improving user experience. It was adapted from traditional academic experimental methods to meet the needs of early industrial settings. The driving force behind these adaptations was to make the methods more suitable for usability problem detection than for scientific hypothesis testing [1]. One could argue that finding problems is equivalent to null hypothesis testing (there are no problems) in academic research, but traditional research usually has highly specific hypotheses and typically adheres to more rigorous data collection and analysis standards.

Testing is becoming more formative and iterative

Traditional usability testing (utilizing metrics), while still widely used in its original format, has further evolved into so-called discount methods such as the RITE method [2]. These methods have evolved to meet the needs of projects using shorter, more iterative development cycles, which require repeated studies on a changing design. Typically, these projects require very fast turnaround times for the collection and analysis of data; in other words, they demand higher levels of efficiency. Other forces driving the adoption of RITE-like approaches to testing include increased team participation in user-centered design activities and pressure to be more formative by providing feedback earlier in the product lifecycle. Both trends are generally positive, but issues can arise if research specialists fail to maintain enough control of the research process to ensure data quality.
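The metric-based testing discussed above ultimately reduces to a few simple computations. As a purely illustrative sketch (the task name, participant data, and function are invented for this example, not drawn from any cited study), per-task completion rate and mean time on task might be computed like this:

```python
# Illustrative sketch: classic summative usability metrics (completion
# rate and mean time on task) computed from hypothetical session data.

def summarize_task(observations):
    """observations: list of (completed: bool, seconds: float) tuples."""
    n = len(observations)
    completed_times = [t for done, t in observations if done]
    completion_rate = len(completed_times) / n
    mean_time = (sum(completed_times) / len(completed_times)
                 if completed_times else None)
    return {"n": n,
            "completion_rate": completion_rate,
            "mean_time_on_task_s": mean_time}

# Invented results for one task across eight participants; failed
# attempts are recorded at the 300-second cutoff but excluded from
# the mean time on task.
create_invoice = [(True, 94.0), (True, 121.5), (False, 300.0),
                  (True, 88.2), (True, 143.0), (False, 300.0),
                  (True, 101.7), (True, 99.4)]

summary = summarize_task(create_invoice)
print(summary)  # completion_rate of 0.75 for this invented data
```

In a RITE-style study the same bookkeeping would simply be repeated per iteration, with the expectation that the numbers improve as problems are fixed between rounds.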
Adaptations to usability testing like the RITE method, when conducted correctly, can make research more effective (improving the impact of the findings) by increasing the number of iterations possible in a development cycle. This increases the likelihood that team members will fix problems and validate the updated designs with additional tests. It would be helpful if academic programs provided better training on these new methods and helped evolve them through scientific analysis of their effectiveness in industry settings.

Technology is changing our approaches to testing

Many recent adaptations to usability testing methods have focused on improving efficiency. Some of these improvements have been made possible by technological advances, such as specialized software and hardware that assist in capturing data as both text (data logging by observers, or survey data from participants) and video, or remote collaboration software that allows tests to be run with remote participants. Low-cost eye tracking tools have spread studies of users' gaze patterns. While eye tracking is not a new methodology, advances in technology have made it more cost and time efficient to conduct studies of this type, making them practical in industrial settings.
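Observer data logging of the kind mentioned above can be as simple as timestamping coded events during a session. A minimal sketch follows; the class, event codes, and session content are hypothetical, not taken from any particular logging tool:

```python
# Minimal sketch of an observer's data-logging aid: events are recorded
# as timestamped codes relative to session start. Event codes and the
# session itself are hypothetical examples.
import time

class SessionLog:
    def __init__(self):
        self.start = time.monotonic()
        self.events = []  # (seconds_from_start, code, note)

    def log(self, code, note=""):
        self.events.append((time.monotonic() - self.start, code, note))

    def count(self, code):
        return sum(1 for _, c, _ in self.events if c == code)

log = SessionLog()
log.log("TASK_START", "task 1: create invoice")
log.log("ERROR", "clicked wrong menu")
log.log("TASK_END", "task 1 complete")
print(log.count("ERROR"))  # 1
```

Even a tool this simple makes post-session analysis faster than transcribing video, which is exactly the kind of efficiency gain described above.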

The most extreme example of how technology can make existing methods more efficient is the automation of summative usability testing [3], which enables participants to test software with limited human supervision. As technology advances, so does our ability to increase the efficiency of our methods. While this may increase adoption of existing methods, it does not necessarily increase the effectiveness of the methods themselves, but it certainly promises to make researchers more effective by allowing them to focus on activities of higher value to the organization.

More companies have adopted usability scorecards

As senior management in companies becomes better educated about the benefits of usability testing, demand for product usability scorecards is spreading. Two related factors are driving this change. First is the development of summative usability testing standards like ISO/IEC 25062:2006 [4], which have helped spread adoption of summative testing. Second is the overall business management trend of tracking metrics via methods like the Balanced Scorecard [5]. This is a positive trend because it helps executives and senior managers track investments in user experience like other business metrics, which can only make summative testing efforts more effective. It has also led researchers in industrial settings to adapt methods like magnitude estimation [6], which are appropriate for comparing the user experience of different types of products.

Measuring emotion has become a common practice

As companies become more interested in designing experiences, some are starting to recognize that traditional metrics like task completion rates do not capture consumer emotions. While we may be able to thank Don Norman for promoting the concept of emotion in design [7], it is more likely that executives are slowly adopting the concepts of experience design due to other influences, including the mass media's coverage of design.
One challenge is that researchers may find their existing user satisfaction measurement methods inadequate for this purpose. Perhaps academic research can provide some useful tools to assist in doing this in an effective and efficient manner; it seems unlikely that P300 brainwave measurement will find widespread adoption! Another issue is that few companies can afford to spend time measuring emotion when they are already struggling to do more traditional user research. When task completion rates remain very low, users are unlikely to be happy. Perhaps in these situations it is more effective to focus on problems that are more basic and easier to measure.

Trends in ethnographic research

Watching users interact with a system, or do work that could be improved by technology, is one of the oldest methods in traditional human factors work. One could argue that Frederick Taylor, an influential management theorist at the turn of the 20th century, was the first to promote such methods in American industry. Whatever their origin, these techniques clearly predate computer technology. They were among the first to be adapted to computer software and hardware in an attempt to address the shortcomings of traditional usability testing.

Use of ethnographic research for product strategy

The adoption of ethnographic research has evolved to the point that companies regularly use the method to define market and product strategies outside of product development initiatives. Some market research consulting firms have adopted ethnographic methods and created new methods that incorporate ethnography, such as the Jobs & Outcomes method [10]. Some companies have gone as far as to create dedicated groups of ethnographic researchers who conduct studies of new markets to inform corporate strategy and product planning. In 2005, the first Ethnographic Praxis in Industry Conference (EPIC) was held by the American Anthropological Association in response to the growing interest in ethnographic methods in industrial settings. This indicates that industry may have finally realized that user-centered methods can be most effective when applied before the project team has even started development. It does, however, create issues if these researchers fail to consider the needs of the teams who do product design or development, because they have never worked on product design or development themselves. Most methods have been adapted to become more practical within product development environments, but as a result of the increasing specialization of practitioners and ethnography's role in early product definition phases, ethnographic methods currently appear to be moving toward being less rather than more pragmatic. This trend may reverse if classically trained ethnographers find themselves unable to impact product design, like the early psychologists who first adapted experimental methods for use in industry.

Usage log analysis and feedback forms

Some traditional software companies have historically used beta tests to deploy so-called instrumented versions of their products, designed to track user behavior in an effort to improve usability. Before the widespread adoption of the internet, this practice was limited by the challenges of getting users to agree to it, as well as the logistics of collecting the resulting data files.
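Instrumentation data of the sort described here is typically reduced to simple frequency counts of user actions. The following sketch assumes a hypothetical tab-separated log format (timestamp, user id, command name); neither the format nor the sample lines correspond to any real product's instrumentation:

```python
# Sketch of usage-log analysis: counting how often each UI command is
# invoked across all users. The tab-separated log format and sample
# lines are hypothetical stand-ins for real instrumentation output.
from collections import Counter

def command_frequencies(log_lines):
    counts = Counter()
    for line in log_lines:
        _timestamp, _user, command = line.rstrip("\n").split("\t")
        counts[command] += 1
    return counts

sample = [
    "2008-01-15T09:12:03\tu42\tpaste",
    "2008-01-15T09:12:10\tu42\tcopy",
    "2008-01-15T09:13:55\tu07\tpaste",
]
freq = command_frequencies(sample)
print(freq.most_common(1))  # [('paste', 2)]
```

Aggregations like this, scaled up across millions of sessions, are what allow a team to see which features are actually used rather than which features users say they use.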
Help manuals often solicited user feedback via postage-paid cards or by printing email addresses in the printed materials. Response rates to these requests were typically poor. Pervasive network technologies have made modern versions of these methods far more efficient, and somewhat more effective, by making it more convenient for users to provide feedback. Once web server logs made it convenient to study users' actual interactions with interfaces, the practice started spreading; during the dot-com era it became very common for major web sites to analyze server logs and collect user feedback via online forms. The practice has now spread back to desktop application design. The most recent release of Microsoft Office (Office 2007) was based on an extensive analysis of logs of users' interactions with the interface during real-world use [11]. Many desktop applications now leverage both behavioral logging and prompts for written user feedback (while using the product), as they can assume users are online most of the time. These are just some examples of where technology has enabled us to improve the efficiency of existing methodologies.

Summary

User research methods continue to evolve, primarily through the efforts of individuals working in industry. It seems fitting that our community refines methods through user experience with the methods themselves. Several challenges exist with this approach. How do researchers in industry learn what methods are appropriate? Where do they learn about these new methods as they evolve? How do we ensure these new methods are valid? In the past, methods were often evaluated carefully by scientists in corporate research settings who had time to study the validity of new methods and publish papers on their findings at forums such as CHI. As corporate research has been deemphasized by many companies, the number of scientists studying user research methods has dropped. One hope is that academia will find ways to collaborate with industry on this type of work, including research into new methods that meet the needs of those in industry.

Acknowledgements

Thanks to Chauncey Wilson for his review and comments on drafts of this paper.

References

[1] Wixon, D., & Wilson, C. The Usability Engineering Framework. In Helander, M. (ed.), Handbook of Human-Computer Interaction. Elsevier Science: Amsterdam, The Netherlands (1997), 653-688.

[2] Medlock, M.C., Wixon, D., Terrano, M., Romero, R., Fulton, B. Using the RITE Method to Improve Products: A Definition and a Case Study. In Proc. Usability Professionals' Association (2002).

[3] Harrison, C. Using Technology to Automate Summative Usability Testing. In Righi, C. & James, J. (eds.), User-Centered Design Stories: Real-World UCD Case Studies. Morgan Kaufmann: San Francisco (2007), 497-516.

[4] ISO/IEC 25062:2006. Software Product Quality Requirements and Evaluation (SQuaRE): Common Industry Format (CIF) for Usability Test Reports. International Organization for Standardization: Geneva, Switzerland, 2006.

[5] Kaplan, R., Norton, D. Using the Balanced Scorecard as a Strategic Management System. Harvard Business Review, Jan-Feb 1996.

[6] McGee, M. Master Usability Scaling: Magnitude Estimation and Master Scaling Applied to Usability Measurement. In Proc. CHI 2004, ACM Press (2004), 335-342.

[7] Norman, D. Emotional Design: Why We Love (or Hate) Everyday Things. Basic Books: Cambridge, MA, 2004.
[8] Beyer, H., Holtzblatt, K. Contextual Design: A Customer-Centered Approach to Systems Design. Morgan Kaufmann: San Francisco, 1997.

[9] Holtzblatt, K., Wendell, J., Wood, S. Rapid Contextual Design: A How-to Guide to Key Techniques for User-Centered Design. Morgan Kaufmann: San Francisco, 2005.

[10] Ulwick, A. What Customers Want: Using Outcome-Driven Innovation to Create Breakthrough Products and Services. McGraw-Hill: New York, 2005.

[11] Harris, J. (2005). Beyond Menus and Toolbars in Microsoft Office. Presentation at BayCHI. http://www.baychi.org/calendar/20051213