User Feedback at Scale: Spotlight on UX Research for Enterprise Systems


Proceedings of the Human Factors and Ergonomics Society 2016 Annual Meeting, 349

User Feedback at Scale: Spotlight on UX Research for Enterprise Systems

Panel Chair: Danielle Smith, PhD; Express Scripts
Speakers: Steve Fadden, PhD; Salesforce
Melissa Meingast, MA; Hewlett Packard Enterprise
Michelle Peterson, PhD; Sentier Strategic Resources
Anna Rowe, PhD; SolarWinds

A strong User Experience (UX) discipline has become a business imperative across commercial industry. Accordingly, Human Factors professionals may be part of UX teams in large organizations designing enterprise systems (business-to-business technologies that serve as corporate back-ends or enabling technologies for other products). These teams integrate user research into their design processes using methods similar to those of consumer product researchers, though the work often differs in scope and timeline. This panel aims to share best practices and lessons learned in business-to-business UX by bringing together leaders from different enterprise technology companies. Panelists will discuss (1) the logistics of gathering feedback from a limited, sometimes hard-to-reach group of users, (2) how to gather technical feedback using (sometimes) non-technical researchers, and (3) how to structure a user research team (and its deliverables) to effectively turn that feedback into business direction and product experience improvements within an enterprise technology company.

Copyright 2016 by Human Factors and Ergonomics Society.
DOI: 10.1177/1541931213601079

PANEL SUMMARY

Danielle Smith
Director of UX Research & Accessibility
Express Scripts
Austin, TX
danielle@express-scripts.com
Twitter: @danielleps0
www.linkedin.com/in/danielleps

The idea that a strong User Experience (UX) design practice is a business imperative has become a generally accepted reality in most organizations, with 93% of executives identifying improving the customer experience as a strategic priority and large firms publicly touting the dollars and talent they invest in building UX capability (Burns, 2012; Buley, 2015). With this focus on UX, internal commercial design teams have had to mature to meet the demand from their organizations. Part of this growth has included adding dedicated research staff or repurposing UX generalists to focus on user research, as the skills required to reliably gather customer feedback using various methods are considered fundamental to an effective UX practice (Buley, 2015). Human Factors Engineering (HFE) professionals may find themselves working as user researchers in such environments.

Notably, those outside the industry may not realize that UX teams exist within enterprise as well as consumer product corporations. In an effort to expose the HFE community to one facet of the work in enterprise UX, this panel brings together UX research leaders at enterprise technology organizations to discuss how they gather user feedback to inform the infrastructure of our current technology landscape. In this context, the infrastructure that these enterprises provide consists of the products and services that enable other businesses to operate. As Bailey et al. (2003) note, this organizational infrastructure is growing increasingly complex, involving numerous software and hardware components. Therefore, the individuals who manage these infrastructures have a highly technical, specialized skill set (and the UX researchers who study them need to understand those systems on some level).
Research suggests that technical users, such as Information Technology (IT) system administrators, have information and system needs, in relation to the tools they use, that differ from those of regular computer users (Haber & Bailey, 2007; Haber & Kandogan, 2007; Wixom & Todd, 2005), with attributes such as trust taking on increased importance (Velasquez et al., 2008; Takayama & Kandogan, 2006). Consequently, technology companies must adapt their efforts to understand such a specialized audience (versus a more accessible consumer audience) in order to deliver usable products and services that are often highly complex. This specialization and complexity compound the challenge for enterprise UX research teams: they must gather feedback from specialized users, on complex products used in technical contexts, while operating within large, complex organizations.

HFE professionals have long supported special populations; many of our colleagues serve the military, healthcare, and aeronautics industries. This panel aims to expand that conversation to include HFE professionals providing UX support to enterprise technology development. Given the growth of the IT industry and the associated rise of big data analytics, more and more HFE professionals are working to deliver delightful user experiences to engineers, marketers, business decision makers, and IT professionals. As such, this panel will focus on research processes aimed at specialist users such as developers, business intelligence specialists, IT administrators, managers, and architects. Panelists will discuss how they balance the speed of feedback against difficult recruits, the impact of various development processes (e.g., waterfall, agile) on their approaches, how to design research that yields the data different teams need (e.g., marketing, product), and how to collect technical feedback with a research team whose members are not domain experts.

This panel will start with a brief introduction to the operating environment of commercial UX teams (including the roles of UX, product management, marketing, and development). Each panelist will then briefly introduce themselves and their approaches to gathering user input to inform product design. The panelists will discuss how they develop and train research team members, engage participants (including CIOs, CTOs, architects, and other specialists), select tools and techniques, and build relationships within their organizations to impact enterprise product design.
SALESFORCE: USING MULTIPLE DATA SOURCES TO DELIVER AN INDUSTRY-LEADING EXPERIENCE

Steve Fadden
Principal Researcher, Analytics UX, Salesforce
Lecturer, UC Berkeley School of Information
San Francisco, CA 94105
sfadden@gmail.com
Twitter: @sfadden

Salesforce is a Fortune 500 company that provides cloud-based Customer Relationship Management (CRM) capabilities to customers around the globe. Given the robust functionality and customizability of the product, specialists such as system administrators and business analysts contribute to the implementation, configuration, and use of their CRM system and tools. The Analytics User Experience team works closely with Salesforce administrators, who configure and operate their organization's Salesforce instance, as well as with technical architects, sales operations specialists, and business analysts responsible for implementing and interpreting analytics. Steve is a Principal Researcher, serving a hybrid research and management role with the Analytics User Experience (UX) team.

To meet the needs and expectations of our diverse user population, our team engages in a continuous, comprehensive cycle of gathering formative and summative information about the current and future design of our product. First, we monitor usage data to review trends and adoption patterns of what's being used (or not used), to assess whether behaviors are consistent with expectations. This provides us with useful information about aggregate usage, but doesn't tell us why some functions are used more than others, or how the product can be enhanced. Therefore, we use different research methods to review qualitative feedback and assess future design direction for our product. Several embedded survey mechanisms provide us with direct feedback from users of our mobile and desktop products.
These surveys provide open feedback about product strengths and areas for improvement, and they allow us to track trends in usability and loyalty ratings. We also engage in studies to understand how users are implementing our product in their context of use, through remote contextual inquiry (English & Rampoldi-Hnilo, 2004) and semi-structured interviewing (Wood, 1997). Usability studies, surveys, and remote feedback sessions (Bolt & Tulathimutte, 2010) also provide valuable information about opportunities, needs, pain points, and functional priorities before, during, and after new features have been released.

To facilitate our research, we are fortunate to have internal recruiters who maintain a panel of customers who have opted in to participate in research studies, as well as active communities of current and potential customers who want to provide feedback. While some studies include participant compensation, others are conducted without compensation and yield as many (and sometimes more) responses because they are run as part of a community event. Many of our customers genuinely enjoy providing feedback to enhance the product, and we have a rich community where customers engage with product managers, user experience personnel, and each other to discuss enhancements and share best practices.

HEWLETT PACKARD ENTERPRISE: DESIGNING RESEARCH TO GET INPUT ON COMPLEX SYSTEMS

Melissa Meingast

EG Design HF & UX Research PM
Hewlett Packard Enterprise
Houston, TX 77070
Melissa.Meingast@hpe.com

Hewlett Packard Enterprise is a Fortune 500 company that offers worldwide IT, technology, and enterprise products and solutions, with industry-leading positions in servers, storage, wired and wireless networking, converged systems, software, services, and cloud. Melissa manages an HF and UX research program within the Enterprise Group, one of the three primary business units of Hewlett Packard Enterprise. As part of an enterprise design team, the HF and UX research function is supplied as an internal consulting service. The goal of this function is to interface with a variety of engineering and product teams to understand and evaluate their strategic and tactical user research needs, to design and lead research programs that meet those needs based on current scientific research as well as domain expertise, and to serve as a user advocate throughout the enterprise.

Determining individual programs' needs, and defining a program to best meet them, requires that teams have a relationship with HF based on collaboration rather than policing. Relationship building is key, as is consistent demonstration of the value that this type of research brings to a design process. Additionally, this function may be brought in to support individual teams within the enterprise that have their own internal UX and/or HF research functions, as well as to partner with other companies on joint research programs. Due to the global nature of this function, and the scale of Hewlett Packard Enterprise's product range, research spans hardware technologies such as high-density computing servers and networking equipment, data center management software, cloud computing platforms, and internal call center support tools and processes.
A range of research methodologies is employed, depending on the needs of each individual program. While many of the most commonly utilized methodologies are common to most UX research programs (moderated laboratory-based user testing, heuristic evaluations, surveys, participatory design, contextual inquiry, card sorts, and guided interviews), they have to be tailored to fit the enterprise product space. Equipment can be extremely large or have unique power requirements. Software often requires significant technical expertise to implement a test environment, and end-to-end installation processes can be time consuming. Further, research involving a skilled professional population lends itself to methodologies that might otherwise not be appropriate, allowing the participant to serve as the domain expert.

In addition to a wide range of products, a range of user types and communities participate in the research. Some of these may be informal communities that revolve around a job role, such as enterprise business system administrators, while others may be formal communities, such as those that form within open source projects like OpenStack. Mechanisms for locating and qualifying research participants, collecting data, and sharing that data back with both stakeholders and the community (and even identifying who the stakeholders are) are shaped by these differences.

RACKSPACE: BUILDING A USER RESEARCH TEAM FROM THE GROUND UP IN THE ENTERPRISE SPACE

Michelle Peterson
(Formerly: Head of UX Research, Rackspace)
Currently: VP of Research, Sentier Strategic Resources
michelle.peterson@sentier.com
https://www.linkedin.com/in/michellepeterson

Rackspace is a cloud-enabled managed hosting provider that prides itself on top-notch customer service known as Fanatical Support.
The Experience Design team is responsible for improving the experience of our website visitors, users of our products, and those who reach out to our customer service center for help. That means we have to consider the needs of CIOs, CTOs, CMOs, and others in decision-making roles for IT infrastructure, as well as the needs of developers, systems administrators, and others in hands-on roles. Michelle manages a group of UX Research specialists within the Rackspace Experience Design and Development team whose work spans all of Rackspace (internal operations and external customer experiences).

Designing products and services for a range of people with varying degrees of technical knowledge is not easy. In early 2014, the Experience Design team was just forming, and neither designers, developers, nor product managers had deep knowledge about how customers were actually using the products, what their goals were, or what features and workflows would help achieve those goals. We needed to build a user research team who could start gathering answers to these questions quickly. Over the course of about six months, we built a team of ten researchers with a variety of research skills.

We added five User Researchers because we needed ethnographic work done in order to understand the context of use for our products. We needed to understand the tasks people were doing and how they were doing them. We needed to understand users' mental models to inform the information architecture of both our website and our web-based products. We needed usability assessments to understand what was working and what wasn't, and we needed the ability to rapidly iterate designs based on quick user feedback. We also needed to do all of this in partnership with our more technically well-versed colleagues, to ensure that we could conduct sessions that got deep enough into the technical details without getting derailed by jargon and complex concepts.

In addition to the user researchers, we added two Data Scientists to our team because we needed the ability to mine the vast array of data that already existed in-house, including ticketing data, chat data, product usage data, and web analytics data. We have used this data to determine the most common problems people contact support about and to identify the most commonly used aspects of our products. We've been able to look at the words customers actually use on our forums, in tickets, and in suggestion submissions to determine whether a word or phrase is an industry standard or is Rackspace-specific jargon unlikely to be understood by our users. Product usage data has allowed us to identify customers who may be migrating to another provider, and to follow up with them to determine why.

Making sure that researchers have access to technical and high-level study participants is no small task, so we hired two Research Coordinators. They handle finding people who match our recruiting criteria and building a database of potential research participants. They track how frequently a person has been contacted and how frequently a person has participated in studies.
They also handle all the scheduling and logistics for studies and take care of compensating participants afterward. Despite the complexity of the domain and the variety of user types, being able to conduct high-quality user research quickly has allowed designers, product managers, and customer service leaders to make better decisions. Some of those decisions were big, such as the decision to completely redesign the set of tools our customer service agents use to help customers who call in or submit help tickets. Some of those decisions seemed small at the time and yet had big effects; for example, the addition of a single checkbox in a product control panel increased product adoption tenfold over the baseline. Regardless, they are all decisions that are helping Rackspace achieve its vision of being recognized as one of the world's greatest service companies.

SOLARWINDS: THE VALUE OF RELATIONSHIP BUILDING

Anna Rowe
Senior Manager, UX Research & Design
SolarWinds
Austin, TX 78735
anna.rowe@solarwinds.com
www.linkedin.com/in/annarowe1

SolarWinds builds software to manage and monitor IT infrastructure. If you work for a company with more than about five IT people, your IT team is probably using SolarWinds or a SolarWinds competitor to manage and monitor your company's network, servers, and applications. Anna leads a team of UX designers and researchers who are working to make SolarWinds products easy to understand and use. SolarWinds products solve complex problems, but unlike typical enterprise software products that are sold to executives by teams of consultants, our target audience is the hands-on practitioner. Our bottom-up sales approach requires that products sell themselves via downloadable trials that must show value within hours rather than days or weeks. Users need to be able to install and set up the software on their own, relying on the product itself rather than a team of consultants and sales engineers.
The upshot is that SolarWinds wouldn't exist without easy-to-use products. At the same time, SolarWinds is an exceptionally pragmatic company that doesn't have the patience for "Big R" research. How do we ensure that user research fits into the product development process at SolarWinds? We do it by applying a MacGyver approach, where a heavy focus on relationship building is the paper clip that holds it all together.

Relationship building is a cornerstone of the SolarWinds UX process. We work to build strong relationships within the organization and with our customers. From an organizational perspective, UX collaborates closely with product managers, developers, sales engineers, support representatives, and our internal IT team. In our loose and fast-paced organization, one of the only fixed principles is to encourage as many people from as many different disciplines as possible to observe interactions with users live, so that everyone can share what is learned and bring it back to their respective disciplines.

One of the biggest differentiators between SolarWinds and other companies that build enterprise software is the exceptionally close relationship that we work to build and maintain with our end users. Our customers use SolarWinds products daily, and we've built our UX organization from the ground up around honoring and mining their vested interest in improving the products that they live in. For example, to make UX activities provide immediate value for the end user, we turn over control of the last 5 to 10 minutes of any UX activity to the user, so that they can tell us whatever is on their mind, from a feature request to a good or bad experience with sales or support. We've also developed unique methods for collecting longitudinal feedback, such as private community spaces where users monitor and comment on a feature's progress and participate in various activities until the feature is released, and diary studies where users send us survey responses and video snippets describing their activities and reactions at key points in time.

The close relationships that we create and maintain with our users pay dividends. Because users feel valued and heard when they work with us, we can do more activities in a given product lifecycle and do them faster. For example, our recruiting incidence rate is between 25% and 100%, which means that recruiting for a six-user feature walkthrough would require us to reach out to as many as 24 customers and as few as 6. These close relationships also give us ad hoc data collection opportunities: users often contact us to show us problems, issues, or suggestions they have with how our products work in their environment. These ad hoc "show me how you use it" sessions yield rich contextual data about how products are performing in the real world. Enterprise UX research can be tough because we address complex problems with (sometimes) hard-to-find users, but focusing on relationship building can provide an efficient (and fun!) path for bringing the user into your company's product development process.

CONCLUSIONS

Gathering user input to inform the infrastructure of our modern technology landscape can be challenging.
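One recurring challenge quantified above is recruiting: SolarWinds cites an incidence rate between 25% and 100%, which bounds how many customers must be contacted to fill a study. As an illustrative sketch (the helper function is hypothetical, not a tool any panelist describes), the outreach required can be estimated as the target sample size divided by the incidence rate, rounded up:

```python
import math

def outreach_needed(target_participants: int, incidence_rate: float) -> int:
    """Estimate how many people to contact to fill a study, given the
    fraction expected to qualify and agree to participate (incidence)."""
    if not 0 < incidence_rate <= 1:
        raise ValueError("incidence_rate must be in (0, 1]")
    return math.ceil(target_participants / incidence_rate)

# The range quoted for a six-user feature walkthrough:
print(outreach_needed(6, 0.25))  # worst case -> 24 customers
print(outreach_needed(6, 1.00))  # best case  -> 6 customers
```

This reproduces the 6-to-24 range quoted in the SolarWinds discussion and makes explicit why raising incidence (through relationship building) directly reduces recruiting effort.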
This panel brings together UX leaders from various enterprise technology companies in hopes of spurring collaboration, inspiring academic research (as well as future applied researchers), and sharing best practices for addressing the inherent challenges of this work. Panelists will share their approaches to recruiting, building teams, designing studies, and delivering insights to the organization. Importantly, most of this discussion applies to anyone working within a large organization, including those serving consumer populations, as questions around building teams, designing practical yet sound research, and delivering insights are common themes. This panel is distinctive in that it will also address the unique challenges of recruiting and collecting insights from a technical user population. Further, some panelists will discuss how they use non-traditional sources (e.g., data analytics / big data, site intercept surveys) to inform research studies and Human Factors design improvements. Other panelists will discuss how they gather insight from communities of users to deliver design direction to product teams quickly. The audience will be encouraged to share their insight regarding research methods, building research teams to deliver impact at scale, and resource management.

REFERENCES

Bailey, J., Etgen, M., & Freeman, K. (2003). Situation awareness and system administration. Presented at the workshop System Administrators Are Users, Too: Designing Workspaces for Managing Internet-Scale Systems, CHI 2003, Ft. Lauderdale, FL.

Bolt, N., & Tulathimutte, T. (2010). Remote research. Brooklyn, NY: Rosenfeld Media.

Buley, L. (2015, March 23). How to modernize user experience: It's not just about usability testing anymore. Retrieved from Forrester research database.

Burns, M. (2012, April 24). The state of customer experience, 2012. Retrieved from Forrester research database.

English, J., & Rampoldi-Hnilo, L. (2004).
Remote contextual inquiry: A technique to improve enterprise software. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 48(13), 1483-1487.

Haber, E., & Bailey, J. (2007). Design guidelines for system administration tools developed through ethnographic field studies. Proceedings of the 2007 Symposium on Computer Human Interaction for the Management of Information Technology, 1-9.

Haber, E., & Kandogan, E. (2007). Security administrators in the wild: Ethnographic studies of security administrators. Presented at the workshop Security User Studies: Methodologies and Best Practices, CHI 2007, San Jose, CA.

Takayama, L., & Kandogan, E. (2006). Trust as an underlying factor of system administrator interface choice. Proceedings of the ACM CHI 2006 Conference on Human Factors in Computing Systems, 1391-1396.

Velasquez, N., Weisband, S., & Durcikova, A. (2008). Designing tools for system administrators: An empirical test of the integrated user satisfaction model. Proceedings of the 22nd Large Installation System Administration Conference (LISA '08), 1-8.

Wixom, B. H., & Todd, P. A. (2005). A theoretical integration of user satisfaction and technology acceptance. Information Systems Research, 16(1), 85-102.

Wood, L. E. (1997). Semi-structured interviewing for user-centered design. Interactions, 4(2), 48-61.