Wearable Digitization of Life Science Experiments

Philipp M. Scholl, Embedded Sensing Systems, TU Darmstadt, Darmstadt, 64287, scholl@ess.tu-darmstadt.de
Kristof Van Laerhoven, Embedded Sensing Systems, TU Darmstadt, Darmstadt, 64287, krisof@ess.tu-darmstadt.de

UbiComp '14 Adjunct, September 13-17, 2014, Seattle, WA, USA (Workshop: WAHM). ACM 978-1-4503-3047-3/14/09. http://dx.doi.org/10.1145/2638728.2641719

Abstract

Experimental work in the life sciences is done in protective garments to contain harmful agents and to avoid contamination. This limits the amount of documentation that can be done during experimentation, since pen and paper and other equipment are hardly allowed in these environments. Relying on memory alone, the scientist has to reconstruct the important details of her experiment later on. Wearable computers, like Google Glass or wrist-worn smartwatches, can enhance the scientist's ability to record key information while conducting an experiment. Especially the possibility of hands-free and implicit interaction with the wearable system creates new opportunities for augmenting the scientist's memory.

Author Keywords

Life Science, Documentation, Wearable Computing, Google Glass, RFID, Assistance

ACM Classification Keywords

H.5.m [Information interfaces and presentation (e.g., HCI)]: Miscellaneous.

Introduction

Vannevar Bush's vision [3] of a scientist able to record her activities and access information while conducting experiments is becoming a non-prototypical technical possibility. In Bush's scenario, a scientist with a "walnut-sized, forehead-mounted camera ... moves about and observes, he photographs and comments. Time is automatically recorded to tie the two records together." Those records can later be reviewed, combined and further analysed on a personal computer. Seventy years later, wearable devices like Google Glass are bringing this vision to reality.

In the life sciences, and especially in laboratory work, wearable devices have great potential to augment the scientist's memory. Not only does the scientist wear protective garments (cf. Fig. 1), she is also continuously manipulating the environment with her hands. Cutting organic material, mixing fluids, calculating dilutions, taking notes, measuring quantities, labelling samples, operating machinery and so on hardly leave time for capturing the ongoing experiment digitally, especially when considering traditional touch input methods, which require the experimenter to take off her garments to avoid contamination. Devices that can be worn underneath the garments and that provide non-touch input/output modalities, or continuous sensor recordings, allow an experiment to be captured easily while it is being conducted. The question arises how such implicit and explicit interaction can supplement current work practices.

Reproducibility of experimental research in biology is a major concern. A recent report found that the results of pre-clinical cancer research [2] could be reproduced in only 6 of 54 cases. Reproducing experimental research requires the exact restoration of environmental conditions, which first of all requires them to be documented in a protocol. However, these protocols are routinely written offline, i.e. after the experiment has been conducted, and are reconstructed from memory. Important information may already have been lost or simply overlooked. Hand-written laboratory notebooks still serve as the main archival medium for experimenters [11]. These contain not only notes, but also figures and print-outs, providing a very flexible system. Compared to digitized information, however, such notes are hard to organize, search, copy and share later on. Documentation capabilities, and therefore the reproducibility of experiments, can be enhanced with wearable technology.

Figure 1: A scientist working in a laboratory with an L1 security level, taking notes on the current workflow while conducting an experiment.

In this paper we describe possible interaction designs for a combination of wearable devices in wet laboratory environments. After presenting related work on digitizing laboratory work, we describe interactions geared towards extending short-term memory (STM) during an experiment, as well as long-term memory (LTM) for archival purposes. We conclude with an outlook on future challenges.

Related Work

The necessity of computer-supported electronic notebooks [4] has been observed in several research projects before. Related work aims at providing interactive displays, recording equipment in the environment (such as augmented containers, cameras, or microphones), and introducing extra input methods in the lab (such as keypads, photo capture buttons, or barcode scanners) to fulfill this necessity.

The Labscape [1] project was an early investigation into a ubiquitous computing platform to help scientists and students access and capture information in the laboratory. It uses interactive flowchart diagrams to visualize experiment procedures and annotate ongoing procedures, accessed via a touch-tablet, barcode scanner, numeric keypad and wireless keyboard. The Combechem project proposed the Semantic Smart Laboratory [6], a system for supporting chemistry experiments focused on providing a flexible ontology for describing experiments and storing them for later retrieval. The Prism [11] project reports on a study of biologists' work practices and presents a hybrid system using hand-written notes as well as digital content to capture, visualise and interact with activity streams in the laboratory. The a-book [8] combines a tablet and a PDA to capture paper notebook writing, merging the physical and electronic information involved in biology laboratory notebooks. A system to support biologists in the field was presented in the ButterflyNet [13] project, in which handwritten notes are captured and combined with visual and audio information for later access. The eLabBench [12] and Biotisch [5] take the integration into the laboratory further by replacing the traditional workbench with a tabletop system that presents information on the bench's surface, also allowing interaction, sensing of augmented objects (e.g., racks of test tubes) and taking pictures of the whole setup with an overhead camera. The gathered information is stored in a wiki-like notebook for retrieval.

Figure 2: An example laboratory protocol page.

These systems form the basis for augmenting the memory of a scientist in the laboratory. By providing the means to design an experimental workflow beforehand, make recordings, take notes and store information about augmented objects, they support the scientist's ability to remember. In contrast, our proposed system focuses on the largely unexplored area of supporting and augmenting such laboratory tasks by means of a lightweight wearable system that requires little to no interference with the laboratory environment and its inventory, and that supports hands-free operation. We argue that this approach of augmenting the researcher instead of the laboratory has many advantages, not least the fact that users in existing laboratories can opt to keep documenting experiments with traditional methods. The concepts of wearable workflow monitoring, documentation access, and assistance have, however, been thoroughly explored by research in other domains [7, 9, 10]. Key components of the system envisioned in this paper were also inspired by the Remembrance Agent (Remem) project at MIT more than a decade ago. We are, however, focused on the specific scenario of documenting and assisting wet laboratory tasks, where contextual knowledge is well-defined, and where tasks and information are subject to the additional constraints that the lab environment poses.

Wearable Digitization of Life Science

Life science experiments can be categorized into two major themes: establishing a new protocol and reproducing an already existing protocol. When establishing a new protocol, interaction with the system will be more explicit, since an important aspect is the documentation of the experimental steps. The scientist will be more willing to invest additional effort to digitize the steps taken. This is different when reproducing an already established protocol: the more familiar the scientist is with the protocol, the less information will be queried and recorded explicitly. There are also combinations of both cases, where the scientist re-establishes only a fraction of the original protocol.

Figure 3: The prototypical RFID bracelet used for implicit identification of labelled materials/containers. A wrist-worn accelerometer logging device (called HedgeHog) is also visible.

An experiment is therefore most probably remembered as a fixed procedure of steps, performed on a set of organic/chemical materials, using machinery and tools to quantify some physical property and exposing the materials to varying physical conditions. Memory augmentation techniques should therefore revolve around these specific steps. During the Labscape [1] project these steps were categorized as follows (a sketch of one possible digital encoding follows the list):

Combination: forming a single compound from several other compounds.

Dispensing: non-selectively extracting a sub-volume of a compound.

Separation: selectively extracting a sub-volume based on a range of some physical property, e.g. by molecular mass during centrifugation.

Incubation: exposing a compound to specific, possibly varying, environmental conditions. This includes varying salinity, temperature, humidity, acidity etc.

Detection: recording physical properties of the resulting compound, by images, natural language descriptions, spectral images, optical density etc.

Labelling and Identification: naming samples or compounds in containers and identifying them.
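To make this taxonomy concrete, the following sketch shows one possible way a wearable logging system could represent recognized or manually entered steps as time-stamped records. The names (StepType, StepRecord) and fields are our own illustration under these assumptions; they are not part of Labscape or of the prototype described here.

    # Hypothetical data model for time-stamped experiment steps (a sketch,
    # not the implementation of any of the cited systems).
    from dataclasses import dataclass, field
    from datetime import datetime
    from enum import Enum, auto
    from typing import List, Optional

    class StepType(Enum):
        """The step categories identified in the Labscape project."""
        COMBINATION = auto()
        DISPENSING = auto()
        SEPARATION = auto()
        INCUBATION = auto()
        DETECTION = auto()
        LABELLING = auto()

    @dataclass
    class StepRecord:
        """One time-stamped step, serving as a memory cue for later review."""
        step: StepType
        start: datetime
        end: Optional[datetime] = None          # open until the step finishes
        materials: List[str] = field(default_factory=list)  # e.g. RFID tag IDs
        note: str = ""                          # dictated or typed annotation

    # Example: a dispensing step performed on a tagged sample.
    record = StepRecord(StepType.DISPENSING, datetime.now(),
                        materials=["sample-A12"], note="200 µl into tube 3")

A stream of such records, whether produced implicitly by activity recognition or explicitly by the scientist, is the substrate on which the memory cues discussed below could operate.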

A system consisting of a head-mounted display (HMD, Google Glass) and a smartwatch that records the wearer's motion and includes an RFID reader can support the scientist during these steps. STM can be supported through interaction with Glass, by providing details on the protocol at hand and by accessing previously recorded information. The scientist's LTM can be supported by providing different recording capabilities, indexed by various memory cues. These cues are described in the following subsections.

Wrist Motion Sequences

Laboratory work usually involves a lot of repetitive manual work. Compounds are formed by pipetting, test tubes are collected and put into different machinery for incubation, and many more actions are performed while conducting an experiment. This process can take hours to days to complete. Smartwatch-like devices (cf. Fig. 3) can record the wearer's motion with high fidelity and accuracy for long periods of time. From this motion, certain actions can be identified automatically (for example pipetting) and stored as memory cues. This would allow the scientist to query the whole process she was involved in while conducting an experiment, for example checking all recordings that were made just before or after pipetting, or jumping to the spot in a video log where containers were opened or closed. Alternatively, the scientist could access recordings from previous repetitions of the protocol through this automatic recognition, with an HMD even while she is involved in the experiment, with current information displayed based on the recognition results from wrist motion.

Labelling and Identification

One important aspect of laboratory work is the management of the different compounds used to conduct the experiment, including samples, chemicals and other materials. With a wrist-worn RFID reader and tagged containers, information on handled materials can be recorded almost automatically. Alternatively, fiducial markers and a wireless label printer can be used to achieve identification. This in turn can provide cues to navigate recordings ("when have I last worked with sample X?"). It can also provide additional information for detecting actions, increasing activity recognition accuracy; when holding a pipette, for example, the wearer is probably not pouring material from a large container. Currently, containers are labelled with hand-written mnemonics. These labels are sometimes only readable by the originating person, or, after having spent time in the freezer, are not readable at all. Also, for lack of space on the containers, they are sometimes only colour-coded and a table with additional information is created on the fly. This kind of manual LTM augmentation can also be achieved with wearables.
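To make the combination of these two cue sources concrete, the following sketch merges wrist-action cues and RFID tag reads into one time-ordered cue log and answers questions such as "when have I last worked with sample X?". The pipetting detector shown here is a deliberately naive magnitude-variance heuristic over a window of accelerometer samples; a deployed system would use a trained activity-recognition model. All names (detect_pipetting, CueLog, etc.) are our own illustration, not an existing API.

    # Sketch: merging wrist-motion cues and RFID-read cues into one queryable log.
    # The pipetting detector is a placeholder heuristic, not a validated recognizer.
    from datetime import datetime, timedelta
    from statistics import pvariance
    from typing import List, Optional, Tuple

    Sample = Tuple[datetime, float]   # (timestamp, acceleration magnitude in g)
    Cue = Tuple[datetime, str, str]   # (timestamp, cue type, detail)

    def detect_pipetting(samples: List[Sample], window: int = 50,
                         var_threshold: float = 0.05) -> List[Cue]:
        """Emit a 'pipetting?' cue whenever the magnitude variance within a
        window of wrist-acceleration samples exceeds a threshold (toy heuristic)."""
        cues = []
        for i in range(0, len(samples) - window, window):
            chunk = samples[i:i + window]
            if pvariance([m for _, m in chunk]) > var_threshold:
                cues.append((chunk[0][0], "action", "pipetting?"))
        return cues

    class CueLog:
        """Time-ordered log of memory cues from all wearable sensors."""

        def __init__(self) -> None:
            self.cues: List[Cue] = []

        def add_rfid_read(self, when: datetime, tag_id: str) -> None:
            self.cues.append((when, "rfid", tag_id))

        def extend(self, cues: List[Cue]) -> None:
            self.cues.extend(cues)
            self.cues.sort(key=lambda c: c[0])

        def last_contact(self, tag_id: str) -> Optional[datetime]:
            """'When have I last worked with sample X?'"""
            hits = [t for t, kind, detail in self.cues
                    if kind == "rfid" and detail == tag_id]
            return max(hits) if hits else None

    # Usage: one RFID read and a short synthetic motion burst.
    log = CueLog()
    t0 = datetime(2014, 9, 15, 10, 0)
    log.add_rfid_read(t0, "sample-A12")
    burst = [(t0 + timedelta(seconds=s / 25), 1.0 + 0.5 * (s % 2)) for s in range(100)]
    log.extend(detect_pipetting(burst))
    print(log.last_contact("sample-A12"))

In the envisioned system such queries would be issued hands-free and the answer shown on the HMD; the point here is only that a flat, time-ordered cue log is sufficient to support them.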

Notes/Video/Photo Recordings

Biologists work in environments where protective garments are required, both to protect themselves from possibly harmful agents and to protect the experiment from outside contamination. Recording details about the work at hand, by taking notes or making video or photo recordings, however, requires them to use their hands. In circumstances where the agents used are not harmful or contamination can be ruled out, they only need to grab a pencil or a digital camera to create this documentation. In laboratories with a security level above L1, where this behaviour is not allowed, more effort needs to be put into documenting experiments. Here, notes and the protocol workflow need to be remembered; looking up documentation or writing notes is too much of an effort, since all protective garments would need to be taken off. A system like Google Glass, which not only allows the wearer to view information but also provides a camera like the one in Bush's vision, allows scientists to record and review their experiments in a hands-free manner. The scientist is able to take pictures of the organisms and compounds she creates, recording and commenting on key observations while they happen. More radically, she could video-record the whole experiment and highlight key observations. This could augment her short- and long-term photographic memory. Cues could be provided by spoken text, by recognized wrist actions, or simply by time, allowing the scientist to review work from a few minutes, hours or days ago, or to find executions of similar actions. This could be done not only when reviewing the work later on a PC, but also during the experiment while interacting with Glass.

Future Challenges

A wearable system consisting of an HMD and a sensor-enabled smartwatch allows scientists to digitally record their experiments without any modification to the laboratory environment. Some of these recordings can happen implicitly, for example by video-recording the whole experiment. The scientist is then able to review the recordings on a PC after she has conducted the experiment, either for archival purposes or to elicit important information for repeating the experiment. The major challenges for such a system of wearables include:

User interface: the wearable system has to account for the fact that the scientist it supports is engaged in an experiment. It should therefore keep interaction times short and adapt to the current workflow as well as possible.

Information processing: the recorded data could be processed so as to automate parts of writing a laboratory protocol. Especially for review, the question arises which memory cues are essential for the scientist. Will she query for actions taken, samples and materials used, time of day, symbolic locations in the lab, or other memory cues? Can these memory cues be used to compare repetitions of experiments, perhaps highlighting differences in their execution (see the sketch at the end of this section)?

Legal issues: problems can arise in certain laboratory environments. Laboratories financed by industry, in particular, are usually bound to secrecy by their partners. This includes video and audio recordings of lab benches and of the workflow materials of experiments. The scientist is held responsible if such data is shared; this should be reflected in the technical architecture.

These challenges are not only present for a memory augmentation system in the life sciences, but in other scenarios as well. The solutions found in the context of biology laboratories might prove useful elsewhere.
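As one possible answer to the comparison question raised above, the sketch below aligns the step-type sequences of two repetitions of a protocol and reports where they diverge. It only assumes that each repetition has been reduced to an ordered list of recognized step labels (as in the cue log sketched earlier); the alignment itself is plain sequence matching from the Python standard library, not a method evaluated in this paper, and the example data is hypothetical.

    # Sketch: comparing two repetitions of a protocol by aligning their
    # recognized step sequences (hypothetical example data).
    from difflib import SequenceMatcher
    from typing import List

    def compare_repetitions(run_a: List[str], run_b: List[str]) -> List[str]:
        """Return human-readable differences between two step sequences."""
        diffs = []
        matcher = SequenceMatcher(a=run_a, b=run_b)
        for op, a0, a1, b0, b1 in matcher.get_opcodes():
            if op == "equal":
                continue
            diffs.append(f"{op}: run A {run_a[a0:a1]} vs. run B {run_b[b0:b1]}")
        return diffs

    # Example: the second run skips the incubation step.
    run_a = ["dispensing", "combination", "incubation", "detection"]
    run_b = ["dispensing", "combination", "detection", "detection"]
    for line in compare_repetitions(run_a, run_b):
        print(line)

Whether such a raw alignment is itself a useful memory cue, or merely input to a more readable protocol summary, is exactly the kind of question the review-interface challenge above raises.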

References

[1] Arnstein, L., Franza, R., Borriello, G., and Consolvo, S. Labscape: A smart environment for the cell biology laboratory. IEEE Pervasive Computing (July 2002).

[2] Begley, C. G. Raise standards for preclinical cancer research. Nature (2012), 8-10.

[3] Bush, V. As We May Think, 1945.

[4] Du, P., and Kofman, J. A. Electronic laboratory notebooks in pharmaceutical R&D: On the road to maturity. Journal of the Association for Laboratory Automation 12 (2007).

[5] Echtler, F., Häussler, M., and Klinker, G. BioTISCH: The interactive molecular biology lab bench. In CHI Extended Abstracts on Human Factors in Computing Systems, ACM (2010).

[6] Hughes, G., Mills, H., De Roure, D., Frey, J. G., Moreau, L., Schraefel, M. C., Smith, G., and Zaluska, E. The Semantic Smart Laboratory: A system for supporting the chemical eScientist. Organic & Biomolecular Chemistry (Nov. 2004).

[7] Lukowicz, P., Timm-Giel, A., Lawo, M., and Herzog, O. WearIT@work: Toward real-world industrial wearable computing. IEEE Pervasive Computing (2007).

[8] Mackay, W. E., Pothier, G., and Letondal, C. The Missing Link: Augmenting biology laboratory notebooks. In Symposium on User Interface Software and Technology, vol. 15 (2002).

[9] Nicolai, T., Sindt, T., and Witt, H. Wearable computing for aircraft maintenance: Simplifying the user interface. In Applied Wearable Computing (IFAWC) (2006).

[10] Ockerman, J., and Pritchett, A. Preliminary investigation of wearable computers for task guidance in aircraft inspection. In International Symposium on Wearable Computers (ISWC) (1998).

[11] Tabard, A., Eastmond, E., and Mackay, W. E. From individual to collaborative: The evolution of Prism, a hybrid laboratory notebook. In Computer Supported Cooperative Work, ACM (2008).

[12] Tabard, A., Ramos, J. H., and Bardram, J. The eLabBench in the wild: Supporting exploration in a molecular biology lab. In Human Factors in Computing Systems, ACM (2012).

[13] Yeh, R. B., Liao, C., Klemmer, S. R., Lee, B., Kakaradov, B., Stamberger, J., and Paepcke, A. ButterflyNet: A mobile capture and access system for field biology research. In Human Factors in Computing Systems, ACM SIGCHI (2006).