Integrated Driving Aware System in the Real-World: Sensing, Computing and Feedback

Jung Wook Park, HCI Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA, jungwoop@andrew.cmu.edu
SeungJun Kim, HCI Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA, sjunikim@cs.cmu.edu
Anind K. Dey, HCI Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA, anind@cs.cmu.edu

Abstract
We have developed an integrated driving-aware system that allows us to conduct driving user experience (UX) studies effectively. Our system senses driver and vehicle status, analyzes the collected data, and decides what feedback to provide to the driver, all within a single Android application. We also propose a graphical experiment authoring tool for planning driving routes and managing UX experimental factors. This research, grounded in real-world experiments, should have a strong positive impact on future driving-related UX studies.

Author Keywords
In-vehicle user experience; sensing framework; modality design; experimental design tool

ACM Classification Keywords
H.5.m. Information interfaces and presentation

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s).
CHI'16 Extended Abstracts, May 07-12, 2016, San Jose, CA, USA
ACM 978-1-4503-4082-3/16/05. http://dx.doi.org/10.1145/2851581.2892309

Introduction
Recently, we have seen significant advances in the area of in-vehicle information systems. However, there are still few systems that allow us to understand what drivers are doing in their cars, the contextual driving situations they face, and how to present information to them in a thoughtful manner. A common approach is to instrument a test vehicle, but this has limitations.

Drivers are required to use the test vehicle rather than their own, the instrumentation is often expensive and complex to install, and it is usually limited to data collection. Instead, we propose an Integrated Driving Aware System (IDAS) that supports inexpensive, lightweight instrumentation of any vehicle, computation and analysis of driver actions and the driving situation, and the presentation of information to drivers in response to these actions and situations. There is currently no system that supports all three of these functions in a single, simple, integrated system. In addition, to support driving experimentation, the system also needs to support experiment design and management. We therefore also propose an experiment design tool that allows us to create and manipulate a driving route, add mashup information such as advertising, roadwork, or notification messages along the route, and specify test conditions. To evaluate the interoperability and applicability of our system components, we performed real-world driving experiments on two different routes. Finally, we discuss our results and describe how the IDAS can be used in further driving-related studies.

System Requirements
To identify system requirements, we first surveyed previous studies. Felipe et al. [1] monitored vehicle status using professional acquisition devices. Lu et al. [2] analyzed driving distraction using electrocardiogram (ECG) signals. Derick et al. [3] divided driving style into typical and aggressive categories using a smartphone as a sensor platform. In our lab, Kim et al. [4] used two different wearable sensors and an on-board diagnostics (OBD) sensor to identify when a driver is susceptible to interruptions. Matt et al. [5] developed a graphical authoring tool for virtual driving experiments. Based on this survey, we identified the following four requirements for in-car sensing and experimentation:

- Sensing: integrated monitoring of vehicle and driver status
- Computing: real-time data computation
- Feedback: modality switching and combination
- Experiment Authoring: a graphical user interface to design and set up driving experiments

Integrated Driving Aware System (IDAS)
We designed the IDAS architecture to fulfill these system requirements, as shown in Figure 1.

Figure 1: System architecture of the IDAS.
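To make the integration concrete, the following is a minimal sketch, with hypothetical interface names, of how the three runtime components of Figure 1 might be wired together inside a single Android application; it is illustrative only and does not reproduce the actual IDAS code.

```java
// Illustrative only: interface and method names are assumptions, not the IDAS source.
interface SensorSource {              // Sensing: one Bluetooth or built-in data stream
    String name();
    void start(SampleListener listener);
    void stop();
}

interface SampleListener {            // hands time-stamped readings to the computing layer
    void onSample(String sensorName, long timestampMs, double[] values);
}

interface FeatureComputer {           // Computing: derives features and trigger decisions
    void accept(String sensorName, long timestampMs, double[] values);
    boolean triggerConditionMet();
}

interface FeedbackChannel {           // Feedback: visual, auditory, or haptic presentation
    void present(String message);
}
```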

Figure 2: A dashboard to visualize sensing data and check data communication before starting to drive.

Figure 3: A sensor manager to add or remove required sensors.

Figure 4: A haptic feedback pattern designer to control our custom-built haptic device, which we call Haptove.

Figure 5: Navigation view used by the driver in this study. This view acts as a basic navigator, displaying maneuvers and speaking turn-by-turn guidance. It also shows the mashup information designed and configured in our experiment authoring tool.

Sensing
The driver's motion can be determined using body-worn accelerometers; the IDAS Mobile app supports four wearable motion sensors that monitor the head, both arms, and the right leg. A physiological sensor can be connected to the IDAS Mobile app to monitor how the driver reacts to driving situations; for example, heart-rate variability (HRV) can be used to infer the driver's nervousness. A vehicle equipped with OBD can measure engine RPM, vehicle speed, and other parameters and transmit the data to the IDAS Mobile app. For location, the app uses the A-GPS and GLONASS receivers of the tablet. All sensors communicate with the IDAS Mobile app over Bluetooth. Researchers can easily extend the sensing capability of the IDAS Mobile app by adding the name and the communication-protocol parser of a new sensing device.

Computing
With the evolution of computing power on mobile platforms, it is now possible to use machine learning algorithms to infer high-level context. As a preliminary trial, we have used the IDAS to compute road curvature and right-hand acceleration deviation in real time. Additional computing parameters, feature extraction methods, and classification algorithms can be added later.
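As one example of such a real-time computation, the sketch below shows a sliding-window RMS of the right-wrist acceleration, the feature thresholded in Table 2. The window length and the removal of the 1 g gravity component are our assumptions; the paper does not specify how the deviation is computed.

```java
import java.util.ArrayDeque;
import java.util.Deque;

/**
 * Illustrative sliding-window RMS of the right-wrist acceleration deviation.
 * The window size and the gravity-removal step are assumptions, not taken
 * from the IDAS implementation.
 */
class AccelerationRms {
    private final Deque<Double> window = new ArrayDeque<>();
    private final int windowSize;

    AccelerationRms(int windowSize) { this.windowSize = windowSize; }

    /** Feed one accelerometer sample (in g) and return the current RMS deviation. */
    double update(double ax, double ay, double az) {
        double magnitude = Math.sqrt(ax * ax + ay * ay + az * az);
        double deviation = magnitude - 1.0;           // remove the static 1 g component (assumption)
        window.addLast(deviation * deviation);
        if (window.size() > windowSize) window.removeFirst();

        double sumOfSquares = 0.0;
        for (double d : window) sumOfSquares += d;
        return Math.sqrt(sumOfSquares / window.size());
    }
}
```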

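The sensor extension point described in the Sensing subsection could take a form like the following hypothetical registry, where a device is added by name together with a parser for its Bluetooth protocol; the interface and class names are illustrative, not taken from the IDAS code.

```java
import java.util.HashMap;
import java.util.Map;

/** Converts one raw Bluetooth frame from a device into named numeric readings. */
interface ProtocolParser {
    Map<String, Double> parse(byte[] frame);
}

/** Hypothetical registry mirroring the "name + protocol parser" extension point. */
class SensorRegistry {
    private final Map<String, ProtocolParser> parsers = new HashMap<>();

    /** Register a new sensing device by name together with its protocol parser. */
    void register(String sensorName, ProtocolParser parser) {
        parsers.put(sensorName, parser);
    }

    /** Route an incoming frame to the parser registered for that device. */
    Map<String, Double> dispatch(String sensorName, byte[] frame) {
        ProtocolParser parser = parsers.get(sensorName);
        if (parser == null) throw new IllegalArgumentException("Unknown sensor: " + sensorName);
        return parser.parse(frame);
    }
}
```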
Figure 6: An example of visual feedback in the IDAS Mobile app.

Figure 7: A gaming steering wheel equipped with Haptove.

Feedback
Researchers can select or combine modalities to deliver visual, auditory, or haptic information. As shown in Figure 6, the navigation view delivers additional information as a pop-up view on top of the existing map and navigation guide. Auditory feedback is presented using the Google Text-to-Speech engine. To deliver vibration feedback, we designed and developed an attachable 20-channel haptic device called Haptove. This battery-powered device is controlled through a custom Bluetooth LE characteristic and is placed on the steering wheel. As shown in Figure 4, a researcher can design haptic patterns for each driving maneuver. Before actual deployment, Haptove was evaluated on a gaming steering wheel in our lab, as shown in Figure 7.

Experiment Authoring Tool: IDAS Designer
Preparing a driving experiment starts with route planning. Once a researcher enters the starting point and destination, our authoring tool retrieves the best route using the Google Directions API. The researcher can then add mashup information using map markers and the Google Places API (Figure 8). After completing the experimental route and mashup data, the researcher can design information-trigger conditions and their modality combinations in the IDAS Designer.

Figure 8: Screenshot of the IDAS Designer.

IDAS Server
The IDAS Server was built on the Spring Framework and MongoDB to store and retrieve experiment configurations, sensing data, and participant information.
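As a sketch of what that storage layer might look like with Spring Data MongoDB, the document fields and query below are assumptions; the paper only states that the server uses Spring and MongoDB to store and retrieve experiment configuration, sensing data, and participant information.

```java
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;
import java.util.List;

/** Hypothetical document for one synchronized sensing sample. */
@Document(collection = "sensing_records")
class SensingRecord {
    @Id String id;
    String participantId;
    String sensorName;
    long timestampMs;
    double[] values;
}

/** Spring Data repository; the derived query is an illustrative example. */
interface SensingRecordRepository extends MongoRepository<SensingRecord, String> {
    List<SensingRecord> findByParticipantIdAndSensorName(String participantId, String sensorName);
}
```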

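Returning to the route-planning step of the IDAS Designer described above, retrieving the best route could be done against the public Google Directions web service roughly as follows; JSON parsing of the returned route and error handling are omitted, the API key is a placeholder, and nothing here is taken from the IDAS code itself.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

/** Minimal sketch of fetching a route as raw JSON from the Google Directions web service. */
class RoutePlanner {
    private static final String ENDPOINT = "https://maps.googleapis.com/maps/api/directions/json";

    String fetchRouteJson(String origin, String destination, String apiKey) throws Exception {
        String query = "origin=" + URLEncoder.encode(origin, "UTF-8")
                + "&destination=" + URLEncoder.encode(destination, "UTF-8")
                + "&key=" + apiKey;
        HttpURLConnection conn = (HttpURLConnection) new URL(ENDPOINT + "?" + query).openConnection();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) body.append(line);
            return body.toString();   // caller parses routes[0] from this JSON
        } finally {
            conn.disconnect();
        }
    }
}
```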
Experiments
The primary purpose of our experiments was to evaluate how effectively the proposed system can be used to design and conduct driving-monitoring experiments. We also wanted to demonstrate the feasibility of the system in real-world driving situations, so that we can offer a reliable driving experimentation tool.

We designed two different round-trip routes using the IDAS Designer. The first route included city streets, bridges, and highways leading from the CMU campus to a public parking lot in downtown Pittsburgh; on it we placed 10 advertisements, 5 roadwork warnings, and 5 breaking-news alerts. A driver traveled this 10-mile route twice, under high and medium traffic conditions. The second route was similar to the first, except that we changed the destination to a shopping mall 7 miles from CMU. On the second route, the driver covered 14 miles, again twice, under high and medium traffic conditions.

During these experiments, the driver wore battery-powered Bluetooth devices consisting of four YEI 3-Space motion sensors and one BioHarness physiological sensor. The four motion sensors were placed on the left and right wrists, the head, and the right leg of the driver. The physiological sensor was worn on the chest to measure heart rate, respiration, upper-body posture, and ECG. An on-board diagnostics (OBD) sensor was connected to the experimental vehicle through the SAE J1962 port and transmitted engine RPM and speed via Bluetooth. A-GPS and GLONASS in the Android tablet continuously monitored the location and speed of the vehicle. Table 1 summarizes the sampling frequency of each sensor. All measured data was synchronized and stored in a log file.

Table 1: Sensing frequency per sensor.
  Motion: 4~5 Hz
  Physiological: 20 Hz
  OBD: 1 Hz
  GPS: 2 Hz

In these real-world experiments, the advertising, roadwork, and breaking-news alerts placed on the designed routes were presented through a combination of three modalities: visual, auditory, and haptic. Visual and auditory feedback was delivered on the screen (Figure 6) and through the speaker of the Android tablet, and Haptove provided vibration feedback to the driver, as shown in Figure 9. We specified information-triggering thresholds as shown in Table 2, selecting one representative sensing value from each sensor: i) engine RPM from the OBD, ii) right-hand motion from the YEI 3-Space, iii) upper-body posture from the BioHarness, and iv) road curvature from GPS and the experimental route.

Table 2: Information-triggering thresholds for the first and second routes.
  Engine RPM: < 2000 rpm
  Right wrist acceleration RMS: > 0.45
  Upper body angle: < 100
  Road curvature: < 5, < 100 m

Figure 9: Driving experiments using the IDAS.

Discussion
In terms of cost, the IDAS has allowed us to cut experiment costs by about $1000. For installation time, we were able to install our system in 5 minutes, and we created a test-driving route with mashup information within 10 minutes using the IDAS Designer; design time, of course, varies with experiment complexity.

The experimental conditions presented here focus on the interoperability of the IDAS components and their applicability for UX studies. First, we established that heterogeneous sensing data can be gathered and computed on in a single application. We assumed that the sample triggering thresholds in Table 2 indicate interruptible moments during driving, since they can be interpreted as low acceleration, a free right hand, and a static driving route. Our system was able both to collect the sensing data and to compute on the data to identify the triggering conditions.
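A minimal sketch of that triggering check is shown below. The comparisons reproduce the thresholds printed in Table 2; how IDAS actually combines them (here, a simple conjunction) is our assumption.

```java
/**
 * Sketch of the information-triggering check behind Tables 2 and 3.
 * Threshold values come from Table 2; the AND-combination is assumed.
 */
class TriggerCondition {
    boolean isInterruptible(double engineRpm, double rightWristRms,
                            double upperBodyAngle, double roadCurvature) {
        return engineRpm < 2000         // Engine RPM < 2000 rpm
            && rightWristRms > 0.45     // Right-wrist acceleration RMS > 0.45
            && upperBodyAngle < 100     // Upper-body angle < 100
            && roadCurvature < 5;       // Road curvature < 5
    }
}
```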

Second, we explored how the sensing and computing components of the system can be used in UX studies, for example to understand when to change the feedback modality. When the sensed data matched the thresholds, we delivered the mashup data through the visual (V), auditory (A), and haptic (H) channels simultaneously; otherwise, we displayed the mashup data visually only, without any sound or vibration. As summarized in Table 3, the modality of the mashup-data presentation changed based on the computed sensing data.

Table 3: Experimental results of modality change by the predefined thresholds and the computed sensing data.
  No.  Route              Traffic  V+A+H  V only  Total
  1    CMU - Parking Lot  High      4      35      39
  2    CMU - Parking Lot  Medium   13      24      37
  3    CMU - Mall         High     10      30      40
  4    CMU - Mall         Medium   24      16      40
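The selection rule behind Table 3 is simple enough to state as code; the sketch below mirrors the behavior described above (all three channels when the thresholds are met, visual only otherwise), with class and method names of our own choosing.

```java
import java.util.EnumSet;

/** Feedback channels available in the IDAS Mobile app. */
enum Modality { VISUAL, AUDITORY, HAPTIC }

/** Sketch of the modality-switching rule summarized in Table 3. */
class ModalitySelector {
    EnumSet<Modality> select(boolean thresholdsMet) {
        return thresholdsMet
            ? EnumSet.allOf(Modality.class)       // V + A + H when sensing data matches the thresholds
            : EnumSet.of(Modality.VISUAL);        // V only otherwise
    }
}
```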

Conclusion and Future Work
In this project, we proposed an integrated driving-aware system that facilitates driving user experience studies. We used our sensing framework to collect data synchronously from four motion sensors, one physiological sensor, one OBD sensor, and one embedded GPS sensor, analyzed the aggregated data, and changed the feedback modality by comparing the processed data against predefined conditions in real driving situations. We believe the IDAS will have a strong impact on designing and conducting real-world driving UX studies.

There are a number of interesting directions for future driving research. One is how best to find personalized, optimal information-triggering thresholds in terms of safety and usability. In our experiments, the thresholds were configured based on an assumption; we will develop an intelligent, adaptive threshold model for each user using machine learning and evaluate the model with the IDAS. Another direction is to explore modality presentation techniques for in-vehicle information systems: beyond combining different modalities, we could also transform the graphical interface or change the length of the audio feedback. We expect many experiments on how users interact with autonomous driving cars; the IDAS can also be used to model driver behavior and to explore what the interfaces of such cars might look like in a more automated driving environment.

Acknowledgement
This project is funded in part by Carnegie Mellon University's Technologies for Safe and Efficient Transportation, the National USDOT University Transportation Center for Safety (T-SET UTC), which is sponsored by the US Department of Transportation.

References
1. Felipe E., José J., Enrique S., Alfredo G., Diego P., Jesús C., and Carlos S. 2011. Design and implementation of a portable electronic system for vehicle driver route activity measurement. Measurement 44, 2, 326-337.
2. Lu Y., Xianghong S., and Kan Z. 2011. Driving Distraction Analysis by ECG Signals: An Entropy Analysis. LNCS 6775, 258-264.
3. Johnson, D.A. and Trivedi, M.M. 2011. Driving style recognition using a smartphone as a sensor platform. 14th International IEEE Conference on Intelligent Transportation Systems, 1609-1615.
4. Kim, S., Chun, J., and Dey, A.K. 2015. Sensors Know When to Interrupt You in the Car: Detecting Driver Interruptibility Through Monitoring of Peripheral Interactions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '15), 487-496.
5. Schikore, M., Papelis, Y., and Watson, G. 2000. Advanced Tools for Graphical Authoring of Dynamic Virtual Environments at the NADS. Driving Simulation Conference (DSC).