Early Take-Over Preparation in Stereoscopic 3D

Adjunct Proceedings of the 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI '18), September 23-25, 2018, Toronto, Canada.

Early Take-Over Preparation in Stereoscopic 3D

Gesa Wiegand, fortiss GmbH, wiegand@fortiss.org
Christian Mai, Christian.Mai@ifi.lmu.de
Yuanting Liu, fortiss GmbH, liu@fortiss.org
Heinrich Hußmann, hussmann@ifi.lmu.de

Abstract
Situation awareness in highly automated vehicles can help the driver get back into the loop during a take-over request (TOR). We propose presenting the driver with a detailed digital representation of situations causing a TOR via a scaled-down digital twin of the highway inside the car. The digital twin virtualizes real-time traffic information and is displayed before the actual TOR. In the car cockpit, an augmented reality headset or a stereoscopic 3D (S3D) interface can realize the augmentation. As today's hardware has technical limitations, we built an HMD-based mock-up. We conducted a user study (N=20) to assess driver behavior during a TOR. We found that workload decreases and steering performance improves significantly with the proposed system. We argue that augmenting the surrounding world inside the car improves performance during a TOR due to better awareness of the upcoming situation.

© 2018 Association for Computing Machinery. ACM ISBN 978-1-4503-5947-4/18/09. https://doi.org/10.1145/3239092.3265957

ACM Classification Keywords
H.5.m [Human-centered computing]: Visualization; H.5.m [Human computer interaction (HCI)]: Interaction paradigms.

Introduction
Highly automated driving is a key technology in current automotive research. In particular, take-over requests

Figure 1: VR cockpit with the digital twin and a representation of the ego-vehicle. The video task used in the study is placed near the A-pillar.
Figure 2: Pre-warning of an accident on the ego-vehicle's lane.
Figure 3: Take-over request with the video task turned off.

(TOR), in which the system fails and the driver needs to regain control, are the focus of current research. According to the literature [7, 6, 3], the driver needs 7 to 10 seconds to take back control of the vehicle. But even with ten seconds, the driver could be in a situation in which he needs more preparation to take over the steering task. In contrast to existing solutions that focus on the TOR itself, this paper focuses on the situation before the TOR. The project Providentia [5] sets up a sensor system on the highway to track all vehicles on the road. Those sensors detect the traffic and possible obstructions affecting the traffic flow. Vehicles and drivers get a far-reaching view of the highway for several kilometers in advance. Thus, it is possible to inform the driver early about a highway situation several kilometers away and warn him about a possible system limit that requires a take-over. The sensor information is used to reproduce the real-time traffic situation in a virtual highway, the so-called digital twin.

This paper considers a vehicle with SAE Level 3 automation [1]. This means that the car drives autonomously most of the time, but the driver needs to take over the vehicle if the autonomous system fails. The idea behind this work is that a floating 3D AR (or S3D) representation of the real traffic situation within the cockpit improves the driver's take-over performance. By visualizing the traffic situation in a bird's-eye view, the driver is able to assess the appropriate driving trajectory. The AR interface is set next to the driver in the middle of the cockpit, above the gearbox. A user study (N=20), comparing our proposed system to a system without a possibility to foresee the situation, was conducted in order to gain insights into driving performance and workload. To simulate this S3D interface and ease implementation, the study was conducted in a head-mounted-display-based driving simulator. The S3D representation of the highway (the digital twin) was displayed to the driver before the driver needed to take over the driving task (Figure 1). After informing the driver of the highway situation with the S3D view (Figure 2), the TOR itself is issued (Figure 3). The take-over time is set to 10 seconds. The driving task after the TOR is a lane-change maneuver in bad weather conditions to avoid a collision with a broken-down car.

This paper presents a novel output interface that prepares the driver for a TOR. By showing the driver an accurate traffic representation, the driver gets more information to plan the driving trajectory of the car. Our study results give a good indication that driving behavior improves due to the digital twin.

Related Work
Augmented reality within the car cockpit is widely researched. HUDs use AR elements to display velocity and navigation cues. At CES 2016 BMW presented HoloActive Touch [8], a haptic touch interface floating in mid-air. The interface is projected into the car cockpit by a mirror plate to create the effect of floating objects. So far this interface displays buttons but no real-time information as our system does. In some advanced car cockpits, a representation of the detected vehicles on nearby lanes is displayed.
Drezet et al. [2] presented an HMI concept for autonomous cars. HUDs can be used to display the vehicle's driving behavior and information about the car's sensor perception. In this work, the visualization of the sensor perception is extended to infrastructure sensors. This information is not displayed in the car-integrated interface but in an S3D interface next to the driver.

User Study
The goal of the proposed system is to increase situational awareness and thus improve planning of the driving trajectory. By displaying a representation of the highway that is as accurate as possible, the driver gets more information about possible limitations on the road.
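To make the idea of a scaled, ego-centered digital twin concrete, the following Python sketch illustrates how vehicle tracks from infrastructure sensors might be mapped into an in-cockpit model. It is a minimal illustration only: the names (VehicleTrack, build_twin), the scale factor, and the lane spacing are assumptions, not the paper's Unity/Providentia implementation.

```python
# Illustrative sketch only: hypothetical names and constants, not the authors' system.
from dataclasses import dataclass

@dataclass
class VehicleTrack:
    track_id: int
    s: float      # position along the highway in meters (from infrastructure sensors)
    lane: int     # lane index
    speed: float  # m/s

@dataclass
class TwinObject:
    track_id: int
    x: float      # longitudinal position in the scaled cockpit model (meters)
    y: float      # lateral offset derived from the lane index

SCALE = 1.0 / 2000.0   # assumed scale: 2 km of highway mapped onto ~1 m of cockpit model
LANE_WIDTH = 0.02      # assumed lateral spacing between lanes in the scaled model

def build_twin(tracks, ego_s, window=2000.0):
    """Map real-world vehicle tracks into a scaled digital twin centered on the ego-vehicle.

    Only tracks within `window` meters ahead of or behind the ego-vehicle are kept,
    mirroring the idea of viewing one sector of the highway at a time.
    """
    twin = []
    for t in tracks:
        rel = t.s - ego_s                      # ego-relative longitudinal distance
        if abs(rel) <= window:
            twin.append(TwinObject(t.track_id, rel * SCALE, t.lane * LANE_WIDTH))
    return twin

# Example: two tracked vehicles, one stationary 1.5 km ahead of the ego-vehicle.
tracks = [VehicleTrack(1, 5200.0, 0, 0.0), VehicleTrack(2, 3750.0, 1, 33.0)]
print(build_twin(tracks, ego_s=3700.0))
```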

Within the VR environment, the driver sits in the cockpit of a highly automated car. Inside the cockpit, an AR representation of the highway is displayed above the gearshift. The digital twin displays a sector of the highway the ego-vehicle is driving on. The user can move the displayed section of the digital twin forwards and backwards by pressing a button, and can re-center the digital twin on their own vehicle to focus on the situation around the ego-vehicle. The distance of the currently displayed sector to the ego-vehicle is shown as a floating digit next to the highway. The virtual environment was set up in Unity 3D. The digital twin appears before the take-over request. A possible reason for a take-over request, such as bad weather or an object on the road, is displayed in the digital twin. The user can interact with the digital twin before a take-over request to inform himself about the traffic situation on the highway. He can scroll through the whole traffic situation and change the view to get an overview of the current situation. With the distance information, the driver knows where the potentially dangerous situation is with respect to the ego-vehicle.

20 participants with a mean age of 25.3 took part in the study, 8 female and 12 male. To test the system, the participants were required to perform two driving tasks. The first driving task (Test A) serves as the baseline. The second driving task (Test B) includes the AR representation of the highway. The study is designed as a within-subject study, so every participant performs both driving tasks. To eliminate possible learning effects, the participants were divided into two groups. Both groups get the same introduction to the setup. Group 1 takes Test A first and then Test B, while Group 2 takes Test B first and then Test A. Both groups do a preliminary driving test to get used to the system. During the introduction, the AR representation of the highway and the functions of the setup are explained, followed by the take-over process.

Within the user study there are four phases:

1. Autonomous Driving: During this phase the car controls the steering. The driver concentrates on the secondary task, which in this study consists of watching a movie and counting objects within the movie.
2. Information Phase: This phase begins with a sound informing the driver that the digital twin is displayed. The driver can interact with it and thus learn about the current traffic situation. The secondary task is no longer available to the driver. This phase lasts 10 seconds.
3. Take-over Phase: A take-over request is issued when the ego-vehicle has a time to collision (TTC) of 10 seconds. A warning sound initiates this phase, and the driver is prompted to take over the steering wheel. At the end of the take-over phase, the digital twin disappears.
4. Manual Driving: After the driver has regained control of the vehicle, the driver needs to decide on the driving maneuver. To avoid a collision with the objects on the road, the participant needs to change lanes.

The participants take the following tests:

Test A: The first driving test is the baseline of the study. It consists of Phase 1 - Phase 3 - Phase 4.
Test B: During Test B the proposed system is used. In contrast to Test A, an additional phase is added:
the driver passes through Phase 1 - Phase 2 - Phase 3 - Phase 4.
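The phase sequence and trigger conditions described above can be summarized in a small state-machine sketch. The Python below is illustrative only: the 10 s thresholds come from the study description, while the assumption that the digital twin appears roughly 10 s before the TOR (i.e. around TTC = 20 s), the TTC formulation, and all names are hypothetical.

```python
# Illustrative sketch only: hypothetical state machine for the study phases.
from enum import Enum, auto

class Phase(Enum):
    AUTONOMOUS = auto()    # Phase 1: car steers, driver does the secondary task
    INFORMATION = auto()   # Phase 2: digital twin shown for 10 s (Test B only)
    TAKE_OVER = auto()     # Phase 3: TOR issued at TTC = 10 s
    MANUAL = auto()        # Phase 4: driver performs the lane change

TOR_TTC_S = 10.0           # take-over request threshold (seconds of time to collision)
INFO_PHASE_S = 10.0        # duration of the information phase

def time_to_collision(gap_m, closing_speed_mps):
    """Assumed TTC formulation: remaining gap divided by closing speed."""
    return float("inf") if closing_speed_mps <= 0 else gap_m / closing_speed_mps

def next_phase(phase, ttc_s, info_elapsed_s, driver_has_control, with_digital_twin):
    """Advance the study phase; in Test A the information phase is skipped."""
    if phase is Phase.AUTONOMOUS:
        # Assumption: the twin appears 10 s before the TOR, i.e. at roughly TTC = 20 s.
        if with_digital_twin and ttc_s <= TOR_TTC_S + INFO_PHASE_S:
            return Phase.INFORMATION
        if not with_digital_twin and ttc_s <= TOR_TTC_S:
            return Phase.TAKE_OVER
    elif phase is Phase.INFORMATION and info_elapsed_s >= INFO_PHASE_S:
        return Phase.TAKE_OVER              # TOR issued at TTC = 10 s
    elif phase is Phase.TAKE_OVER and driver_has_control:
        return Phase.MANUAL                 # driver changes lane to avoid the obstacle
    return phase

# Example: 330 m gap closed at 33 m/s gives TTC = 10 s, so the TOR would be issued.
print(time_to_collision(gap_m=330.0, closing_speed_mps=33.0))
```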

Figure 4: NASA TLX test scores.

The system is evaluated with the NASA TLX [4] questionnaire and an acceptance scale questionnaire. A paired t-test is used to evaluate the significance of the results. To evaluate the effect of the system on driving behavior, the steering wheel angle, velocity, and position of the vehicle are tracked.

Result and Discussion
Take-over behavior
The driven trajectory is compared between Test A and Test B. To assess the driven trajectory, the lateral lane position is recorded. As the driving task requires the driver to change lanes, driving in the lane without the obstruction was considered better. In Test B the lateral lane positions were generally closer to the lane without an object, and the results indicate a significant difference between Test A and Test B (p = 0.04). Additionally, the steering wheel angle during the driving maneuver is recorded. This gives hints about the steering behavior of the driver, such as uncontrolled or smoother lane changes. Although Test B has overall smaller values (mean 10.24, SD 4.82) and Test A has higher values (mean 15.86, SD 8.45), the p-value indicates no significant difference between the two tests. As a third evaluation measure, speed was taken into account. The p-value is 0.04, indicating a significant difference in speed between the two tests. The speed in Test B was lower, which indicates that the user adapted the speed earlier and more appropriately.

User Experience
The workload is evaluated using NASA TLX (see Figure 4). The temporal demand of Test B is highest, which indicates that the participants needed to spend time on the digital twin to understand the digital representation. The biggest difference between both tests is the effort value. That could be an indication that the digital twin mentally prepares the driver better for the TOR and thus lessens the effort of the TOR. The scales of mental demand, performance, effort, and the overall measure show a significant difference between the two tests. Test B has a lower score in most measures and performs better than Test A. The acceptance scale is used to analyze the acceptance of the digital twin, measured on usefulness and satisfaction subscales. Participants have a positive attitude towards the hologram, with a mean of 1.06 on the usefulness scale and 1.16 on the satisfaction scale. One participant remarked that he had a better understanding of when he needed to take over the car and of the proper point in time to do so.
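For illustration, the kind of paired comparison described above (lateral lane position, steering wheel angle, speed, or NASA TLX subscales between Test A and Test B) could be run as sketched below. The numbers are synthetic and the per-participant summary measure is hypothetical; this is not the authors' analysis code.

```python
# Illustrative sketch only: synthetic data, not the study's measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 20  # participants (within-subject design: each drives Test A and Test B)

# Per-participant summary measure, e.g. mean steering wheel angle during the lane change.
test_a = rng.normal(loc=15.9, scale=8.5, size=n)   # baseline (no digital twin)
test_b = rng.normal(loc=10.2, scale=4.8, size=n)   # with digital twin

t_stat, p_value = stats.ttest_rel(test_a, test_b)  # paired t-test across conditions
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```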
Conclusion and Future Work
This paper presents a system that prepares the driver for a take-over request by informing him of the current traffic situation and the possible reason why taking over the car becomes necessary. The system simulates a possible S3D interface within the car; a floating AR interface is currently only possible with an HMD inside a car, so in this study it was set up in VR glasses. A further study will evaluate this setup with an implementation in AR glasses. Future research will investigate whether the driver shows better driving performance because of the longer time between preparation and manual driving in Test B or because of the visualization of the traffic situation. Overall, this study shows promising results in improving the driver's take-over behavior by displaying a digital twin within the vehicle cockpit. Further research needs to be done to evaluate the improvement of situational awareness, changing output modalities, and the usefulness of the system.

References
[1] Taxonomy and definitions for terms related to on-road motor vehicle automated driving systems. Tech. rep., SAE International, January 2016.
[2] Drezet, H., and Colombel, S. 62-1: Invited paper: HMI concept for autonomous car. In SID Symposium Digest of Technical Papers, vol. 49, Wiley Online Library (2018), 815-818.
[3] Gold, C., Damböck, D., Lorenz, L., and Bengler, K. "Take over!" How long does it take to get the driver back into the loop? In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 57, SAGE Publications, Los Angeles, CA (2013), 1938-1942.
[4] Hart, S. G., and Staveland, L. E. Development of NASA-TLX (Task Load Index): Results of empirical and theoretical research. In Advances in Psychology, vol. 52. Elsevier, 1988, 139-183.
[5] Hinz, G., Büchel, M., Diehl, F., Schellmann, M., and Knoll, A. Designing a far-reaching view for highway traffic scenarios with 5G-based intelligent infrastructure. In 8. Tagung Fahrerassistenz (2017).
[6] Melcher, V., Rauh, S., Diederichs, F., Widlroither, H., and Bauer, W. Take-over requests for automated driving. Procedia Manufacturing 3 (2015), 2867-2873.
[7] Mok, B., Johns, M., Lee, K. J., Miller, D., Sirkin, D., Ive, P., and Ju, W. Emergency, automation off: Unstructured transition timing for distracted drivers of automated vehicles. In Intelligent Transportation Systems (ITSC), 2015 IEEE 18th International Conference on, IEEE (2015), 2458-2464.
[8] Rümelin, S., Gabler, T., and Bellenbaum, J. Clicks are in the air: How to support the interaction with floating objects through ultrasonic feedback. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, ACM (2017), 103-108.