Supporting Interaction Through Haptic Feedback in Automotive User Interfaces


The boundaries between the digital and our everyday physical world are dissolving as we develop more physical ways of interacting with computing. This forum presents some of the topics discussed in the colorful multidisciplinary field of tangible and embodied interaction.
Eva Hornecker, Editor

Supporting Interaction Through Haptic Feedback in Automotive User Interfaces

Dagmar Kern, Bertrandt Ingenieurbuero GmbH, dagmar.kern@googlemail.com
Bastian Pfleging, VIS, University of Stuttgart, bastian.pfleging@vis.uni-stuttgart.de

Driving a car today is becoming much like interacting with a mobile computer in a moving environment. Due to the rapid development of sensor technologies and mobile services, the car has already become a space for media consumption and even acts as a moving office. The car today is far more than what it was in the early days: a simple means of transportation. But maneuvering a car has been, and probably always will be, a highly visual task, at least until the introduction of autonomous driving. The driver must observe the outside environment and check driving-related information such as current speed.

Therefore, when designing automotive interfaces it is essential that the interaction concept include additional senses and modalities to reduce visual demands on the driver. A prominent example is the human haptic sense. Devices for the primary driving task are designed for eyes-free interaction; nobody needs to look at the steering wheel while turning or at the accelerator pedal when increasing speed. These haptic primary driving devices have been part of cars from the very beginning and are almost indispensable.

During the first 100 years of the automobile, it was common to add a new control when introducing new functionalities. The predominant interaction concept was a one-to-one mapping in which one control operates exactly one function (see Figure 1). For example, there would be a button for turning on the headlights or a knob for adjusting the radio frequency. The advantages of physical controls are that drivers can find and use them more or less eyes-free, just by feeling, and that these controls provide direct haptic feedback. Early on, buttons for turning on the headlights even remained in their position when pushed so the driver could feel whether the headlights were on. To turn off the headlights, the driver had to pull the button. Looking at cars of the late 1990s, we can see there still were a lot of different haptic controls, such as buttons, sliders, knobs, and stalk controls, and that one-to-one mappings were still in use (for an overview of car controls, see [1]).

Figure 1. Car controls, then and now. Left: 1938 BMW 328. Right: 2011 BMW 6 Series. (Photographs from left by Dagmar Kern, Bastian Pfleging.)

In contrast, over the past decade the number of functions for infotainment, comfort, and assistive systems within cars has exploded: Current well-equipped cars require means for browsing MP3 collections and controlling complex systems such as heating, navigation, adaptive cruise control, active lane guidance, parking assistance, and night vision, to name only a few. Altogether, a modern car today can offer more than 700 functions. Now, imagine a car with more than 700 controls on its dashboard! It is quite obvious that this number of controls could not be handled by a driver, who would be able neither to reach all of them nor to remember their locations. In particular, the introduction of GPS navigation systems and advanced infotainment features created the need for a display, which paved the way to combine all functions into one central multifunctional system. The benefits of haptic control and haptic feedback in the car seem to have been pushed aside by the need to provide more comfort and infotainment functions, not to mention additional safety systems. This trend has led to a reduced number of different interaction devices but requires the driver to search through a range of menus to find a desired function.

The driver now has to divide his or her visual attention between the primary driving task and a central information display (CID); browsing its menus demands a lot of visual attention and thus distracts from the primary driving task. Of course, with the increasing number of new driver-assistance systems, such as lane-departure warning or pedestrian-detection systems, one could argue that today's cars also have eyes on the road, removing some visual load from the driver. However, from a legal perspective, the driver is still responsible for any traffic accidents. The user interface in the car should therefore be designed in the best possible way to prevent driver distraction and thus potential accidents.

Haptic Feedback in Multifunctional Car Systems

Usually there are three ways to interact with multifunctional systems in the car:

- Buttons and additional controls are arranged around or near the CID. Some of these buttons are context dependent; their meaning is shown on the display next to the button.
- A multifunctional controller is used to navigate hierarchical menu structures, which are shown on a high-resolution CID.
- Virtual buttons must be pressed directly on a touch-enabled version of the screen.

There are also combinations of these interaction concepts available on the market. What haptic feedback do these interaction concepts provide?

In the first case, a kind of haptic feedback is given by the control itself: Even if the driver has to look up the meaning of a context-dependent button mounted around the screen, the button lets the driver feel if it has been pushed and therefore communicates whether a function has been selected. When the driver knows the meaning of a context-dependent button, he or she can even select it without looking.

In the second case, the haptic feedback is more complex: Push-and-turn controllers such as the iDrive controller by BMW [2], the COMAND controller by Mercedes [3], and the MMI controller by Audi [4] are usually mounted near the driver's armrest on the center console and can be turned, pushed, and shifted in four or even eight directions to navigate through menus shown on the display. Often there are additional buttons mounted around the controller to support menu browsing. Haptic feedback is provided by force feedback when turning the controller. Thus, the driver can feel how the menu selection moves to the next item in a list; at the end of the list, the controller stops turning to indicate there are no more items. This haptic feedback enables a kind of eyes-free interaction. For example, when a driver wants to select the second-to-last song in a list, he or she can quickly navigate to the end of the list and then go back one step, without looking at the display.

Another approach is followed by the Lexus Remote Touch controller [5], which is mainly a reaction-force and force-feedback joystick that allows the user to perform pointing tasks on the screen using the WIMP metaphor known from the personal computer. The cursor movement is supported by haptic feedback so that the driver can feel the cursor entering the region of a selectable item; force feedback is used to snap the cursor to buttons. The item can be selected by clicking buttons on the joystick.
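To make the end-of-list behavior described above concrete, here is a minimal sketch of a push-and-turn controller driving a menu list. All names are illustrative and not taken from any manufacturer's API; the print() calls stand in for commands to a force-feedback actuator.

```python
# Sketch of a rotary menu controller with per-item detents and an end stop.
class RotaryMenuController:
    def __init__(self, items):
        self.items = items
        self.index = 0

    def _haptic_detent(self):
        # One short force pulse per item boundary ("click"), so the driver
        # can count steps without looking at the display.
        print("detent pulse")

    def _haptic_end_stop(self):
        # Blocking torque at the list end signals "no more items".
        print("end stop: increase resistance, ignore further rotation")

    def rotate(self, steps):
        """Positive steps = clockwise, negative = counterclockwise."""
        for _ in range(abs(steps)):
            target = self.index + (1 if steps > 0 else -1)
            if 0 <= target < len(self.items):
                self.index = target
                self._haptic_detent()
            else:
                self._haptic_end_stop()
                break
        return self.items[self.index]

    def press(self):
        """Select the currently highlighted item."""
        return self.items[self.index]


# Eyes-free example from the text: spin to the end stop, then one step back
# to reach the second-to-last song.
menu = RotaryMenuController([f"Song {i}" for i in range(1, 11)])
menu.rotate(+100)                      # spins until the end stop is felt
print("selected:", menu.rotate(-1))    # second-to-last item, no glance needed
```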
In the third case there is no haptic feedback at all in current technologies: The virtual buttons on a touch-enabled CID rely only on visual feedback on the display or on audio feedback. But there are already some projects looking into possibilities for adding haptic feedback to touch interaction.

Touch Interaction and Haptic Feedback

Depending on the position of the screen (e.g., in the middle of the dashboard), it can be cumbersome for the driver to reach and touch the central information display. Decoupled input devices like the multifunctional controllers mentioned before allow us to overcome this issue. Similarly, touch input decoupled from the display is used in the MMI touch interface [6] developed by Audi: A touchpad on the center stack close to the gearshift allows drivers to perform gestures, input characters, or execute shortcut commands (assigned to certain areas of the input device) to control the infotainment system (see Figure 2). Since this is a traditional touchpad, this device offers neither visual nor haptic feedback, which makes it difficult to implement, for example, virtual buttons.

Figure 2 (left). Audi MMI touch controller. Figure 3 (right). Adaptive control elements can dynamically change their shape to communicate information to the driver in an eyes-free manner [9].

To add vibrotactile feedback, Richter et al. created the HapTouch system, which enriched a touchscreen with force sensors and a linear actuator able to move the entire display in the z-direction [7]. When the screen is touched, the system can sense the finger position and provide tactile information. If the display is pressed harder, further interaction, such as pressing a button, can be implemented.
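A minimal sketch of this kind of pressure-dependent virtual button follows, loosely in the spirit of HapTouch's "touch vs. firm press" distinction [7]. The force thresholds and the actuator call are illustrative assumptions, not values or names from the paper.

```python
# Sketch of a pressure-sensitive on-screen button: a light touch only
# previews (soft tactile tick), a firm press activates (strong click).
PREVIEW_FORCE_N = 0.3    # assumed: finger resting on the surface
ACTIVATE_FORCE_N = 2.0   # assumed: deliberate press

def pulse_display(pattern):
    # Stand-in for the linear actuator that moves the whole display in z.
    print(f"actuator: {pattern}")

class VirtualButton:
    def __init__(self, label, x0, y0, x1, y1):
        self.label = label
        self.rect = (x0, y0, x1, y1)
        self.previewed = False

    def contains(self, x, y):
        x0, y0, x1, y1 = self.rect
        return x0 <= x <= x1 and y0 <= y <= y1

    def on_touch(self, x, y, force):
        """Process one touch sample; return the label only on a firm press."""
        if not self.contains(x, y):
            self.previewed = False
            return None
        if force >= ACTIVATE_FORCE_N:
            pulse_display("strong click")   # confirm activation
            return self.label
        if force >= PREVIEW_FORCE_N and not self.previewed:
            pulse_display("soft tick")      # finger has found the button
            self.previewed = True
        return None

play = VirtualButton("play", 100, 100, 200, 160)
print(play.on_touch(150, 130, 0.4))   # None: preview tick only
print(play.on_touch(150, 130, 2.5))   # "play": firm press activates
```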

Instead of vibrotactile feedback, Spies et al. propose using an adaptive haptic touchpad that changes its shape in the z-direction [8]. By using an adaptive, adjustable surface like the HyperBraille technology, elements on the center display can be represented on the touchpad by elevated areas. Thus, drivers can feel and even press these elevated elements, allowing them to interact with the infotainment system. In a user study, Spies et al. compared a traditional, flat touchpad with the haptic touchpad. The results show that the haptic touchpad reduces the number and duration of glances to the screen and results in less lane deviation than the traditional touchpad [8].
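As an illustration of the idea, the sketch below maps on-screen widgets to raised regions on a shape-changing touchpad and resolves a press on a raised region back to a widget. The pin-grid resolution, screen size, and all names are assumptions for the example, not specifications from [8].

```python
# Sketch: represent display widgets as elevated areas on a pin-array touchpad.
SCREEN_W, SCREEN_H = 800, 480      # assumed central display resolution
PINS_X, PINS_Y = 60, 40            # assumed pin-array resolution

def widgets_to_pin_mask(widgets):
    """Return a PINS_Y x PINS_X grid; True = raise this pin."""
    mask = [[False] * PINS_X for _ in range(PINS_Y)]
    for x0, y0, x1, y1 in (w["rect"] for w in widgets):
        # Scale the widget's screen rectangle onto the pin grid.
        px0, px1 = int(x0 / SCREEN_W * PINS_X), int(x1 / SCREEN_W * PINS_X)
        py0, py1 = int(y0 / SCREEN_H * PINS_Y), int(y1 / SCREEN_H * PINS_Y)
        for py in range(py0, min(py1 + 1, PINS_Y)):
            for px in range(px0, min(px1 + 1, PINS_X)):
                mask[py][px] = True
    return mask

def widget_under_finger(widgets, px, py):
    """Which widget does a press at pin (px, py) hit, if any?"""
    x = px / PINS_X * SCREEN_W
    y = py / PINS_Y * SCREEN_H
    for w in widgets:
        x0, y0, x1, y1 = w["rect"]
        if x0 <= x <= x1 and y0 <= y <= y1:
            return w["label"]
    return None

widgets = [{"label": "Radio", "rect": (40, 40, 240, 120)},
           {"label": "Navi",  "rect": (300, 40, 500, 120)}]
mask = widgets_to_pin_mask(widgets)          # drive the pin actuators with this
print(widget_under_finger(widgets, 10, 7))   # finger on the raised "Radio" area
```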
Even for traditional controls such as buttons or dials, first concepts propose adding more tactile feedback. Prototypes of these controls, called adaptive control elements [9], can change their shape to improve eyes-free usage in the cockpit (see Figure 3). These controls can be dynamically modified by reorienting the control or certain surfaces. Similarly, the control's geometric shape can be modified (e.g., height and width change from a circular to a square control). Finally, the surface can be modified to transmit certain information.

Tactile Automotive User Interfaces

One further option for haptic feedback in the car is provided by tactile interfaces based on vibration impulses. Such interfaces are already available, for example, integrated into the seat or steering wheel. They are part of advanced driver-assistance systems and may emit vibration impulses to inform drivers when they leave the lane unintentionally. So far, vibration output is used only to alert drivers of an event. However, a few research projects posit that vibration feedback can communicate more than just simple warning signals. The navigation context seems to be a promising application area, in which complex information such as distance to the next turn or intersection could be encoded as vibration signals. This would have the same advantage as muting the sometimes annoying audio output of the navigation system (a common practice among drivers), but would not result in missed turns. The vibration output can provide sufficient information on its own or may prompt the driver to look at the display.

Meaningful use of tactile information requires a direct connection between the actuators and the human body. Up until now, the following locations for actuators have been considered by researchers to communicate navigational instructions: the driver's seat [10], the steering wheel [11], and additional wearable devices, such as the waist belt used by Asif et al. [12].

Van Erp and van Veen [10] compared the effects of providing navigation instructions through tactile output instead of visual output on the driver's cognitive workload and performance. They developed a tactile display consisting of eight tactors mounted in the driver's seat (four under each thigh) and used an ipsilateral mapping, which means vibration under the left thigh indicates a left turn. Distance to the next waypoint was encoded in rhythm (closer temporal intervals indicate that the distance to the next waypoint is decreasing). They observed a reduction of cognitive workload when comparing the use of a tactile display with the use of a visual display, especially in high-workload conditions.

The steering wheel as a tactile output device was the focus of one of our own projects [11]. We used tactile output as part of a multimodal automotive user interface that also provided visual and audio information. We developed a steering wheel with six vibration motors: one at the top, one at the bottom, and two at each side. Similar to van Erp and van Veen's approach, vibration on the left side indicated a left turn, whereas vibration on the right side indicated a right turn. By comparing different combinations of these output modalities in a driving simulator study, we found that adding tactile information to existing audio or, especially, visual representations can improve both driving performance and driver experience. Most of our participants used the tactile information as a pointer/trigger to tell them when to attend to the other forms of information presented, thus enabling them to offload the cognitive work associated with monitoring for navigational information.

Whereas the addition of tactile output to the seat and steering wheel requires changing the car interior (the easiest way would be to add components like seat or steering wheel covers with integrated vibration motors), Asif et al. [12] developed a device worn by the driver. Their team integrated eight vibration motors into a waist belt and used this device to present spatial turn-by-turn information, including distance encoding. On a test track, they compared this approach with a conventional car navigation system and found that the tactile belt led to better orientation performance, with no significant effect on cognitive workload, driving performance, or distraction.

Summing up these results, we see that tactile feedback can be provided at various locations and allows us to communicate more information than just warnings, without negatively affecting the driver's workload and performance. We assume that tactile displays will have a large impact on further developments of automotive user interfaces.
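The encoding idea common to these systems can be summarized in a short sketch: the side of the vibration carries the turn direction (ipsilateral mapping), and the pulse rhythm carries the distance, speeding up as the waypoint gets closer. The interval values, actuator names, and clamping range below are illustrative assumptions, not parameters from the cited studies.

```python
# Sketch: encode a turn-by-turn instruction as a vibration pattern.
def encode_instruction(direction, distance_m):
    """Return (actuators to drive, inter-pulse interval in seconds)."""
    actuators = {"left":  ["seat_left", "wheel_left"],
                 "right": ["seat_right", "wheel_right"]}[direction]
    # Closer waypoint -> faster rhythm; clamp to an assumed sensible range.
    interval = min(2.0, max(0.2, distance_m / 250.0))
    return actuators, interval

def play(direction, distance_m, pulses=3):
    actuators, interval = encode_instruction(direction, distance_m)
    for _ in range(pulses):
        # Stand-in for driving the vibration motors.
        print(f"vibrate {actuators} for 0.1 s, wait {interval:.2f} s")

play("left", 500)   # slow rhythm: the turn is still far away
play("left", 50)    # rapid rhythm: the turn is imminent
```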

Conclusion

Looking at the automotive user interface and how it has changed, in particular throughout the past decade, we see a lot of changes and challenges when it comes to usable and at the same time safe interfaces. Driven by technical advances, assistance systems, and especially by consumer devices, we expect the number of functions in the car to increase even further throughout the coming years. Consequently, the hierarchical CID menus that provide access to these functions will become more and more complex. Even if controls provide haptic feedback, browsing through the menus to find a desired function will require a lot of the driver's visual attention and cause driver distraction.

A revival of the initial paradigm of one-to-one mappings can be observed in BMW's concept of functional bookmarks [13]. Shortcut buttons can be assigned to frequently used functions, such as radio stations, favorite settings, or navigation destinations. When a finger approaches a (user-defined, freely assigned) bookmark button, visual information about the function of the button is provided on the screen. Once the button is pressed, the function is activated. Functional bookmarks combine the benefit of haptic controls with personalized interaction. If the driver knows the function assignment by heart, he or she can operate these buttons eyes-free and speed up his or her interaction time enormously.
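The interaction logic of such a bookmark button is simple enough to sketch: approach triggers only a visual preview, while a press activates the assigned function. The class, callbacks, and the proximity-sensing hook below are hypothetical stand-ins, not BMW's implementation.

```python
# Sketch of a "functional bookmark" button: approach previews, press activates.
class BookmarkButton:
    def __init__(self, show_on_display):
        self.show = show_on_display   # callback that writes to the screen
        self.assignment = None        # user-defined, e.g. a radio station

    def assign(self, label, action):
        self.assignment = (label, action)

    def on_finger_approach(self):
        # Proximity sensing only reveals what the button will do.
        if self.assignment:
            self.show(f"Bookmark: {self.assignment[0]}")

    def on_press(self):
        # The actual activation works eyes-free once the mapping is memorized.
        if self.assignment:
            self.assignment[1]()

btn = BookmarkButton(show_on_display=print)
btn.assign("Radio SWR3", lambda: print("tuning to SWR3"))
btn.on_finger_approach()   # "Bookmark: Radio SWR3" shown on screen
btn.on_press()             # "tuning to SWR3"
```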
As long as the task of driving a car is not completely automated, visual attention should be diverted away from the road as little as possible. Since the audio channel is already occupied by other activities in the car, such as talking to passengers and listening to the radio, in the coming years haptic and tactile feedback in the various forms described here will receive more attention in the development process of automotive user interfaces. The simplest and already most common use of tactile feedback is to alert the driver to potential hazards or to draw his or her attention to further visual information shown on the display. The fact that more driving-related apps and personal devices that are not or are only partially integrated into the system find their way into the car raises new challenges, but also provides new opportunities for haptic feedback. Add-ons such as steering wheel or seat covers that provide haptic feedback can be offered by third-party suppliers, together with an associated app for a smartphone.

Overall, we assume the task of designing automotive user interfaces will remain challenging throughout the coming years. New functions and technologies will continuously find their way into the car and will need to be operated while driving. At the same time, driving safety needs to be maintained and even improved. Therefore, the main goal is to keep driver distraction low, for example, by reducing the need for visual attention anywhere other than the road. By using haptic or tactile feedback, an additional modality can be used to pass information to the driver. While tactile feedback has already found its way into the car for basic warnings, it will be interesting to see how additional haptic feedback will be integrated into future cars.

Endnotes

1. Kern, D. and Schmidt, A. Design space for driver-based automotive user interfaces. Proc. of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications. ACM, New York, 2009, 3-10.
2. http://www.bmw.com/com/en/insights/technology/technology_guide/articles/controller.html
3. http://www4.mercedes-benz.com/manual-cars/ba/cars/221/en/overview/comand.html
4. http://www.audi.com/com/brand/en/tools/advice/glossary/mmi.browser.html
5. https://secure.drivers.lexus.com/lexusdrivers/magazine/articles/vehicle-insider/remote-touch-system
6. http://www.audi.co.uk/new-cars/a8/a8/technology-as-standard/mmi-touch.html
7. Richter, H., Ecker, R., Deisler, C., and Butz, A. HapTouch and the 2+1 state model: Potentials of haptic feedback on touch based in-vehicle information systems. Proc. of the 2nd International Conference on Automotive User Interfaces and Interactive Vehicular Applications. ACM, New York, 2010, 72-79.
8. Spies, R., Blattner, A., Lange, C., Wohlfarter, M., Bengler, K., and Hamberger, W. Measurement of driver's distraction for an early prove of concepts in automotive industry at the example of the development of a haptic touchpad. In Human-Computer Interaction. Interaction Techniques and Environments, LNCS 6762. J. Jacko, ed. Springer, Berlin/Heidelberg, 2011, 125-132.
9. Maier, T., Schmid, M., and Aleko, P. HMI with adaptive control elements. ATZautotechnology 8 (2008-07), 50-55; http://www.atzonline.de/Artikel/3/8252/HMI-with-Adaptive-Control-Elements.html
10. Van Erp, J.B.F. and Van Veen, H.A.H.C. Vibrotactile in-vehicle navigation system. Transportation Research Part F: Traffic Psychology and Behaviour 7, 4-5 (2004), 247-256.
11. Kern, D., Marshall, P., Hornecker, E., Rogers, Y., and Schmidt, A. Enhancing navigation information with tactile output embedded into the steering wheel. Proc. of Pervasive Computing. Springer, 2009.
12. Asif, A. and Boll, S. Where to turn my car? Comparison of a tactile display and a conventional car navigation system under high load condition. Proc. of the 2nd International Conference on Automotive User Interfaces and Interactive Vehicular Applications. ACM, New York, 2010, 64-71.
13. http://www.bmw.com/com/en/insights/technology/technology_guide/articles/functional_bookmarks.html

About the Authors

Dagmar Kern works as an HMI developer at Bertrandt Ingenieurbuero GmbH in Cologne, Germany. She studied media informatics at the University of Munich. In January 2012 she received her Ph.D. from the University of Duisburg-Essen. Her research interests are in human-machine interaction in the automotive context.

Bastian Pfleging is a research assistant at the Institute for Visualization and Interactive Systems (VIS) at the University of Stuttgart. His general research interests are multimodal and natural user interfaces. In particular, he is interested in human-computer interaction in the automotive context. He received his M.Sc. in computer science from the Technical University of Dortmund.

DOI: 10.1145/2427076.2427081
© 2013 ACM 1072-5520/13/03 $15.00