Virtual Road Signs: Augmented Reality Driving Aid for Novice Drivers


Proceedings of the Human Factors and Ergonomics Society 2016 Annual Meeting

Prerana Rane (1), Hyungil Kim (2), Juan Lopez Marcano (1), and Joseph L. Gabbard (2)
(1) Department of Electrical & Computer Engineering, (2) Department of Industrial & Systems Engineering, Virginia Tech

Studies have shown that expert drivers are more sensitive to changes in the road scene than novice drivers and use the driving patterns of other cars to infer important information. A tool that may help bridge the gap between experts and novices is augmented reality (AR), which can graphically overlay onto the real world virtual information that may not otherwise be easily inferred. In this paper, we propose an AR interface that aims to improve the sensation, attention, situation awareness, and decision making of international drivers who are new to the United States (US). We present results of a preliminary study that identifies the needs of novice international drivers, as well as an AR interface design created to support these needs. Contextual inquiry and analysis techniques were used to extract the needs of novice international drivers. Based on these observations, iterative designs and a prototype were developed that merge AR and audio feedback. Lastly, the prototype was evaluated by two usability experts, who performed a heuristic walkthrough based on the principles of human information processing. The experts concluded that the interface has the potential to increase the sensation, attention, situation awareness, and decision making of novice international drivers while reducing their mental workload. Future work will include an empirical study to support the observations of the analytical evaluation presented herein.

Copyright 2016 by Human Factors and Ergonomics Society. DOI 10.1177/1541931213601401

INTRODUCTION

Creating technologies to increase the skills and safety of drivers is a very active topic of research. Such technologies offer promise because a large proportion of traffic accidents are caused by drivers who fail to perceive hazardous objects (Rickesh & Naveen Vignesh, 2011; Underwood, 2007; Wai-Tat, Gasper, & Seong-Whan, 2013; Young, Lenné, Beanland, Salmon, & Stanton, 2015). Expert drivers are more sensitive to changes in road environments than novice drivers (Underwood, 2007). For example, experts perceived and anticipated hazardous situations much better than beginners, presumably because novices do not have the same high levels of situation awareness, which prevents them from understanding the complexity of the road ahead of them. Understanding the complexity of situations is even more difficult for foreign drivers. Dissanayake & Lu (2001) describe the problems that foreign drivers, both experts and beginners, had in understanding road signs. In their study, 69% of the drivers were experts and 31% were beginners, and none could understand the meaning of the Divided Highway Ends sign. Alarmingly, only 60% of the drivers in the study reacted appropriately to yellow lights.

Aside from situation awareness and mental models, human limitations also affect drivers. Drivers need to pay attention to the roadway, sense the most relevant environmental elements, perceive required information, comprehend its meaning, and predict its status in the near future in order to decide on appropriate responses and react to environmental changes.
However, the limited capability of human vision (field of view and depth of field) does not allow drivers to easily access all required information. Augmented reality (AR) integrates the display with reality using conformal graphics to guide the driver's attention to the most relevant environmental elements, making it a good candidate to address these issues. In AR, graphics are overlaid atop real environments in real time (conformally). While AR can be used for entertainment (e.g., interactive games), it has also been found to be an effective visualization technique for displaying information that users cannot directly detect with their own senses. Thus, well-designed AR interfaces promise to enhance users' perceptions by amplifying their intelligence and skills (Azuma, 1997).

In surface transportation, AR head-up displays (HUDs) can guide drivers' attention to relevant environmental elements. HUDs can support drivers' attention by cueing important elements such as objects that are difficult to see (e.g., in low-visibility settings), occluded objects, objects outside the driver's field of view, and, of course, additional information associated with objects in view. In a simulation study, Charissis & Papanastasiou (2010) found that virtual representations of vehicles, lane edges, and driving directions resulted in fewer collisions under limited visibility conditions (e.g., fog) with sudden traffic congestion. Yasuda & Ohama (2012) enhanced drivers' attention by using an x-ray vision metaphor at blind corners to reduce crossing collisions. Kim et al. examined how AR HUDs can inform drivers of vehicles in their blind spot (Kim, Wu, Gabbard, & Polys, 2013) and of the dynamics of cross traffic (Kim et al., 2016).

The study presented herein explored how AR HUDs can be used to train international drivers who are new to the US, which, to the best of our knowledge, has not been investigated before. In fact, most of the work in this area of AR has come from artificial intelligence rather than HCI or human factors. Much of the work to date has examined whether participants can recognize AR objects, not to what degree AR HUD interfaces can help increase the skills of novice drivers. We examine this issue by creating an interface that combines virtual road signs and audio feedback. This interface was iteratively evolved using a user experience (UX) lifecycle in order to extract the problems that are most likely to affect drivers, and then to design and prototype an interface that can potentially increase drivers' skills and safety. This preliminary study was evaluated by two usability experts, who concluded that the AR HUD interface can potentially increase drivers' sensation, attention, situation awareness, and decision making.

In addition to contributing to the knowledge on human information processing in driving scenarios, our findings can be used to inform other researchers and practitioners about UX processes used to analyze, design, and evaluate AR HUD interfaces.

METHOD

We realized the AR HUD interface using the UX lifecycle proposed by Hartson & Pyla (2012), which consists of analysis, design, prototyping, and evaluation phases. Through our study, we sought to answer the following research questions: (1) What difficulties may novice international drivers face while driving in the US? (2) What design elements in an AR HUD interface would best help resolve these difficulties?

Prior to the formal requirements analysis phase, we listed the difficulties that we thought novice international drivers faced (note that some of the authors have direct and recent experience as novice international drivers in new environments). We assumed that novice drivers in new areas may be apprehensive about driving due to the distances needed to travel, time behind the wheel, mental state, or complexity of the locale; may not know the meaning of all road signs; may not be aware of how lane-changing rules work in the US; may not know the speed limits on different roads; and may avoid unfamiliar areas. Also, some international drivers are accustomed to driving on the left-hand side of the road, and may not understand the language the signs are written in or the units used. Contextual inquiry processes were used to validate these assumptions.

Requirements Analysis

Contextual inquiry with novice international drivers was performed in the high-fidelity simulator (Figure 1) using pre-recorded driving footage. The video was 15 minutes long and was shown to five international drivers new to the US. As participants watched the video and emulated driving, we observed their behavior. An observer, who sat behind the passenger seat, noted the participants' attention management, steering, pedal, and turn-signal usage. To further measure pedal usage, we timestamped all the scenarios at specific moments when drivers should apply the brakes. Using the timestamps as reference points, a rating system was developed to assess brake usage: participants who used the brakes at the predetermined timestamp received a neutral score, participants who used the brakes before the timestamp received a positive score, and participants who used the brakes after the timestamp received a negative score (this scoring rule is sketched below). An interviewer, who sat in the passenger seat, asked the participant questions related to events in the video, such as pedestrians crossing the road, approaching cyclists, and road signs. For instance, for the event in which the driver approaches a cyclist, the interviewer asked the driver whether or not they saw the cyclist and how they felt. In addition, the interviewer encouraged participants to think aloud and express their thoughts, goals, rationale, and frustrations at any time.

After gathering the data, a contextual analysis was performed. We made an affinity diagram using activity notes derived from participants' responses and additional comments. Activity notes were classified into different groups and then used to guide extraction of participants' needs and requirements, as suggested by Beyer & Holtzblatt (1997). Participants' needs and requirements were also used to assess the validity of our assumptions.
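The brake-usage rating described above amounts to a simple scoring rule. The following is a minimal sketch of that rule, assuming one reference timestamp per braking scenario; the tolerance window, data format, and function names are our own illustrative choices, not details from the study.

# Minimal sketch of the brake-usage scoring rule described above.
# Timestamps are seconds into the pre-recorded video; the 1.0 s tolerance
# window is an assumption for illustration, not a value from the study.

def score_brake_event(expected_t: float, actual_t: float, tolerance: float = 1.0) -> int:
    """+1 if the driver braked before the reference timestamp,
    0 if within the tolerance window (on time), -1 if after it."""
    if abs(actual_t - expected_t) <= tolerance:
        return 0
    return 1 if actual_t < expected_t else -1

def total_brake_score(events):
    """Sum scores over (reference, observed) brake-onset pairs for one participant."""
    return sum(score_brake_event(expected, actual) for expected, actual in events)

if __name__ == "__main__":
    # hypothetical data: (reference timestamp, observed brake onset)
    participant = [(42.0, 40.2), (95.5, 95.9), (210.0, 213.4)]
    print(total_brake_score(participant))  # -> +1 + 0 - 1 = 0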
The contextual analysis helped narrow down the list of difficulties faced by novice international drivers. The key findings were that potential drivers did not know the meaning of some road signs, such as the roundabout/traffic circle sign and the construction zone sign, and were not aware of the designated speed limit in several situations. A new observation that emerged from the analysis was that novice international drivers are not comfortable with approaching cyclists or pedestrians on the road.

The last step in the requirements analysis phase was to further inform our design based on the participants' verbal comments. Comments such as "I do not have an eye for speed limits" and "I do not know what the cyclist is going to do and I do not know what I am supposed to do" gave us some insight into the underlying problems regarding speed limits and cyclists. The problems are generally related to visibility in the case of speed limits and to prediction in the case of cyclists. Our analysis suggested that the design should magnify, highlight, or otherwise make road signs more visible, and should either tell the driver what approaching cyclists may do or suggest what actions or path the driver should take.

Interface Design

Based on the results of the contextual analysis, we created a persona to help keep novice international drivers at the forefront of design decisions. The persona, Chinmay, is an international graduate student from India. He is not familiar with US driving laws or the right-lane driving system. He has recently joined the university and is not familiar with the area. He does not have much driving experience and is not confident driving on main roads. He gets distracted by the numerous signs that he has to watch out for, and he drives cautiously around cyclists and pedestrians.

During the ideation phase, a few sample designs were sketched to specifically address the needs identified in the analysis phase, i.e., road signs, speed limits, and collision warnings (pedestrians and cyclists). The design ideas, intended to help drivers understand the meaning of road signs and to enable them to take the best possible action in a given situation, included highlighting and enlarging road signs relevant to the driver on the AR HUD; pop-up messages attached to road signs, such as "Apply the brakes" linked to stop signs and "Pedestrians and cyclists frequently walk across. Careful!" at pedestrian crossings; and audio feedback instructing the driver what to do when he sees a road sign rather than explaining its meaning. To familiarize drivers with the speed limits on the road and give an indication of whether they are above or below the limit and by how much, we designed interfaces displaying the current speed and the speed limit of the road, with color-coded bars to depict whether drivers were speeding (a minimal sketch of one such mapping appears below). To make drivers aware of pedestrians and cyclists in the vicinity, we considered three interface designs: highlighting cyclists and pedestrians with bounding boxes or exclamation marks.
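The color-coded speed bar mentioned above can be driven by a straightforward mapping from the current speed and posted limit to a color. The sketch below is one hypothetical version of that mapping; the grace band and thresholds are assumptions made for illustration, not values from the design.

# Illustrative mapping from current speed vs. posted limit to a bar color.
# The 5 mph grace band and the amber/red thresholds are assumptions for
# this sketch, not parameters taken from the interface design.

def speed_bar_color(current_mph: float, limit_mph: float) -> str:
    over = current_mph - limit_mph
    if over <= 0:
        return "green"      # at or below the posted limit
    if over <= 5:
        return "amber"      # slightly over: warn with low urgency
    return "red"            # well over the limit: urgent

print(speed_bar_color(52, 55))  # green
print(speed_bar_color(58, 55))  # amber
print(speed_bar_color(63, 55))  # red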

Based on the contextual inquiry, we also felt the need to add a new feature for aiding navigation, because a subset of novice international drivers was not familiar with the area.

We gleaned lessons from the AR and related literature (Tönnis, Klinker, & Plavšic, 2009) to inform the characteristics of our AR designs; for example, information density, shape, size and position, color, brightness, and intensity, when used properly, can help enhance the viewing experience. The amount of information shown at a given time and place, or information density, is an important consideration when designing for a driving context and for non-expert drivers, who may be distracted by too much information on the AR HUD. The information provided on the AR HUD should not distract the driver from driving or add to his or her cognitive load. The shape of the virtual road signs should embody metaphors that make their meaning readily understandable across different cultures; for example, an arrow is often used to indicate direction, and the diamond shape is used for road signs. The size and position of the road signs should not obstruct the driver's field of view; on the contrary, signs should be strategically placed and scaled to direct the driver's attention to the correct real-world objects. Color is another important design aspect and provides semantics that can be leveraged across cultures; for example, by most driving conventions, red generally means "stop" and green means "go", and AR HUD designs for driving should leverage these or other metaphors. The colors chosen for the virtual road signs should provide enough luminance and chrominance contrast with the background to ensure high salience (a simple contrast check is sketched at the end of this subsection). Signs must be bright enough to be clearly visible to the driver, but not so bright that they inconvenience the driver. The intensity (or transparency) of the virtual road signs should not obstruct the driver's view of the environment. Finally, the timing of a virtual road sign's appearance is crucial in helping the driver make a timely decision.

On completion of the ideation and critiquing phases, we narrowed the design space down to four key features (Figure 3), keeping the design factors and the characteristics of our persona in mind. The navigation aid is placed in the center of the AR HUD, at a height such that it does not obstruct the view of any vehicles ahead. The sides of the AR HUD are used to display the road signs; a road sign appears on the left or right to indicate the direction of an approaching pedestrian or cyclist, which helps direct the driver's attention to the relevant object. A cyan-colored navigational arrow points in the direction of the driver's destination. The color of a road sign changes based on the urgency of the required reaction (i.e., react quickly if red, less quickly if amber). The symbols used build on existing conventions in driving and other scenarios to elicit a decision and a corresponding reaction. Audio feedback provides redundancy gain through multimodal instructions such as "Turn left", "Yield", or "Slow down", depending on the situation; the audio was designed as a redundant, complementary feature to reinforce the AR visual cues.

Figure 1. Layout of the interface design.
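The luminance-contrast guideline above can be made concrete by screening candidate sign colors against a sampled background color. The sketch below uses the standard WCAG relative-luminance and contrast-ratio formulas as a stand-in salience check; the formulas, the 3:1 threshold, and the example colors are our own illustrative choices rather than anything specified in the design.

# Sketch: screening candidate AR sign colors for luminance contrast against a
# sampled background color. Uses the WCAG relative-luminance / contrast-ratio
# formulas as a stand-in salience metric; the 3:1 threshold is an assumption.

def relative_luminance(rgb):
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# hypothetical check: warning-sign colors against a bright sky-like background
sky = (190, 205, 220)
for name, color in {"red": (200, 30, 30), "amber": (255, 190, 0)}.items():
    ratio = contrast_ratio(color, sky)
    print(f"{name}: {ratio:.1f}:1", "ok" if ratio >= 3.0 else "too low")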
Video Prototyping

Before prototyping design ideas, we considered the requirements for the prototype, such as level of fidelity, depth, and breadth. From the interaction perspective, drivers need to interact with the prototype while driving. From the ecological perspective, the driving scenario should be representative and realistic. From the emotional perspective, drivers should not feel any actual threat, so that the evaluation is safe. Therefore, we decided to use medium-to-high-fidelity prototypes for better validity of evaluation. Regarding the depth and breadth of the prototype, we decided to develop a horizontal prototype in which we implemented all the design features needed to address the findings identified earlier in the design phase.

To enable rapid prototyping and subsequent design iterations, we used a video editing tool to overlay computer-generated graphics atop pre-recorded driving video footage (Figure 2); a scripted version of this overlay step is sketched at the end of this subsection. The use of actual driving footage aimed to improve our prototype's ecological validity. Combined with a high-fidelity driving simulator, the augmented video footage served as an appropriate tool for formative evaluation. Participants manipulated the steering wheel, pedals, and turn signals of a real vehicle in response to events in the video to mimic the actions required when driving a real vehicle on the road. We provided a small crosshair in the driving scene (controlled by the steering wheel) and asked participants to keep the crosshair in the center of the lane while driving.

Figure 2. Simulation of an AR HUD by synthesizing computer graphics with pre-recorded driving video footage.

The prototype helped draw the driver's attention to existing signs on the road by displaying a virtual road instruction attached to the target road sign. This functionality was supported by audio feedback such as "Stop" for pedestrian crossings or stop signs and "Turn Right/Left" for roundabouts. The prototype made the driver aware of the speed limit, and of whether they were exceeding it, by displaying the current speed and the speed limit of the road; audio feedback such as "Slow down" supported this functionality. Finally, the prototype informed the driver of a possible collision with an approaching cyclist or pedestrian by displaying a virtual road instruction attached to the target road sign, supported by audio feedback such as "Yield to Cyclist/Pedestrian" (Figure 3).

Figure 3. Typical use-cases realized in the prototype.
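The overlay step described above was performed with a video editing tool; the same effect can also be scripted. The sketch below composites a semi-transparent instruction panel onto pre-recorded footage with OpenCV, purely as an illustration of the approach; the file names, cue position, text, and timing window are placeholders, not values from the study.

# Sketch: compositing simple AR-style cues onto pre-recorded driving footage
# with OpenCV. "drive.mp4", the cue position, and the 10-15 s window are
# placeholders; the study itself used a video editing tool, not this script.
import cv2

cap = cv2.VideoCapture("drive.mp4")
fps = cap.get(cv2.CAP_PROP_FPS)
w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("drive_ar.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    t = frame_idx / fps
    if 10.0 <= t <= 15.0:  # show the cue only while the target sign is relevant
        overlay = frame.copy()
        cv2.rectangle(overlay, (60, 80), (260, 180), (0, 0, 255), -1)  # filled sign panel
        cv2.putText(overlay, "SLOW DOWN", (75, 140),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)
        frame = cv2.addWeighted(overlay, 0.4, frame, 0.6, 0)           # semi-transparent blend
    out.write(frame)
    frame_idx += 1

cap.release()
out.release()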

Usability Evaluation

For the usability evaluation, a heuristic walkthrough (Sears, 1997) was conducted in which two usability experts evaluated the interface using the persona while driving a car in the high-fidelity simulator with our AR HUD interface. The heuristic walkthrough consisted of four sessions. First, we introduced the overall process, our persona, a brief overview of the user interface, and the heuristics. Second, the expert evaluators had a practice session to get used to the driving simulator. The experts were then shown a route plan (using Google Maps) that depicted the path of the pre-recorded driving scenario. Next, they drove the car while using (and evaluating) the AR HUD interface. The driving scenario included all the use-cases that our AR interface was intended to address: turning, merging, unfamiliar road signs, work zones, pedestrians, cyclists, and speeding situations. Finally, they performed a post-hoc evaluation of the interface that required them to predict the persona's performance and workload and to identify any usability issues based either on the heuristics or on their expertise. After their evaluation, we conducted a retrospective think-aloud session in which we replayed the driving scenario on a desktop computer, which afforded more time to review the design factors of the AR interface (i.e., information density, shape, size, position, color, brightness, transparency, and timing). The experts filled out a matrix that relates predicted user performance and usability issues to any relevant design factors (Figure 4).

Figure 4. An evaluation sheet includes three main components (predicted user performance, usability issues, and interface design factors) and the mapping among components.

The stages of human information processing (Endsley, 2012), namely attention, sensation, situation awareness (perception, comprehension, projection), and decision, were integrated into our evaluation to help predict the driver's performance while using the AR HUD interface. The heuristics the experts used included: the driver should be able to sense the information provided by the interface clearly; the information should be relevant and sufficient to complete a particular task, and should catch the driver's attention immediately without distracting the driver or obstructing the driver's field of view; the information should help the driver perceive objects in the real world in a timely manner; the information should be easy to understand and should help the driver predict changes in the environment; and the information should help the driver make decisions and reduce the driver's cognitive load. The experts evaluated the AR HUD interface by rating it against a non-AR HUD condition (i.e., without any AR driving aid) using a seven-point scale (Figure 5).

RESULTS

The heuristic evaluation performed by the usability experts compared driving with the virtual road sign AR HUD design against driving without any AR HUD aid.
The experts expected the virtual road signs to help improve driver performance in the following areas, with ratings on the seven-point scale described above: sensation (2.5), attention (2.5), perception (2.0), comprehension (2.5), projection (1.0), and decision (1.5), and to reduce workload (2.5). On the other hand, the retrospective think-aloud sessions revealed opportunities to improve the interface. Results from these sessions suggest that the color and transparency of visual cues should be dynamically adjusted to the outdoor background to increase sensation and perception. The experts also suggested that the timing of the instructions (both visual and auditory) should be synchronized to increase projection and decision.

Figure 5. A radar chart shows predicted user performance and workload as compared to the baseline (without an AR driving aid): -3 strongly worse, 0 the same, +3 strongly better than the baseline.
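A radar chart like Figure 5 can be reproduced directly from the ratings listed above. The sketch below plots those values with matplotlib against the no-AR baseline at 0; the plotting details are illustrative choices of ours, not part of the original study.

# Sketch: a radar chart of the mean expert ratings reported above, relative to
# the no-AR baseline (0) on the -3..+3 scale. Plotting choices are illustrative.
import numpy as np
import matplotlib.pyplot as plt

ratings = {"sensation": 2.5, "attention": 2.5, "perception": 2.0,
           "comprehension": 2.5, "projection": 1.0, "decision": 1.5,
           "workload reduction": 2.5}

labels = list(ratings)
values = list(ratings.values())
angles = np.linspace(0, 2 * np.pi, len(labels), endpoint=False).tolist()
values += values[:1]          # close the polygon
angles += angles[:1]

ax = plt.subplot(111, polar=True)
ax.plot(angles, values, marker="o")
ax.fill(angles, values, alpha=0.2)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(labels)
ax.set_ylim(-3, 3)            # -3 strongly worse, 0 the same, +3 strongly better
plt.title("Predicted performance vs. no-AR baseline")
plt.show()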

DISCUSSION

From the results of the analytic evaluation and the comments from the usability experts, we can infer that our interface may help novice international drivers understand the meaning of road signs, increase their awareness of speed limits, and help them avoid cyclists and pedestrians. Our findings suggest that a well-designed AR HUD interface may marginally improve decision making over having no interface. Virtual road signs can help increase situation awareness and reduce novice drivers' workload when driving through unfamiliar areas with unfamiliar traffic rules. Conformal virtual signs and leader-line designs can reduce the user's visual search effort by directly indicating target objects, such as pedestrians and cyclists. Direct instructions in a given context can further reduce the driver's workload in reacting appropriately to the signs or situations encountered.

The radar chart (Figure 5) compares the predicted performance of the participant against the baseline performance (without an AR driving aid). It indicates that virtual road signs can increase the driver's situation awareness (perception, comprehension, and projection) relative to the baseline. Conformal presentation of information can quickly guide a novice driver's attention to relevant environmental elements in a given driving context. Without our interface, our persona, who is new to the area, new to the US traffic system, and new to driving, does not know where to focus his attention: he could focus on the road and the road signs, on the in-vehicle gauges, or on the navigation aid giving him directions. Figure 5 indicates that our interface can help him manage his attention. Further, the think-aloud session with our experts revealed that highlighting only the relevant road signs among many can help guide the driver's attention to the most relevant stimuli in a given context.

In sum, our AR HUD design shows strong promise for helping drivers understand the meaning of road signs, be aware of the speed limit, and avoid cyclists and pedestrians through the use of virtual road signs. We believe these benefits derive from the AR HUD's ability to direct the driver's attention to relevant information, thereby reducing cognitive load and, in turn, facilitating more effective decision making.

Future work may involve improving the AR HUD design based on the results of the analytical evaluation presented herein. This would mean incorporating the changes suggested by the heuristic experts, such as implementing an adaptive UI that, for example, alters the color of the navigation aid and virtual road signs based on the environment. The timing of the audio feedback (with respect to the visual onset of virtual road signs) could also be improved, and more detailed audio feedback content could be provided, including, for example, the reason for asking the driver to slow down. Future designs should also address the challenge of attentional narrowing. Lastly, we aim to conduct an empirical user study to gather more quantitative and qualitative data.

ACKNOWLEDGEMENTS

We would like to thank Dr. Scott McCrickard for his advice during this work, as well as the expert evaluators for their time, effort, and thoughts during the design walkthrough.

REFERENCES
Azuma, R. T. (1997). A survey of augmented reality. Presence, 6(4), 355-385.
Beyer, H., & Holtzblatt, K. (1997). Contextual design: Defining customer-centered systems. Elsevier.
Charissis, V., & Papanastasiou, S. (2010). Human-machine collaboration through vehicle head up display interface. Cognition, Technology & Work, 12(1), 41-50. doi:10.1007/s10111-008-0117-0
Dissanayake, S., & Lu, J. J. (2001). Traffic control device comprehension: Differences between domestic and international drivers in USA. IATSS Research, 25(2), 80-87.
Endsley, M. R. (2012). Designing for situation awareness: An approach to user-centered design. CRC Press.
Hartson, R., & Pyla, P. S. (2012). The UX book: Process and guidelines for ensuring a quality user experience. Elsevier.
Kim, H., Miranda Anon, A., Misu, T., Li, N., Tawari, A., & Fujimura, K. (2016). Look at me: Augmented reality pedestrian warning system using an in-vehicle volumetric head up display. Proceedings of the 21st International Conference on Intelligent User Interfaces.
Kim, H., Wu, X., Gabbard, J. L., & Polys, N. F. (2013). Exploring head-up augmented reality interfaces for crash warning systems. Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications.
Rickesh, T., & Naveen Vignesh, B. (2011). Augmented reality solution to the blind spot issue while driving vehicles. Recent Advances in Intelligent Computational Systems (RAICS), 2011 IEEE.
Sears, A. (1997). Heuristic walkthroughs: Finding the problems without the noise. International Journal of Human-Computer Interaction, 9(3), 213-234.
Tönnis, M., Klinker, G., & Plavšic, M. (2009). Survey and classification of head-up display presentation principles. Proceedings of the International Ergonomics Association (IEA).
Underwood, G. (2007). Visual attention and the transition from novice to advanced driver. Ergonomics, 50(8), 1235-1249. doi:10.1080/00140130701318707
Wai-Tat, F., Gasper, J., & Seong-Whan, K. (2013). Effects of an in-car augmented reality system on improving safety of younger and older drivers. 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
Yasuda, H., & Ohama, Y. (2012). Toward a practical wall see-through system for drivers: How simple can it be? 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).
Young, K. L., Lenné, M. G., Beanland, V., Salmon, P. M., & Stanton, N. A. (2015). Where do novice and experienced drivers direct their attention on approach to urban rail level crossings? Accident Analysis & Prevention, 77, 1-11.