Optical See-Through Head-Up Displays' Effect on Depth Judgments of Real World Objects


Missie Smith 1, Nadejda Doutcheva 2, Joseph L. Gabbard 3, Gary Burnett 4
Human Factors Research Group, University of Nottingham

ABSTRACT

Recent research indicates that users consistently underestimate depth judgments to Augmented Reality (AR) graphics when viewed through optical see-through displays. However, to our knowledge, little work has examined how AR graphics may affect depth judgments of real-world objects that have been overlaid or annotated with AR graphics. This study presents a preliminary analysis of whether AR graphics have directional effects on users' depth perception of real-world objects, as might be experienced in vehicle driving scenarios (e.g., as viewed via an optical see-through head-up display, or HUD). Twenty-four participants were asked to judge the depth of a physical pedestrian proxy figure moving towards them at a constant rate of 1 meter/second. Participants were shown an initial target location that varied in distance from 11 to 20 m and were then asked to press a button to indicate when the moving target was perceived to be at the previously specified target location. Each participant experienced three display conditions: no AR visual display (control), a conformal AR graphic overlaid on the pedestrian via a HUD, and the same graphic presented on a tablet physically located on the pedestrian. Participants completed 10 trials (one for each target distance between 11 and 20 m inclusive) per display condition, for a total of 30 trials per participant. The judged distance from the correct location was recorded, and after each trial, participants' confidence in determining the correct distance was captured. Across all conditions, participants underestimated the distance of the physical object, consistent with existing literature.
Greater variability was observed in the accuracy of distance judgments under the AR HUD condition relative to the other two display conditions. In addition, participant confidence levels were considerably lower in the AR HUD condition.

Keywords: augmented reality, depth perception, driving.

Index Terms: H.5 [Information Interfaces and Presentation]: H.5.1 Multimedia Information Systems - Artificial, Augmented, and Virtual Realities; H.5.2 User Interfaces - Ergonomics, Evaluation/Methodology, Screen Design, Style Guides

1 INTRODUCTION

People are often distracted while driving, especially when attending to secondary tasks (e.g., engaging with GPS navigation) and tertiary tasks (e.g., manipulating entertainment controls, using a cell phone). In the US, there is ample evidence that the role of distraction in accidents is increasing [1]. As such, there is growing interest in designing in-vehicle AR HUD-based technologies that minimize visual distraction by integrating critical primary and secondary task information into the forward-looking field of view [2]. Some examples of AR driving applications include safety warning systems [3, 4], lane marking for low visibility [5], GPS services [6-8], and social awareness [9]. While head-up AR graphics offer opportunities for improved driver safety by, for example, decreasing eyes-off-road time, there exists a need to more deeply research, identify, and design for the effects that augmenting graphics have on driver perception and workload. While an AR HUD may help co-locate a visual warning cue (e.g., a pedestrian warning) with a real hazard (i.e., the pedestrian), drivers must still visually attend to and process two types of visual information: real and virtual.

1 mis16@vt.edu, 2 nldoutch@vt.edu, 3 jgabbard@vt.edu, 4 Gary.Burnett@nottingham.ac.uk
IEEE Virtual Reality Conference, March 2015, Arles, France.
Further work is needed to quantify driver workload while attending to multiple visual information channels, even when the channels are intended to be useful, as in redundant encoding that is collocated or conformal [10]. This paper presents a study where we begin to examine this relationship with respect to depth perception of real-world objects. Safe driving necessitates accurate egocentric distance judgments (e.g., stopping time, braking distance, distance to other vehicles, pedestrians, and unexpected hazards). In vehicles, AR graphics are typically intended to cue and enhance real-world objects. As auto manufacturers move towards increasingly integrated AR HUDs, a better understanding of the effect of these graphics on depth perception at various distances is needed. To our knowledge, there is no research quantifying how drivers perceive real-world objects that are augmented via optical see-through HUDs, though it has been shown that the perceived depth of the AR graphics themselves is generally underestimated [11]. In addition, most studies examined relatively short distances of a few meters, which employ a different weighting of depth cues than typical driving situations, where important objects are located at considerably greater distances. Others have examined AR HUD designs for driving via VR simulation [12]; however, some perceptual phenomena, such as accommodation and thus depth perception at long distances, cannot generally be reproduced in these settings. This work proposes an alternative approach that employs an AR HUD and realistic distances to help bridge the gap in understanding how augmented reality graphics affect perception of real-world objects. While this study does not require actual driving, or even simulated driving, it does quantify how drivers' depth perception may be changed when AR graphics are present.
Further, the method affords direct measurement of the effects of AR graphics on depth perception, as opposed to indirectly inferring the effect on perception via braking distance or time to stop, as is typically done in driving simulators.

2 RELATED WORK

Previous studies have explored AR depth perception with a mixture of results. Many studies indicate a tendency for

individuals to underestimate the distance of a virtual object to themselves [13]. Swan's study comparing virtual reality and augmented reality depth judgments using the directed walking technique indicated that while VR viewing conditions lead individuals to underestimate distance to virtual objects, the same effect is not present in AR scenarios; instead, depth judgments were more accurate [13]. Recent studies indicate that there is a distance at which AR depth perception shifts from underestimation to overestimation of distance. Though the factors contributing to the location of the shift are not fully understood, Swan et al. (2007) [2] found this shift to occur at approximately 23 m. In addition, this study showed that participant task errors tended to increase with increasing distance [2]: the farther away a participant thought the object was, the less accurate they were, and the longer it took them to provide a response. This phenomenon is cause for concern in driving, where decision-making is time pressured, slow reactions can have negative results, and distances between driver and AR graphics may vary across this underestimate/overestimate boundary. Other research shows an interesting trend where the location of the study, indoors or outdoors, may affect the direction of error. While people tend to underestimate depth indoors, performing the same experiment outside leads to overestimation of depth [14, 15]. The experiment presented herein was performed indoors, though the vast majority of driving applications occur outside. With our methods and apparatus in place, our intention is to follow up with an outdoor study to determine if a similar switch occurs. Given the difficulty of running experiments outdoors, and the minute-to-minute variability in outdoor lighting, we felt it was prudent to start our work indoors.
While there has been a focus on depth perception of AR graphics themselves, there has been less research into depth judgment comparisons between AR graphics and real-world objects. Jerome & Witmer [16] found that participants were better at judging the distance to a real-world object than to an AR graphic. However, in that work, the participants did not view both the AR graphic and corresponding real-world object concurrently.

3 METHOD

3.1 Participants

This study involved 14 male and 10 female participants. Each participant was required to have normal or corrected-to-normal vision. Additionally, the participants were all over 18 years of age (mean age = 32), and each had a driver's license and experience driving.

3.2 Equipment

A cardboard adult male pedestrian was attached to a wheeled cart that could be pushed towards the participant as if a pedestrian were walking forward (Figure 1). To assist the researchers in manually pushing the cart, a metronome and tape measure were used to ensure a constant rate of consistently sized steps, resulting in a constant pedestrian speed across all tasks. At all times during the task, the researcher pushing the cart was fully hidden by the cardboard cutout of the pedestrian. Attached to the cart was a digital single-lens reflex (DSLR) camera with a high shutter speed, positioned to capture the cart's position along the tape measure on demand. A flashlight was placed next to the camera to provide adequate lighting and better picture quality given the fast shutter speed. A tablet computer was fastened to the torso of the pedestrian to support our three display conditions, as described in Section 4. Participants sat at a table with their chin resting on a chin rest to ensure that their gaze always centered on the tablet affixed to the pedestrian. A rudimentary HUD was placed in front of the participant for all scenarios, though it was only turned on for one of the three scenarios.
The HUD had a focal depth of approximately two meters beyond the participants' eyes. Each participant was given a remote trigger synced to the camera that would capture a picture when the button was pressed. This gave the participant full control over when the distance was measured and removed some of the inherent human error in measuring.

Figure 1: Side view of experimental setup with participant sitting in chair on the left.

Figure 2: Chin rest and view of pedestrian at beginning of experiment.

We removed as many visual cues as possible so that the experiment would not be skewed by unexpected, nearby depth cues. Because of this, the room selected for the study was extremely large, with black curtains along all walls and all other furniture removed. Unfortunately, we were not able to remove the horizon cue created by the contrast in floor and background color (Figure 2). However, while piloting, we determined that the amount of perceived vertical movement of the pedestrian/horizon intersection was minimal, ranging from the pedestrian's mid-thigh to just below the knee.

4 PROCEDURE

Each participant experienced three different experimental conditions: no AR visual display (control), an AR graphic perceptually overlaid on the pedestrian via HUD, and the same graphic physically located on the pedestrian-mounted tablet. The three visual display conditions were counterbalanced across the 24 participants. The control condition was designed to yield baseline depth judgments in the absence of any other AR-based visual channels. The HUD display condition was constructed to be representative of many emerging automotive HUDs, with a fixed focal depth of 2 m.
The tablet condition was designed to emulate what we expect to be available in select forthcoming AR displays, namely dynamic accommodation, where individual AR graphics are rendered at arbitrary focal depths, allowing designers not only to position conformal graphics in the correct location within the 2D view plane, but also at the correct depth.
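The paper does not detail the counterbalancing scheme for the three display conditions; assuming a full permutation balance (a common choice when the participant count is a multiple of the number of orderings, as 24 is of 6), the assignment can be sketched as:

```python
from itertools import permutations

# The three display conditions described above.
conditions = ["control", "hud", "tablet"]

# All 3! = 6 possible presentation orders.
orders = list(permutations(conditions))

# With 24 participants, each order is assigned to exactly
# 24 / 6 = 4 participants, fully balancing order effects.
assignments = {p: orders[p % len(orders)] for p in range(24)}

# Every condition appears in every serial position equally often:
# each condition is seen first by 8 of the 24 participants.
first_counts = {c: sum(1 for o in assignments.values() if o[0] == c)
                for c in conditions}
print(first_counts)
```

Under this scheme no condition systematically benefits from practice or fatigue effects, since each appears equally often in each serial position.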

Figure 3: Experimental conditions. Participants looked through the large screen used for the AR display in all three conditions, though only the AR HUD condition actually had a display on the screen.

The tablet was used in all three conditions, but in slightly different ways. In the tablet condition, our AR stimuli were displayed using the tablet's graphics and display capabilities, such that the AR graphics were rendered at the same focal depth as the moving real-world object/pedestrian. In the control and HUD display conditions, the tablet (in its powered-off state) was used simply as a visual cue for participants to direct their attention (and keep them from attending to other cues). In the HUD condition, while we rendered our visual stimuli via HUD, we carefully positioned the AR stimuli in space so that they were perceived to be located within the tablet footprint (also directing participants' attention to that same area of the pedestrian). For the HUD condition, we also animated the stimuli so that they got larger as both the pedestrian and AR graphics got closer to the participant (Figure 4). The graphic essentially emulated true conformal AR imagery, thus attempting to maintain the height-in-the-visual-field depth cue that is present in the tablet condition.

Figure 4: Changing HUD display size with tail as the pedestrian and tablet approach a user.

There were ten different target distances, ranging from 11 to 20 m inclusive, which were randomly ordered for each participant. We chose this range since it represents typical distance-to-hazards in common driving scenarios. At 1 m/s, these distances equate to about a 2-second time headway between a driver's vehicle and a hazard vehicle travelling between 2 and 5 MPH slower than the driver (suggesting that the driver must take some action quickly). It is within these 2 seconds that most drivers have to make critical depth judgments (e.g., brake hard, swerve, or simply slow down).
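The distance-dependent scaling of the HUD graphic follows from perspective geometry: for a graphic rendered on a plane at the HUD's fixed 2 m focal depth to stay conformal with (subtend the same visual angle as) an object at distance d, its rendered size must scale as focal depth / d. A minimal sketch, where the 0.25 m object width is an assumed value for illustration only:

```python
# Rendered width (in meters on the focal plane) that makes a HUD
# graphic subtend the same visual angle as an object of width
# `object_width` located `object_distance` meters from the eye.
def conformal_width(object_width, object_distance, focal_depth=2.0):
    return object_width * focal_depth / object_distance

# As the pedestrian closes from 20 m to 11 m, the graphic grows.
for d in (20.0, 15.0, 11.0):
    print(d, round(conformal_width(0.25, d), 4))
```

This inverse-distance growth is what preserves the height-in-the-visual-field cue that the tablet condition provides for free.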
Prior to each of the three display conditions, and to diminish learning effects and possible confusion, each participant was allowed up to three practice tasks before starting the recorded tasks. These practice tasks allowed participants to become comfortable with using the remote trigger and to better understand the structure of the experiment. For each trial, the pedestrian started at a distance of 25 m directly in front of the participant. Next, a researcher walked to a pre-specified distance along a straight line between the pedestrian's starting point and the participant. The researcher then paused and asked the participant to note the target location. The purpose of using a different person to mark the target location was to keep participants from using size as a cue. After all researchers were out of sight, a researcher began to push the pedestrian towards the participant at a rate of one meter per second. When participants believed that the pedestrian had reached the target location, they pressed the remote DSLR camera trigger. Thus, each participant was exposed to ten different target distances under each of three display conditions, for a total of 30 trials. After each trial, they were asked how confident they were (on a scale of 1-10) that they correctly identified the distance. While the primary task was to assess pedestrian distance and respond at specific target locations, the secondary AR task was designed to meet two specific criteria:

1. require only the most basic visual perception and no cognitive processing (in the automotive domain, this would equate to a simple conformal indicator, as opposed to, say, a text message); and

2. demand participants' visual attention, so that participants would not simply ignore the AR cue and perform the task based on real-world pedestrian cues alone.
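Given the 25 m start and 1 m/s approach rate, the pedestrian's true distance at the moment of the button press follows directly from the elapsed time; a minimal sketch of the trial geometry (variable names are ours):

```python
START_DISTANCE = 25.0  # m, pedestrian's starting distance
SPEED = 1.0            # m/s, constant approach rate

def pedestrian_distance(t_seconds):
    """True distance of the pedestrian t seconds after it starts moving."""
    return START_DISTANCE - SPEED * t_seconds

# For a 15 m target the ideal press comes 10 s into the trial;
# pressing 2 s early leaves the pedestrian at 17 m, i.e. the
# participant judged it closer than it actually was.
ideal_press = (START_DISTANCE - 15.0) / SPEED
print(ideal_press, pedestrian_distance(ideal_press - 2.0))
```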
The visual search task employed a square graphic with a short tail that traced around the outside of the screen, much like the classic Snake, Blockade, and Surround computer games. Because the short line moved, it required more visual attention than a still graphic. As mentioned above, in the HUD display condition, the graphic dynamically scaled (larger) at a constant rate and appeared conformal with the tablet attached to the pedestrian. In the control display condition, participants were instructed to visually attend to the blank tablet screen only. In the HUD and tablet conditions, participants were instructed to visually attend to the AR graphics rather than just the pedestrian. Asking participants to attend to the AR graphics alone while trying to judge the depth of the pedestrian was purposeful and, based on post-task interviews, resulted in participants visually attending to both the AR graphics and the pedestrian (as opposed to just the pedestrian, which is what would likely happen in the absence of any concrete instruction). In the HUD display condition, participants accommodated back and forth between 2 m and the pedestrian's distance at that point in time. In the tablet condition, post-task interviews suggest that participants shifted their foveal attention back and forth from the tablet AR stimuli to the edges of the pedestrian figure. Interestingly, even though the tablet and pedestrian focal depths were identical, participants still sought additional visual cues to perform the primary depth judgment task. In a driving setting, perhaps this accommodative switching and saccade/fixate pattern is a reasonable human response, since a driver would naturally attend to the AR cue first and then immediately switch to the real-world hazard of interest. Ultimately, the AR community would benefit from experimental tasks that require very tight visual and cognitive integration of both virtual and real-world visual cues.
The less integrated the cues are, the more likely participants are to switch their accommodation between the real-world object and the AR graphics, a practice so common among AR users that they are most likely unaware it is occurring.

5 RESULTS

Analysis was performed on the distance from the target location to the location indicated by participants, calculated from the image taken via the remote trigger. The target location minus the location specified by the participant was the metric for depth perception offset (DPO):

DPO = target location - participant-specified location

For example, if the target location was 15 m from the participant and they pressed the trigger when the pedestrian was at 17 m, the resulting value was -2 m. This value indicates that they thought the pedestrian was 2 m closer than it actually was, corresponding to an underestimation of depth. Any negative value corresponds to underestimation; a value of 0 indicates that the participant identified exactly the target location. Out of 720 possible data points, 10 values were lost (due to inability to read distance markings in images), for a total of N = 710 data points. Of these, 13 outliers fell outside the range bounded by the 1st quartile minus 1.5x the inter-quartile range (IQR) and the 3rd quartile plus 1.5xIQR (IQR = 1.728). Outlier values beyond these bounds were transformed to the corresponding boundary value (the upper boundary being 2.934). The variances were not equal, with the HUD condition producing the highest variance (HUD stdev = 1.406, control stdev = 1.193, tablet stdev = 1.158). This indicates that the presence of the HUD graphic may result in greater variation in responses. There was no significant difference in overall means for the conditions (HUD mean = , control mean = , tablet mean = ). A second analysis was performed to understand the impact of the three scenarios on the confidence of each participant. There were 16 outliers, 13 of which were associated with the HUD condition and 3 with the tablet condition.
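The DPO metric and the 1.5xIQR outlier treatment described above can be sketched as follows (the sample values are invented for illustration; the study's per-trial data are not reproduced here):

```python
import statistics

def dpo(target_m, indicated_m):
    """Depth perception offset: negative means the participant judged
    the pedestrian closer than it was (underestimated its distance)."""
    return target_m - indicated_m

def clamp_outliers(values):
    """Transform values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] to the
    nearest boundary, mirroring the paper's outlier treatment."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [min(max(v, lo), hi) for v in values]

print(dpo(15.0, 17.0))  # -2.0, the worked example above

sample = [-2.5, -2.0, -1.5, -1.0, -0.5, 0.0, 0.5, -9.0]  # invented DPOs
print(clamp_outliers(sample))
```

Note that `statistics.quantiles` uses exclusive quartiles by default; the paper does not state which quartile convention was used, so the exact boundaries here are illustrative.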
Because of the strict upper and lower bounds, confidence scores of 1 or 2 (indicating extremely low confidence) fell outside the non-outlier range defined by the quartile values plus or minus 1.5xIQR. Like the DPO data, these scores were transformed to the nearest non-outlier value. As with the DPO data, the reported confidence levels also had unequal variances, with the HUD condition showing the highest standard deviation (HUD stdev = 1.786, control stdev = 1.434, tablet stdev = 1.490). Again, the HUD scenario is coupled with greater variation in responses than the other two conditions.

Figure 5: Mean confidence level by distance.

Table 1: Mean confidence level by distance (rows: distance; columns: control, HUD, and tablet display types).

The ANOVA indicated that the mean reported confidence levels were significantly different (F(2, 68) = 8.199, p < 0.0001). This indicates that under the HUD condition, participants were less confident in their ability to correctly judge the distance. This is likely due to the continual need to focus on the image and then back on the pedestrian. Most available HUDs do not have a variable focal depth, which means that drivers would continually need to make these accommodative adjustments while using AR HUDs in vehicles. This suggests a need to further refine AR HUD technology to remove the potential problems caused by a fixed focal depth. Overall, the use of a HUD significantly affects both the depth perception offset of individuals from the target location and the confidence level of participants.

6 CONCLUSIONS AND FUTURE WORK

Overall, the AR HUD condition resulted in greater variability in both distance judgment accuracy and confidence level. On the whole, participants tended to underestimate the distance of the physical object, supporting findings of previous studies. While the applications of AR HUDs in vehicles are still being explored, one possible application is to highlight pedestrians or other hazards in the road.
It is imperative to understand how graphic overlays affect perception and therefore how they could also affect driving ability. In addition, the fact that the AR HUD is correlated with lower confidence levels, and therefore more uncertainty, suggests the need to better refine the technology before implementing it in vehicles. Drivers need to be able to confidently make decisions, and anything that reduces their confidence could threaten the safety of both drivers and pedestrians in the vicinity. There is still a great deal of research to be done to fully understand the impact of AR HUDs on the perception of real-world objects. Future studies could explore a wider variety of distances while continuing to minimize the depth cues available to participants. In addition, there may be interesting findings when changing speeds, lighting conditions, or the contrast of the AR graphic. Finally, this study used a relatively short focal depth relative to the distance to the pedestrian; future studies should test different focal depths.

REFERENCES

[1] F. A. Wilson and J. P. Stimpson, "Trends in fatalities from distracted driving in the United States, 1999 to 2008," American Journal of Public Health, vol. 100.
[2] J. E. Swan, A. Jones, E. Kolstad, M. A. Livingston, and H. S. Smallman, "Egocentric depth judgments in optical, see-through augmented reality," IEEE Transactions on Visualization and Computer Graphics, vol. 13, 2007.
[3] M. Tonnis, C. Lange, and G. Klinker, "Visual longitudinal and lateral driving assistance in the head-up display of cars," 2007.

[4] H. Kim, X. Wu, and J. L. Gabbard, "Exploring head-up augmented reality interfaces for crash warning systems," presented at Automotive UI 2013, Eindhoven, Netherlands.
[5] V. Charissis, S. Papanastasiou, L. Mackenzie, and S. Arafat, "Evaluation of collision avoidance prototype head-up display interface for older drivers," in Human-Computer Interaction. Towards Mobile and Intelligent Interaction Environments, Springer, 2011.
[6] Z. Medenica, A. L. Kun, T. Paek, and O. Palinko, "Augmented reality vs. street views: a driving simulator study comparing two emerging navigation aids," 2011.
[7] S. Kim and A. K. Dey, "Simulated augmented reality windshield display as a cognitive mapping aid for elder driver navigation," 2009.
[8] A. Doshi, S. Y. Cheng, and M. M. Trivedi, "A novel active heads-up display for driver assistance," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 39.
[9] R. Schroeter, A. Rakotonirainy, and M. Foth, "The social car: new interactive vehicular applications derived from social media and urban informatics," 2012.
[10] G. K. Edgar, "Accommodation, cognition, and virtual image displays: A review of the literature," Displays, vol. 28.
[11] G. P. Hirshberg, "System for aiding a driver's depth perception," Google Patents.
[12] A. Kemeny and F. Panerai, "Evaluating perception in driving simulation experiments," Trends in Cognitive Sciences, vol. 7.
[13] J. A. Jones, J. E. Swan II, G. Singh, E. Kolstad, and S. R. Ellis, "The effects of virtual reality, augmented reality, and motion parallax on egocentric depth perception," in Proceedings of the 5th Symposium on Applied Perception in Graphics and Visualization, 2008.
[14] M. A. Livingston, Z. Ai, J. E. Swan, and H. S. Smallman, "Indoor vs. outdoor depth perception for mobile augmented reality," in Virtual Reality Conference (VR 2009), IEEE, 2009.
[15] A. Dey, A. Cunningham, and C. Sandor, "Evaluating depth perception of photorealistic mixed reality visualizations for occluded objects in outdoor environments," in Proceedings of the 17th ACM Symposium on Virtual Reality Software and Technology, 2010.
[16] C. Jerome and B. Witmer, "The perception and estimation of egocentric distance in real and augmented reality environments," in Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 2005.


More information

Interactions and Applications for See- Through interfaces: Industrial application examples

Interactions and Applications for See- Through interfaces: Industrial application examples Interactions and Applications for See- Through interfaces: Industrial application examples Markus Wallmyr Maximatecc Fyrisborgsgatan 4 754 50 Uppsala, SWEDEN Markus.wallmyr@maximatecc.com Abstract Could

More information

DAARIA: Driver Assistance by Augmented Reality for Intelligent Automotive

DAARIA: Driver Assistance by Augmented Reality for Intelligent Automotive Author manuscript, published in "2012 IEEE Intelligent Vehicles Symposium, Spain (2012)" DAARIA: Driver Assistance by Augmented Reality for Intelligent Automotive Paul George, Indira Thouvenin, Vincent

More information

Assessments of Grade Crossing Warning and Signalization Devices Driving Simulator Study

Assessments of Grade Crossing Warning and Signalization Devices Driving Simulator Study Assessments of Grade Crossing Warning and Signalization Devices Driving Simulator Study Petr Bouchner, Stanislav Novotný, Roman Piekník, Ondřej Sýkora Abstract Behavior of road users on railway crossings

More information

Iowa Research Online. University of Iowa. Robert E. Llaneras Virginia Tech Transportation Institute, Blacksburg. Jul 11th, 12:00 AM

Iowa Research Online. University of Iowa. Robert E. Llaneras Virginia Tech Transportation Institute, Blacksburg. Jul 11th, 12:00 AM University of Iowa Iowa Research Online Driving Assessment Conference 2007 Driving Assessment Conference Jul 11th, 12:00 AM Safety Related Misconceptions and Self-Reported BehavioralAdaptations Associated

More information

Spatial Judgments from Different Vantage Points: A Different Perspective

Spatial Judgments from Different Vantage Points: A Different Perspective Spatial Judgments from Different Vantage Points: A Different Perspective Erik Prytz, Mark Scerbo and Kennedy Rebecca The self-archived postprint version of this journal article is available at Linköping

More information

AGING AND STEERING CONTROL UNDER REDUCED VISIBILITY CONDITIONS. Wichita State University, Wichita, Kansas, USA

AGING AND STEERING CONTROL UNDER REDUCED VISIBILITY CONDITIONS. Wichita State University, Wichita, Kansas, USA AGING AND STEERING CONTROL UNDER REDUCED VISIBILITY CONDITIONS Bobby Nguyen 1, Yan Zhuo 2, & Rui Ni 1 1 Wichita State University, Wichita, Kansas, USA 2 Institute of Biophysics, Chinese Academy of Sciences,

More information

Steering a Driving Simulator Using the Queueing Network-Model Human Processor (QN-MHP)

Steering a Driving Simulator Using the Queueing Network-Model Human Processor (QN-MHP) University of Iowa Iowa Research Online Driving Assessment Conference 2003 Driving Assessment Conference Jul 22nd, 12:00 AM Steering a Driving Simulator Using the Queueing Network-Model Human Processor

More information

CAN GALVANIC VESTIBULAR STIMULATION REDUCE SIMULATOR ADAPTATION SYNDROME? University of Guelph Guelph, Ontario, Canada

CAN GALVANIC VESTIBULAR STIMULATION REDUCE SIMULATOR ADAPTATION SYNDROME? University of Guelph Guelph, Ontario, Canada CAN GALVANIC VESTIBULAR STIMULATION REDUCE SIMULATOR ADAPTATION SYNDROME? Rebecca J. Reed-Jones, 1 James G. Reed-Jones, 2 Lana M. Trick, 2 Lori A. Vallis 1 1 Department of Human Health and Nutritional

More information

DIFFERENCE BETWEEN A PHYSICAL MODEL AND A VIRTUAL ENVIRONMENT AS REGARDS PERCEPTION OF SCALE

DIFFERENCE BETWEEN A PHYSICAL MODEL AND A VIRTUAL ENVIRONMENT AS REGARDS PERCEPTION OF SCALE R. Stouffs, P. Janssen, S. Roudavski, B. Tunçer (eds.), Open Systems: Proceedings of the 18th International Conference on Computer-Aided Architectural Design Research in Asia (CAADRIA 2013), 457 466. 2013,

More information

More than Meets the Eye

More than Meets the Eye Originally published March 22, 2017 More than Meets the Eye Hold on tight, because an NSF-funded contact lens and eyewear combo is about to plunge us all into the Metaverse. Augmented reality (AR) has

More information

Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface

Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface Evaluation of Visuo-haptic Feedback in a 3D Touch Panel Interface Xu Zhao Saitama University 255 Shimo-Okubo, Sakura-ku, Saitama City, Japan sheldonzhaox@is.ics.saitamau.ac.jp Takehiro Niikura The University

More information

INTERACTION AND SOCIAL ISSUES IN A HUMAN-CENTERED REACTIVE ENVIRONMENT

INTERACTION AND SOCIAL ISSUES IN A HUMAN-CENTERED REACTIVE ENVIRONMENT INTERACTION AND SOCIAL ISSUES IN A HUMAN-CENTERED REACTIVE ENVIRONMENT TAYSHENG JENG, CHIA-HSUN LEE, CHI CHEN, YU-PIN MA Department of Architecture, National Cheng Kung University No. 1, University Road,

More information

Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task

Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, MANUSCRIPT ID 1 Studying the Effects of Stereo, Head Tracking, and Field of Regard on a Small- Scale Spatial Judgment Task Eric D. Ragan, Regis

More information

Chapter 6 Experiments

Chapter 6 Experiments 72 Chapter 6 Experiments The chapter reports on a series of simulations experiments showing how behavior and environment influence each other, from local interactions between individuals and other elements

More information

AUGMENTED VIRTUAL REALITY APPLICATIONS IN MANUFACTURING

AUGMENTED VIRTUAL REALITY APPLICATIONS IN MANUFACTURING 6 th INTERNATIONAL MULTIDISCIPLINARY CONFERENCE AUGMENTED VIRTUAL REALITY APPLICATIONS IN MANUFACTURING Peter Brázda, Jozef Novák-Marcinčin, Faculty of Manufacturing Technologies, TU Košice Bayerova 1,

More information

CHAPTER 8 RESEARCH METHODOLOGY AND DESIGN

CHAPTER 8 RESEARCH METHODOLOGY AND DESIGN CHAPTER 8 RESEARCH METHODOLOGY AND DESIGN 8.1 Introduction This chapter gives a brief overview of the field of research methodology. It contains a review of a variety of research perspectives and approaches

More information

The Mona Lisa Effect: Perception of Gaze Direction in Real and Pictured Faces

The Mona Lisa Effect: Perception of Gaze Direction in Real and Pictured Faces Studies in Perception and Action VII S. Rogers & J. Effken (Eds.)! 2003 Lawrence Erlbaum Associates, Inc. The Mona Lisa Effect: Perception of Gaze Direction in Real and Pictured Faces Sheena Rogers 1,

More information

Perception in Immersive Environments

Perception in Immersive Environments Perception in Immersive Environments Scott Kuhl Department of Computer Science Augsburg College scott@kuhlweb.com Abstract Immersive environment (virtual reality) systems provide a unique way for researchers

More information

Driver behavior in mixed and virtual reality a comparative study

Driver behavior in mixed and virtual reality a comparative study DSC 2016 Europe VR B. Blissing et al. Driver behavior in mixed and virtual reality a comparative study B. Blissing, F. Bruzelius, and O. Eriksson Swedish National Road and Transport Research Institute;

More information

Driving Simulation Scenario Definition Based on Performance Measures

Driving Simulation Scenario Definition Based on Performance Measures Driving Simulation Scenario Definition Based on Performance Measures Yiannis Papelis Omar Ahmad Ginger Watson NADS & Simulation Center The University of Iowa 2401 Oakdale Blvd. Iowa City, IA 52242-5003

More information

Study of Effectiveness of Collision Avoidance Technology

Study of Effectiveness of Collision Avoidance Technology Study of Effectiveness of Collision Avoidance Technology How drivers react and feel when using aftermarket collision avoidance technologies Executive Summary Newer vehicles, including commercial vehicles,

More information

EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1

EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1 EYE MOVEMENT STRATEGIES IN NAVIGATIONAL TASKS Austin Ducworth, Melissa Falzetta, Lindsay Hyma, Katie Kimble & James Michalak Group 1 Abstract Navigation is an essential part of many military and civilian

More information

Survey and Classification of Head-Up Display Presentation Principles

Survey and Classification of Head-Up Display Presentation Principles Survey and Classification of Head-Up Display Presentation Principles Marcus Tönnis, Gudrun Klinker Fachgebiet Augmented Reality Technische Universität München Fakultät für Informatik Boltzmannstraße 3,

More information

3D and Sequential Representations of Spatial Relationships among Photos

3D and Sequential Representations of Spatial Relationships among Photos 3D and Sequential Representations of Spatial Relationships among Photos Mahoro Anabuki Canon Development Americas, Inc. E15-349, 20 Ames Street Cambridge, MA 02139 USA mahoro@media.mit.edu Hiroshi Ishii

More information

Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain

Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain Technical Disclosure Commons Defensive Publications Series October 02, 2017 Determining Optimal Player Position, Distance, and Scale from a Point of Interest on a Terrain Adam Glazier Nadav Ashkenazi Matthew

More information

Collaboration on Interactive Ceilings

Collaboration on Interactive Ceilings Collaboration on Interactive Ceilings Alexander Bazo, Raphael Wimmer, Markus Heckner, Christian Wolff Media Informatics Group, University of Regensburg Abstract In this paper we discuss how interactive

More information

Poles for Increasing the Sensibility of Vertical Gradient. in a Downhill Road

Poles for Increasing the Sensibility of Vertical Gradient. in a Downhill Road Poles for Increasing the Sensibility of Vertical Gradient 1 Graduate School of Science and Engineering, Yamaguchi University 2-16-1 Tokiwadai,Ube 755-8611, Japan r007vm@yamaguchiu.ac.jp in a Downhill Road

More information

Chapter 9. Conclusions. 9.1 Summary Perceived distances derived from optic ow

Chapter 9. Conclusions. 9.1 Summary Perceived distances derived from optic ow Chapter 9 Conclusions 9.1 Summary For successful navigation it is essential to be aware of one's own movement direction as well as of the distance travelled. When we walk around in our daily life, we get

More information

Evaluation based on drivers' needs analysis

Evaluation based on drivers' needs analysis Evaluation based on drivers' needs analysis Pierre Van Elslande (IFSTTAR) DaCoTA EU Conference On Road Safety data and knowledge-based Policy-making Athens, 22 23 November 2012 Project co-financed by the

More information

Haptic control in a virtual environment

Haptic control in a virtual environment Haptic control in a virtual environment Gerard de Ruig (0555781) Lourens Visscher (0554498) Lydia van Well (0566644) September 10, 2010 Introduction With modern technological advancements it is entirely

More information

MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES

MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL REALITY TECHNOLOGIES INTERNATIONAL CONFERENCE ON ENGINEERING AND PRODUCT DESIGN EDUCATION 4 & 5 SEPTEMBER 2008, UNIVERSITAT POLITECNICA DE CATALUNYA, BARCELONA, SPAIN MECHANICAL DESIGN LEARNING ENVIRONMENTS BASED ON VIRTUAL

More information

HUMAN-MACHINE COLLABORATION THROUGH VEHICLE HEAD UP DISPLAY INTERFACE

HUMAN-MACHINE COLLABORATION THROUGH VEHICLE HEAD UP DISPLAY INTERFACE HUMAN-MACHINE COLLABORATION THROUGH VEHICLE HEAD UP DISPLAY INTERFACE 1 V. Charissis, 2 S. Papanastasiou, 1 P. Anderson 1 Digital Design Studio, Glasgow School of Art, 10 Dumbreck road, G41 5BW, Glasgow,

More information

Visualization of Vehicular Traffic in Augmented Reality for Improved Planning and Analysis of Road Construction Projects

Visualization of Vehicular Traffic in Augmented Reality for Improved Planning and Analysis of Road Construction Projects NSF GRANT # 0448762 NSF PROGRAM NAME: CMMI/CIS Visualization of Vehicular Traffic in Augmented Reality for Improved Planning and Analysis of Road Construction Projects Amir H. Behzadan City University

More information

CSC Stereography Course I. What is Stereoscopic Photography?... 3 A. Binocular Vision Depth perception due to stereopsis

CSC Stereography Course I. What is Stereoscopic Photography?... 3 A. Binocular Vision Depth perception due to stereopsis CSC Stereography Course 101... 3 I. What is Stereoscopic Photography?... 3 A. Binocular Vision... 3 1. Depth perception due to stereopsis... 3 2. Concept was understood hundreds of years ago... 3 3. Stereo

More information

MOTION PARALLAX AND ABSOLUTE DISTANCE. Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673

MOTION PARALLAX AND ABSOLUTE DISTANCE. Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673 MOTION PARALLAX AND ABSOLUTE DISTANCE by Steven H. Ferris NAVAL SUBMARINE MEDICAL RESEARCH LABORATORY NAVAL SUBMARINE MEDICAL CENTER REPORT NUMBER 673 Bureau of Medicine and Surgery, Navy Department Research

More information

Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming)

Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming) Physics 2310 Lab #5: Thin Lenses and Concave Mirrors Dr. Michael Pierce (Univ. of Wyoming) Purpose: The purpose of this lab is to introduce students to some of the properties of thin lenses and mirrors.

More information

Towards Wearable Gaze Supported Augmented Cognition

Towards Wearable Gaze Supported Augmented Cognition Towards Wearable Gaze Supported Augmented Cognition Andrew Toshiaki Kurauchi University of São Paulo Rua do Matão 1010 São Paulo, SP kurauchi@ime.usp.br Diako Mardanbegi IT University, Copenhagen Rued

More information

Optical Marionette: Graphical Manipulation of Human s Walking Direction

Optical Marionette: Graphical Manipulation of Human s Walking Direction Optical Marionette: Graphical Manipulation of Human s Walking Direction Akira Ishii, Ippei Suzuki, Shinji Sakamoto, Keita Kanai Kazuki Takazawa, Hiraku Doi, Yoichi Ochiai (Digital Nature Group, University

More information

PERFORMANCE IN A HAPTIC ENVIRONMENT ABSTRACT

PERFORMANCE IN A HAPTIC ENVIRONMENT ABSTRACT PERFORMANCE IN A HAPTIC ENVIRONMENT Michael V. Doran,William Owen, and Brian Holbert University of South Alabama School of Computer and Information Sciences Mobile, Alabama 36688 (334) 460-6390 doran@cis.usouthal.edu,

More information

F=MA. W=F d = -F FACILITATOR - APPENDICES

F=MA. W=F d = -F FACILITATOR - APPENDICES W=F d F=MA F 12 = -F 21 FACILITATOR - APPENDICES APPENDIX A: CALCULATE IT (OPTIONAL ACTIVITY) Time required: 20 minutes If you have additional time or are interested in building quantitative skills, consider

More information

Impact of Connected Vehicle Safety Applications on Driving Behavior at Varying Market Penetrations: A Driving Simulator Study

Impact of Connected Vehicle Safety Applications on Driving Behavior at Varying Market Penetrations: A Driving Simulator Study Louisiana State University LSU Digital Commons LSU Master's Theses Graduate School 2017 Impact of Connected Vehicle Safety Applications on Driving Behavior at Varying Market Penetrations: A Driving Simulator

More information

Multimedia Virtual Laboratory: Integration of Computer Simulation and Experiment

Multimedia Virtual Laboratory: Integration of Computer Simulation and Experiment Multimedia Virtual Laboratory: Integration of Computer Simulation and Experiment Tetsuro Ogi Academic Computing and Communications Center University of Tsukuba 1-1-1 Tennoudai, Tsukuba, Ibaraki 305-8577,

More information

Gaze Interaction and Gameplay for Generation Y and Baby Boomer Users

Gaze Interaction and Gameplay for Generation Y and Baby Boomer Users Gaze Interaction and Gameplay for Generation Y and Baby Boomer Users Mina Shojaeizadeh, Siavash Mortazavi, Soussan Djamasbi User Experience & Decision Making Research Laboratory, Worcester Polytechnic

More information

DAARIA: Driver Assistance by Augmented Reality for Intelligent Automobile

DAARIA: Driver Assistance by Augmented Reality for Intelligent Automobile DAARIA: Driver Assistance by Augmented Reality for Intelligent Automobile Paul George, Indira Thouvenin, Vincent Fremont, Véronique Cherfaoui To cite this version: Paul George, Indira Thouvenin, Vincent

More information

Development and Validation of Virtual Driving Simulator for the Spinal Injury Patient

Development and Validation of Virtual Driving Simulator for the Spinal Injury Patient CYBERPSYCHOLOGY & BEHAVIOR Volume 5, Number 2, 2002 Mary Ann Liebert, Inc. Development and Validation of Virtual Driving Simulator for the Spinal Injury Patient JEONG H. KU, M.S., 1 DONG P. JANG, Ph.D.,

More information

Arbitrating Multimodal Outputs: Using Ambient Displays as Interruptions

Arbitrating Multimodal Outputs: Using Ambient Displays as Interruptions Arbitrating Multimodal Outputs: Using Ambient Displays as Interruptions Ernesto Arroyo MIT Media Laboratory 20 Ames Street E15-313 Cambridge, MA 02139 USA earroyo@media.mit.edu Ted Selker MIT Media Laboratory

More information

Calling While Driving: An Initial Experiment with HoloLens

Calling While Driving: An Initial Experiment with HoloLens University of Iowa Iowa Research Online Driving Assessment Conference 2017 Driving Assessment Conference Jun 28th, 12:00 AM Calling While Driving: An Initial Experiment with HoloLens Andrew L. Kun University

More information

COPYRIGHTED MATERIAL. Overview

COPYRIGHTED MATERIAL. Overview In normal experience, our eyes are constantly in motion, roving over and around objects and through ever-changing environments. Through this constant scanning, we build up experience data, which is manipulated

More information

Gaze informed View Management in Mobile Augmented Reality

Gaze informed View Management in Mobile Augmented Reality Gaze informed View Management in Mobile Augmented Reality Ann M. McNamara Department of Visualization Texas A&M University College Station, TX 77843 USA ann@viz.tamu.edu Abstract Augmented Reality (AR)

More information

Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment

Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment Evaluation of Guidance Systems in Public Infrastructures Using Eye Tracking in an Immersive Virtual Environment Helmut Schrom-Feiertag 1, Christoph Schinko 2, Volker Settgast 3, and Stefan Seer 1 1 Austrian

More information

COPYRIGHTED MATERIAL OVERVIEW 1

COPYRIGHTED MATERIAL OVERVIEW 1 OVERVIEW 1 In normal experience, our eyes are constantly in motion, roving over and around objects and through ever-changing environments. Through this constant scanning, we build up experiential data,

More information

EVALUATION OF COMPLEX AT-GRADE RAIL CROSSING DESIGNS USING A DRIVER SIMULATION

EVALUATION OF COMPLEX AT-GRADE RAIL CROSSING DESIGNS USING A DRIVER SIMULATION EVALUATION OF COMPLEX AT-GRADE RAIL CROSSING DESIGNS USING A DRIVER SIMULATION Authors: John Robinson, Ph.D., P. Eng. Delphi-MRC Alison Smiley, Ph.D., CCPE Human Factors North Jeff Caird, Ph.D. University

More information

AR 2 kanoid: Augmented Reality ARkanoid

AR 2 kanoid: Augmented Reality ARkanoid AR 2 kanoid: Augmented Reality ARkanoid B. Smith and R. Gosine C-CORE and Memorial University of Newfoundland Abstract AR 2 kanoid, Augmented Reality ARkanoid, is an augmented reality version of the popular

More information

Discriminating direction of motion trajectories from angular speed and background information

Discriminating direction of motion trajectories from angular speed and background information Atten Percept Psychophys (2013) 75:1570 1582 DOI 10.3758/s13414-013-0488-z Discriminating direction of motion trajectories from angular speed and background information Zheng Bian & Myron L. Braunstein

More information

Baby Boomers and Gaze Enabled Gaming

Baby Boomers and Gaze Enabled Gaming Baby Boomers and Gaze Enabled Gaming Soussan Djamasbi (&), Siavash Mortazavi, and Mina Shojaeizadeh User Experience and Decision Making Research Laboratory, Worcester Polytechnic Institute, 100 Institute

More information

Quantitative Comparison of Interaction with Shutter Glasses and Autostereoscopic Displays

Quantitative Comparison of Interaction with Shutter Glasses and Autostereoscopic Displays Quantitative Comparison of Interaction with Shutter Glasses and Autostereoscopic Displays Z.Y. Alpaslan, S.-C. Yeh, A.A. Rizzo, and A.A. Sawchuk University of Southern California, Integrated Media Systems

More information

Perception vs. Reality: Challenge, Control And Mystery In Video Games

Perception vs. Reality: Challenge, Control And Mystery In Video Games Perception vs. Reality: Challenge, Control And Mystery In Video Games Ali Alkhafaji Ali.A.Alkhafaji@gmail.com Brian Grey Brian.R.Grey@gmail.com Peter Hastings peterh@cdm.depaul.edu Copyright is held by

More information

Adapting SatNav to Meet the Demands of Future Automated Vehicles

Adapting SatNav to Meet the Demands of Future Automated Vehicles Beattie, David and Baillie, Lynne and Halvey, Martin and McCall, Roderick (2015) Adapting SatNav to meet the demands of future automated vehicles. In: CHI 2015 Workshop on Experiencing Autonomous Vehicles:

More information

THE SCHOOL BUS. Figure 1

THE SCHOOL BUS. Figure 1 THE SCHOOL BUS Federal Motor Vehicle Safety Standards (FMVSS) 571.111 Standard 111 provides the requirements for rear view mirror systems for road vehicles, including the school bus in the US. The Standards

More information

NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS

NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS NAVIGATIONAL CONTROL EFFECT ON REPRESENTING VIRTUAL ENVIRONMENTS Xianjun Sam Zheng, George W. McConkie, and Benjamin Schaeffer Beckman Institute, University of Illinois at Urbana Champaign This present

More information

Perspective of Reality

Perspective of Reality Perspective of Reality [1] Ch. Aishwarya, [2] R. Sai Sravya, [3] P. Siva Parvathi [1][2][3] Department of Computer Science and Engineering. G. Narayanamma Institute of Science and Technology (for Women)

More information

Do Stereo Display Deficiencies Affect 3D Pointing?

Do Stereo Display Deficiencies Affect 3D Pointing? Do Stereo Display Deficiencies Affect 3D Pointing? Mayra Donaji Barrera Machuca SIAT, Simon Fraser University Vancouver, CANADA mbarrera@sfu.ca Wolfgang Stuerzlinger SIAT, Simon Fraser University Vancouver,

More information

Research on visual physiological characteristics via virtual driving platform

Research on visual physiological characteristics via virtual driving platform Special Issue Article Research on visual physiological characteristics via virtual driving platform Advances in Mechanical Engineering 2018, Vol. 10(1) 1 10 Ó The Author(s) 2018 DOI: 10.1177/1687814017717664

More information

COMPARISON OF DRIVER DISTRACTION EVALUATIONS ACROSS TWO SIMULATOR PLATFORMS AND AN INSTRUMENTED VEHICLE.

COMPARISON OF DRIVER DISTRACTION EVALUATIONS ACROSS TWO SIMULATOR PLATFORMS AND AN INSTRUMENTED VEHICLE. COMPARISON OF DRIVER DISTRACTION EVALUATIONS ACROSS TWO SIMULATOR PLATFORMS AND AN INSTRUMENTED VEHICLE Susan T. Chrysler 1, Joel Cooper 2, Daniel V. McGehee 3 & Christine Yager 4 1 National Advanced Driving

More information

Welcome to this course on «Natural Interactive Walking on Virtual Grounds»!

Welcome to this course on «Natural Interactive Walking on Virtual Grounds»! Welcome to this course on «Natural Interactive Walking on Virtual Grounds»! The speaker is Anatole Lécuyer, senior researcher at Inria, Rennes, France; More information about him at : http://people.rennes.inria.fr/anatole.lecuyer/

More information

Chapter 6. Experiment 3. Motion sickness and vection with normal and blurred optokinetic stimuli

Chapter 6. Experiment 3. Motion sickness and vection with normal and blurred optokinetic stimuli Chapter 6. Experiment 3. Motion sickness and vection with normal and blurred optokinetic stimuli 6.1 Introduction Chapters 4 and 5 have shown that motion sickness and vection can be manipulated separately

More information

Admin. Today: Designing for Virtual Reality VR and 3D interfaces Interaction design for VR Prototyping for VR

Admin. Today: Designing for Virtual Reality VR and 3D interfaces Interaction design for VR Prototyping for VR HCI and Design Admin Reminder: Assignment 4 Due Thursday before class Questions? Today: Designing for Virtual Reality VR and 3D interfaces Interaction design for VR Prototyping for VR 3D Interfaces We

More information

Effects of Visual-Vestibular Interactions on Navigation Tasks in Virtual Environments

Effects of Visual-Vestibular Interactions on Navigation Tasks in Virtual Environments Effects of Visual-Vestibular Interactions on Navigation Tasks in Virtual Environments Date of Report: September 1 st, 2016 Fellow: Heather Panic Advisors: James R. Lackner and Paul DiZio Institution: Brandeis

More information

Introduction to Psychology Prof. Braj Bhushan Department of Humanities and Social Sciences Indian Institute of Technology, Kanpur

Introduction to Psychology Prof. Braj Bhushan Department of Humanities and Social Sciences Indian Institute of Technology, Kanpur Introduction to Psychology Prof. Braj Bhushan Department of Humanities and Social Sciences Indian Institute of Technology, Kanpur Lecture - 10 Perception Role of Culture in Perception Till now we have

More information

An Autonomous Vehicle Navigation System using Panoramic Machine Vision Techniques

An Autonomous Vehicle Navigation System using Panoramic Machine Vision Techniques An Autonomous Vehicle Navigation System using Panoramic Machine Vision Techniques Kevin Rushant, Department of Computer Science, University of Sheffield, GB. email: krusha@dcs.shef.ac.uk Libor Spacek,

More information

LED flicker: Root cause, impact and measurement for automotive imaging applications

LED flicker: Root cause, impact and measurement for automotive imaging applications https://doi.org/10.2352/issn.2470-1173.2018.17.avm-146 2018, Society for Imaging Science and Technology LED flicker: Root cause, impact and measurement for automotive imaging applications Brian Deegan;

More information

TxDOT Project : Evaluation of Pavement Rutting and Distress Measurements

TxDOT Project : Evaluation of Pavement Rutting and Distress Measurements 0-6663-P2 RECOMMENDATIONS FOR SELECTION OF AUTOMATED DISTRESS MEASURING EQUIPMENT Pedro Serigos Maria Burton Andre Smit Jorge Prozzi MooYeon Kim Mike Murphy TxDOT Project 0-6663: Evaluation of Pavement

More information

Augmented Reality Head-Up-Display for Advanced Driver Assistance System: A Driving Simulation

Augmented Reality Head-Up-Display for Advanced Driver Assistance System: A Driving Simulation Augmented Reality Head-Up-Display for Advanced Driver Assistance System: A Driving Simulation Study Lynda Halit, Andras Kemeny, Hakim Mohellebi, Samir Garbaya, Frédéric Merienne, Sylvain Michelin, Valentin

More information

HAPTICS AND AUTOMOTIVE HMI

HAPTICS AND AUTOMOTIVE HMI HAPTICS AND AUTOMOTIVE HMI Technology and trends report January 2018 EXECUTIVE SUMMARY The automotive industry is on the cusp of a perfect storm of trends driving radical design change. Mary Barra (CEO

More information

Intelligent Technology for More Advanced Autonomous Driving

Intelligent Technology for More Advanced Autonomous Driving FEATURED ARTICLES Autonomous Driving Technology for Connected Cars Intelligent Technology for More Advanced Autonomous Driving Autonomous driving is recognized as an important technology for dealing with

More information

MULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT

MULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT MULTI-LAYERED HYBRID ARCHITECTURE TO SOLVE COMPLEX TASKS OF AN AUTONOMOUS MOBILE ROBOT F. TIECHE, C. FACCHINETTI and H. HUGLI Institute of Microtechnology, University of Neuchâtel, Rue de Tivoli 28, CH-2003

More information

Driving Simulators for Commercial Truck Drivers - Humans in the Loop

Driving Simulators for Commercial Truck Drivers - Humans in the Loop University of Iowa Iowa Research Online Driving Assessment Conference 2005 Driving Assessment Conference Jun 29th, 12:00 AM Driving Simulators for Commercial Truck Drivers - Humans in the Loop Talleah

More information

Regan Mandryk. Depth and Space Perception

Regan Mandryk. Depth and Space Perception Depth and Space Perception Regan Mandryk Disclaimer Many of these slides include animated gifs or movies that may not be viewed on your computer system. They should run on the latest downloads of Quick

More information