Steering a Driving Simulator Using the Queueing Network-Model Human Processor (QN-MHP)

University of Iowa, Iowa Research Online, Driving Assessment Conference, July 22, 2003.

Citation: Tsimhoni, Omer, and Liu, Yili. Steering a Driving Simulator Using the Queueing Network-Model Human Processor (QN-MHP). In: Proceedings of the Second International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, July 21-24, 2003, Park City, Utah. Iowa City, IA: Public Policy Center, University of Iowa, 2003: 81-85. https://doi.org/10.17077/drivingassessment.1100

STEERING A DRIVING SIMULATOR USING THE QUEUEING NETWORK-MODEL HUMAN PROCESSOR (QN-MHP)

Omer Tsimhoni and Yili Liu
Department of Industrial and Operations Engineering
University of Michigan
Ann Arbor, Michigan, USA
E-mail: omert@umich.edu

Summary: The Queueing Network-Model Human Processor (QN-MHP) is a computational architecture that combines the mathematical theories and simulation methods of queueing networks (QN) with the symbolic and procedural methods of GOMS analysis and the Model Human Processor (MHP). QN-MHP has been successfully used to model reaction time tasks and visual search tasks (Feyen and Liu, 2001a, b). This paper describes our work using QN-MHP to model vehicle steering and to steer a driving simulator as a step toward modeling more complex driving scenarios. The steering model was implemented in ProModel, a commercially available simulation program. A network of 20 servers represents different functional modules of the human perceptual, cognitive, and motor information processing system. Entities carrying information on vehicle location and orientation arrive at and flow through the visual, cognitive, and motor sub-networks of the system and are processed independently and concurrently by the servers. The QN-MHP steering model was interfaced with a driving simulator (DriveSafety) using an Ethernet protocol and several custom-built software modules. Heading and location information were received in real time from the simulator and processed through the servers. Whenever the model made a hand movement, the corresponding position of the steering wheel was transferred to the simulator, thus steering the simulated vehicle. The model demonstrated realistic steering behavior: it steered the driving simulator within the lane boundaries of straight sections and curves of varying curvature. This work showed the potential strength of QN-MHP as a model of driving behavior. Ongoing work will further develop the model by expanding the scope of the driving task and by adding secondary in-vehicle tasks.

INTRODUCTION

Computational cognitive models can contribute considerably to driving-related human factors research. They can make quantitative predictions for scenarios that have not been tested and provide a precise common language for describing phenomena of interest. Further, computational models have a symbiotic relationship with empirical research. Empirical findings can be integrated into models to strengthen their validity and expand their scope. In turn, models can identify gaps in the empirical literature and point to new directions of research.

In this paper we describe our effort toward modeling driving using a novel computational model, called the Queueing Network-Model Human Processor (QN-MHP).

The QN-MHP is a computational architecture that combines the mathematical theories and simulation methods of queueing networks with the symbolic and procedural methods of GOMS analysis and the Model Human Processor (MHP). As a network architecture, queueing networks are particularly suited for modeling parallel activities and complex mental architectures. Symbolic models have particular strength in generating a person's actions in specific task situations. By integrating the two complementary approaches, the QN-MHP offers a modeling and simulation architecture for generating, in real time, and mathematically modeling parallel and complex activities. QN-MHP has been successfully used to model reaction time tasks and visual search tasks (Feyen and Liu, 2001a, b). In this paper, we describe our work in modeling steering of a driving simulator using the QN-MHP.

QN-MHP AND PROMODEL

QN-MHP is implemented in ProModel (ProModel Solutions, version 2001), simulation software that is widely used for manufacturing and operational applications and provides a natural programming environment for queueing network simulation. In addition, it has built-in analysis tools and strong visualization capabilities.

In QN-MHP, 20 servers represent different functional modules of the human perceptual, cognitive, and motor information processing system (Figure 1; see Feyen and Liu, 2001a, for more details). Customers enter the perceptual subnetwork carrying perceptual information, which is then processed by the cognitive subnetwork and converted into actions, carried out by the motor subnetwork. The flow of customers through the network can be visualized in real time to provide an assessment of the utilization of servers and the progress of actions. An output data file, documenting overt actions (e.g., hand movements and eye movements) and variable status (e.g., perceived vehicle information), is produced for post-simulation analysis.

Figure 1. Layout of the servers in QN-MHP and the flow of information between them
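To make the flow of customers concrete, the short C++ sketch below runs a few information entities through three single-server stages standing in for the perceptual, cognitive, and motor sub-networks. The three-stage topology, the entity arrivals, and the service times are illustrative placeholders only; they are not the 20-server QN-MHP network or its calibrated parameters, which are implemented in ProModel.

#include <algorithm>
#include <cstdio>
#include <functional>
#include <queue>
#include <vector>

struct Event {
    double time;   // simulated time at which the entity reaches this stage
    int entity;    // id of the information entity (customer)
    int stage;     // 0 = perceptual, 1 = cognitive, 2 = motor, 3 = done
    bool operator>(const Event& other) const { return time > other.time; }
};

int main() {
    const double serviceTime[3] = {0.100, 0.070, 0.030};   // seconds, illustrative only
    const char*  stageName[3]   = {"perceptual", "cognitive", "motor"};
    double freeAt[3] = {0.0, 0.0, 0.0};   // when each single-server stage becomes idle

    std::priority_queue<Event, std::vector<Event>, std::greater<Event>> events;
    for (int e = 0; e < 5; ++e)           // five entities arrive 50 ms apart
        events.push({0.050 * e, e, 0});

    while (!events.empty()) {
        Event ev = events.top();
        events.pop();
        if (ev.stage == 3) {              // left the motor sub-network: an overt action
            std::printf("t=%.3f s  entity %d produces a motor action\n", ev.time, ev.entity);
            continue;
        }
        // Wait in queue if the server is busy, then hold it for the service time.
        double start  = std::max(ev.time, freeAt[ev.stage]);
        double finish = start + serviceTime[ev.stage];
        freeAt[ev.stage] = finish;
        std::printf("t=%.3f s  entity %d in %s server until %.3f s\n",
                    start, ev.entity, stageName[ev.stage], finish);
        events.push({finish, ev.entity, ev.stage + 1});   // hand off to the next sub-network
    }
    return 0;
}

Because each entity is driven by its own events, several entities can be in service in different sub-networks at the same simulated time, which is the property that makes a queueing network suitable for modeling concurrent activities.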

The front-end of the model is an MS Excel file with data about the environment (stimuli and object descriptions), the actuators and actions in use, parameters available in long-term memory, and a goal list. The goal list is based on a GOMS task analysis using the defined actions and actuators as they interact with the environment. One of the servers in the network, the task selection server, scans this list step by step to provide the next step that needs to be performed, based on the current goal and method being processed. Additional elements, such as if-then statements and choice probabilities, enhance the range of scenarios that can be modeled.

THE STEERING MODEL

The steering model follows a goal-oriented analysis. The main goal of maintaining the lane consists of subgoals for detecting the orientation of the vehicle, selecting a steering action, and performing a corresponding hand movement. In order to detect the orientation of the vehicle, a motor action is produced for moving the eyes to the road scene, and a request for information from the visual system is made. Information is continuously perceived from the road scene, except when the eyes are moving (saccadic suppression). As a result, steering actions are triggered by the cognitive subnetwork in response to the analysis of the current state of the vehicle in comparison to the desired state. These steering actions can be normal steering actions or, when the vehicle is about to depart the lane, imminent steering actions. The process of maintaining the lane is continuous, and once an action is initiated, it flows through the network independently and concurrently with other actions.

The steering model combines several concepts based on the current literature (Table 1). (For further discussion of the steering model, see Tsimhoni and Liu, 2003.)

Table 1. Concepts used in the steering model

Hierarchical task analysis: Driving a vehicle is described as a hierarchical combination of tasks.
Availability of visual input: Image processing is not performed explicitly. Rather, an estimated processing time is added and the extracted data are retrieved directly.
Roles of focal and ambient visual systems: Most of the visual input for steering is perceived by the ambient visual system, around the lane markers in front of the vehicle.
Concurrent cognitive processing: Eye movements, information analysis, and motor actions are performed concurrently.
Limited speed control: The model currently operates at fixed speeds.
Steering movements: The steering wheel is moved in single-phase open-loop corrections followed by closed-loop adjustments.
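As a rough picture of the "select a steering action" step described above, the C++ sketch below maps a perceived lateral offset and heading error to a steering correction and switches to a larger, imminent correction near the lane boundary. The function name, the gains, and the departure threshold are invented for illustration; they are not the parameters or control law of the QN-MHP steering model, in which the resulting hand movement is executed as a single-phase open-loop correction followed by closed-loop adjustments.

#include <cmath>
#include <cstdio>

// Perceived state of the vehicle, as delivered by the visual sub-network.
struct PerceivedState {
    double lateralOffset;   // m from the lane centre (positive = right of centre)
    double headingError;    // rad between vehicle heading and road heading
};

// Select a steering action: a normal correction, or a larger, imminent
// correction when the vehicle is close to the lane boundary.
// Returns the desired change in steering-wheel angle (rad).
double selectSteeringAction(const PerceivedState& s, double laneHalfWidth) {
    const double kOffset  = 0.8;   // illustrative gains, not model parameters
    const double kHeading = 2.0;
    double correction = -(kOffset * s.lateralOffset + kHeading * s.headingError);
    if (std::fabs(s.lateralOffset) > 0.8 * laneHalfWidth)   // about to depart the lane
        correction *= 2.0;                                  // imminent steering action
    return correction;
}

int main() {
    const double laneHalfWidth = 1.8;        // m, half of a 3.6 m lane (illustrative)
    PerceivedState drift   = {0.20, 0.01};   // small drift: normal action
    PerceivedState leaving = {1.50, 0.05};   // near the boundary: imminent action
    std::printf("normal action:   %+.3f rad\n", selectSteeringAction(drift, laneHalfWidth));
    std::printf("imminent action: %+.3f rad\n", selectSteeringAction(leaving, laneHalfWidth));
    return 0;
}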

INTERFACE WITH A DRIVING SIMULATOR

To provide an off-the-shelf vehicle dynamics module that interacts with the steering model yet is independent of it, and to examine the ability of the QN-MHP steering model to produce relevant steering actions in real time, ProModel was interfaced with the DriveSafety Research Simulator, a high-fidelity driving simulation system used for driving research and training. It utilizes a dynamics model that can be adjusted to simulate a variety of vehicle types. It keeps track of numerous state variables and can output them to external devices. For communication with external devices, the driving simulator uses the TCP/IP protocol. Although it is normally operated via a steering wheel and pedals installed in a simulated car, it can also be controlled externally by receiving inputs digitally.

Communication between ProModel and the driving simulator (Figure 2) was implemented via a TCP/IP host, created as an independent process by a communication DLL on the ProModel computer. ProModel sent and received data as function calls directly to the DLL. The driving simulator sent and received data through a TCP client that was connected to the host. The communication protocol accommodated the event-based approach of ProModel and the time-based approach of the driving simulator. Whenever the QN-MHP model made a glance to a specific position in the road scene and information was assumed to be available to it, the corresponding information was retrieved from the communication thread. Whenever a hand or eye movement was made by the model, the intermediate or final steering wheel position and the area of fixation were output to the driving simulator via the communication thread. The driving simulator retrieved the steering angle and eye position continuously to keep the virtual steering wheel at the desired position and to show the area of fixation overlaid on the road scene.

Figure 2. Block diagram of the interface between QN-MHP and the driving simulator: QN-MHP (ProModel) sends new steering angles and requests and receives driving data through a communication DLL (C++, TCP), and requests and receives time adjustments from a time adjustment DLL (C++); the driving simulator, with built-in driving dynamics, sends driving data and receives steering angles.

Since ProModel is an event-based simulation program, its speed of operation varies as a function of the number of events fired at any given moment. Thus, the simulated time progresses at an irregular speed and differs from the actual time. To overcome this problem, an external time adjustment function was created and integrated with ProModel via a DLL (dynamic link library). This function was called by ProModel every 10 ms to synchronize its internal clock with the actual clock. It was assumed (and verified) that the computer running ProModel was always fast enough to perform all the required events faster than their real-time equivalent.
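The time-adjustment idea can be sketched as a loop that paces simulated time to the wall clock. The C++ fragment below is a stand-alone approximation of that role, under the same assumption stated above, namely that every 10 ms slice of events can be processed faster than real time; it is not the original time adjustment DLL.

#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const auto step = std::chrono::milliseconds(10);   // the model is called every 10 ms
    const auto wallStart = clock::now();
    auto simTime = std::chrono::milliseconds(0);

    for (int i = 0; i < 100; ++i) {                    // one second of simulated time
        // ... here the event-based model would process every event that
        //     falls inside this 10 ms slice of simulated time ...
        simTime += step;
        // Pause until the wall clock catches up with the simulated clock,
        // keeping the model in step with the real-time driving simulator.
        std::this_thread::sleep_until(wallStart + simTime);
    }
    const auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
        clock::now() - wallStart);
    std::printf("simulated 1.000 s of model time in %lld ms of wall time\n",
                static_cast<long long>(elapsed.count()));
    return 0;
}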

TEST DRIVE

QN-MHP was successful in steering the driving simulator on a test course. Figure 3 shows the physical layout. The driving simulator demonstrated realistic steering behavior. It remained within the lane boundaries of straight sections and curves of varying radii. Transfer of information between the software modules of the system was smooth, and timing delays were short. Most important, the continuous flow of information through the QN-MHP servers appeared to represent the workload of steering a vehicle in its lane.

Figure 3. QN-MHP steering the driving simulator

DISCUSSION

The work presented here illustrates the ability of the QN-MHP to model the concurrent perceptual, cognitive, and motor activities of steering. Using a straightforward interface and a GOMS-like task analysis, the model processed external information and created steering actions to maintain the vehicle in its lane. Since the structure of QN-MHP is context free, the steering model did not require manipulation of the architecture of the model.

The successful interfacing of ProModel with a driving simulator opens a range of possible areas of research and application for the current model. For example, it allows a visible, real-time demonstration of the steering strategy implemented in the model. It may also serve as an autopilot for the simulator.

The potential strength of QN-MHP as a model of driving behavior lies in its ability to add concurrent activities without limiting or predefining their order of occurrence. The success of modeling the concurrent perceptual, cognitive, and motor activities of steering in a truly concurrent network architecture opens the door to modeling other concurrent activities. Our ongoing research builds upon the current work and expands it in two respects: (1) the driving task will be expanded to include speed control and to alter behavior based on traffic, and (2) a secondary in-vehicle task will be added as a parallel activity. As other perceptual modalities are added to the QN-MHP architecture (e.g., vestibular, auditory), their addition to the driving task, and their effects on it, will be investigated.

REFERENCES

Feyen, R. G., & Liu, Y. (2001a). Modeling task performance using the queuing network-model human processor (QN-MHP). Proceedings of the 4th International Conference on Cognitive Modeling.

Feyen, R. G., & Liu, Y. (2001b). The queuing network-model human processor (QN-MHP): An engineering approach for modeling cognitive performance. Proceedings of the Human Factors and Ergonomics Society 45th Annual Meeting.

Tsimhoni, O., & Liu, Y. (2003). Modeling steering using the queuing network-model human processor (QN-MHP). Proceedings of the Human Factors and Ergonomics Society 47th Annual Meeting.