Designing A Human Vehicle Interface For An Intelligent Community Vehicle

Kin Kok Lee, Yong Tsui Lee and Ming Xie
School of Mechanical & Production Engineering, Nanyang Technological University, Nanyang Avenue, Singapore 639798
Email: p7311542f@ntu.edu.sg

Abstract

This paper describes the design of a human vehicle interface (HVI) for an autonomous road vehicle. It presents ongoing research in the Intelligent Vehicle Laboratory of NTU, and draws on the work of fellow researchers to illustrate the integration of the vehicle's physical systems with the human interface. The experimental platform is an electric vehicle with an on-board computer. Its sensory suite includes modular active stereo eyes for road tracking and vehicle following, ultrasonic sensors for near-field obstacle detection, and GPS receivers for vehicle localisation and tracking. The interactive components are a motion control module for driving and a communications module for tele-operation or operator assistance. The fusion of sensors and other vehicle components, the representation of information, and the human-task-vehicle interaction concept are discussed with consideration of ergonomics and safety.

1. Introduction

This intelligent electric vehicle project arises from the need for a community vehicle to reduce traffic congestion and the cost of car ownership in cities like Tokyo and Singapore. A community vehicle is a car-sharing concept [1] within a housing estate or an industrial park. For example, the residents of a suburban housing estate can share a number of such vehicles. The vehicles travel only within the estate, providing access to amenities and the mass public transport system. They are dispatched on demand from the vehicle station to the customer and returned to the station after the customer has completed his journey.

Figure 1. Intelligent community car operation concept.
The main research project is an intelligent vehicle that can drive autonomously, without a human driver, while being dispatched and returned. The vehicle has built-in sensory suites and the intelligence (software) to travel autonomously to the required location. The intelligent system also acts as a danger watchdog and navigation aid for a human driver. Non-drivers can also travel in the vehicle using the autonomous mode, providing a truly low-cost public transport system with doorstep convenience.
Figure 2. Our experimental vehicle from Robosoft.

2. Human Vehicle Interface

As our intelligent vehicle is to be capable of operating in both fully autonomous and manual modes, by users with different levels of driving skill or by non-drivers, the design of the human-vehicle interface (HVI) is an important issue in this research. By addressing the HVI early, the prototype vehicle will have an HVI that can be operated not only by researchers but also by the public. A further benefit is that researchers working on different components of the vehicle remain constantly aware of how their part integrates with the human interface. The aim of this research is to achieve a simple, intuitive HVI despite the complexity of the vehicle system, and to present feedback information in the most direct, digestible form. To a qualified driver, the HVI is close to a conventional car layout except for the addition of a navigation guide, a traffic information system, and a road hazard prevention and warning system. To a non-driver, the HVI is a guided push-button affair for getting to the required destination.

3. Human machine interface design criteria

The design of a general human machine interface can be assessed by four criteria:

1) Functionality. The ability of the human machine interface to translate human intention into results correctly and consistently.

2) Usability. The highest level of usability imposes the least memory load (remembering procedures) and uses the fewest steps to achieve the result.

3) Ergonomics. Human ergonomics applies to the position and size of the interactive components, e.g. the steering wheel and the interactive console. It also applies to the design of the graphical user interface, such as the screen layout [2].

4) Safety. In our project this is a prime concern, because the lives of the driver and other road users are at stake. For example, if operating the navigation guide distracts the driver's attention from the road or increases his workload, it is likely to cause accidents.

4. Design methodology

We adopted a human-orientated approach instead of the system-orientated approach found in most robotics and intelligent vehicle designs. A system-orientated (also called "system up") approach gives a researcher full control over, and information from, every system in a machine, but a non-technical user would be overwhelmed by the numerous dials and buttons. A human-orientated approach looks from the perspective of the human's needs and integrates each system to perform the actions that satisfy each need. This approach ensures that the human always feels in control of the machine, and not the other way round. Defining this approach is especially important in guiding the design of the human interface component of the hazard warning and prevention system: the system performs corrective actions, such as steering away from a potential collision, without appearing to override the driver's actions. The design and issues related to that system are described in later sections.

The human's need is what the driver would like to achieve. To illustrate the methodology, suppose a driver would like to know the route to destination A. This is his need. He finds and activates the route planning function through a touch-screen console. The on-board computer prompts for the destination, commands the Global Positioning System (GPS) to locate the present position of the vehicle, and obtains whatever other information it requires to plot the route. It then displays the plotted route overlaid on a digital map on the touch screen. The interactive part thus consists of the human identifying and activating the function, whereas the action consists of information processing and output by the vehicle system. The design methodology can be summarised as follows:
GOAL / NEED  ->  TASK / WORK  ->  VEHICLE / ACTION
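As an illustration only, the route-planning example of the methodology can be sketched in code. All names below (functions, classes, map data) are hypothetical stand-ins, not our actual implementation: the driver expresses one need through the console, and the vehicle side does the locating and plotting.

```python
# Sketch of the need -> task -> vehicle-action flow for route planning.
# Every name here is a hypothetical stand-in for illustration only.

def plan_route(destination, gps, map_db):
    """Vehicle side: locate the car, then plot a route on the digital map."""
    origin = gps.locate()                      # GPS fixes the present position
    return map_db.find_route(origin, destination)

class FakeGPS:
    def locate(self):
        return "vehicle station"               # stand-in for a real GPS fix

class FakeMap:
    def find_route(self, origin, destination):
        return [origin, "main avenue", destination]

# Human side: the driver's need ("route to destination A") is expressed
# through one touch-screen action; everything else is done by the vehicle.
route = plan_route("destination A", FakeGPS(), FakeMap())
print(route)   # ['vehicle station', 'main avenue', 'destination A']
```

The point of the sketch is the division of labour: the interactive part is a single human action, while the GPS query and route computation stay hidden inside the vehicle system.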
5. Identifying human needs in intelligent vehicle

We attempted to list the human needs (not exhaustive) in driving a car and to place them into function subgroups [3, 4]. We do not separate the needs of drivers from those of non-drivers, because we are designing a common interface; all the needs are listed together.

Table 1. A list of needs grouped according to function.

Routing & Navigation Functions: 1. Route Planning; 2. Travel Time Estimation; 3. Vehicle Localising; 4. Route Navigation Guidance; 5. Mass Transport Connection; 6. Route Changing; 7. Fleet Management.
Safety & Warning Functions: 8. Obstacle Warning; 9. Collision Prevention; 10. Overtaking Aid; 11. Automatic Emergency Call.
Services & Traffic Information Functions: 12. Traffic News; 13. Traffic Sign Notification; 14. Traffic Sign Information; 15. Manual Dial-Out Service; 16. Essential Amenities; 17. Shops Directory.
Autonomous Functions: 18. Automatic Parking; 19. Automatic Lane Following; 20. Cruise Control; 21. Fully Autonomous Driving.
Vehicle Control Functions: 22. Steering Control; 23. Speed Control; 24. Vehicle Status Monitoring.

The needs are grouped according to common human interface issues and system requirements. This allows us to modularise the design and development of the HVI.

6. Human vehicle interface issues

This section deals with the HVI issues and with the integration of the subsystems (both hardware and software) with the human interface. Attention is given to how the various vehicle subsystems are linked by the requirements of each function group.

6.1 Routing & Navigation

Table 2. Subsystems for routing & navigation.

1. Differential GPS: The positioning and tracking system.
2. Route Planning: Includes the digital map, the landmark and street database, and the route-finding algorithm.
3. Navigation Aid: A software subsystem that takes information from the above two subsystems and generates a series of instructions, both visual and spoken, to guide the driver.

Table 3.
Human Vehicle Interface components.

1. Graphical User Interface (GUI): The touch screen is the main interactive component for the majority of the functions. The user selects guided choices from the GUI to tell the route planning subsystem his destination, and sees the route on the screen. It provides the input for the rest of the needs as well. The human interface issues involved in the design of the GUI deserve their own section of discussion later.
2. Speech Recognition and Talking Module: This module allows natural human interaction with the machine. We are using Naturally Speaking from Dragon Systems for our speech recognition.

The HVI issues in this function group largely involve the GUI, so they are described in the GUI section. Navigation guidance is a popular research topic in the automobile industry. At this stage, we have decided to use simple visual cues and speech for guidance. A driver can rely on a simple lighted arrow that points in the direction he needs to turn at the next junction, or he can listen to the voice instructions. The position and type of the visual cues and the construction of the voice instructions are also issues for research.

6.2 Safety & Warning

Table 4. Subsystems for safety & warning.

1. Proximity Obstacle Detection: The vehicle uses ultrasonic sensors to detect obstacles within 6 metres. It is normally used during parking or narrow-lane manoeuvring.
2. Active Vision: The vision subsystem looks ahead (beyond 3 m) for vehicles and other objects on the road, identifying and tracking them.
3. Collision Warning and Prevention Module: A software module that takes information from the sensory subsystems and decides whether to warn the driver or to activate corrective actions that override the driver.
4. Emergency Phone Call: The vehicle has a 9600 bps GSM modem that accepts voice and data. The modem automatically connects to the control centre whenever a vehicle breakdown or accident occurs.

Table 5. Human Vehicle Interface components.

1. Graphical User Interface (GUI): The GUI provides a map of the vehicle's proximity during manual parking. It is, however, not used as a warning device, as that would take the driver's attention away from the hazard.
2. Visual Warning: A flashing-light cue that alerts the driver to the hazard.
3. 3-D Positioning Audio Device: Using four speakers, it is possible to create a sound source that a human driver can localise through 360 degrees.
4. Talking Module: A verbal warning option that gives the driver information about the hazard and avoidance advice.
5. Force Feedback Steering Wheel: Relays actual road-condition information to the driver; this feedback is lacking in an electrically driven car. The force feedback also serves as a warning cue when the vehicle overrides the driver.

In this group of functions, the design criteria of ergonomics and safety are of utmost importance. Designed and used properly, the system can prevent accidents; if not, lives could be lost. Fortunately, computer processing power has reached the level at which we can do 3-D sound positioning in real time. A 3-D sound hazard warning, perhaps in the form of a car horn, can alert the driver to the direction of a hazard immediately. To illustrate this, consider reverse parking. The driver is manually parking the vehicle. If a child suddenly walks behind the vehicle, the ultrasonic sensors immediately detect the child and display the obstacle on the proximity map. If the driver continues to reverse, a car-horn sound is played through the four speakers.
By processing the location data from the ultrasonic sensors, the sound appears to come from the direction of the obstacle. The force feedback wheel applies a counter-force to prevent the driver from steering into the child. A vibratory force can also be applied to the steering wheel, to make the driver feel as if the car were scraping a wall. Through hearing and touch, the driver becomes instantly aware of the hazard and its location. This is more effective than a flashing light or a warning siren alone, because with those the driver needs time to search for the hazard. The autonomous part of the vehicle takes over the driving and stops the car if the driver does not react. Research is needed to identify the most effective warning method in different driving scenarios, and to develop the algorithms for 3-D sound processing and the dynamic modelling of force feedback.

Figure 3. Current experimental setup on our vehicle. Seen in the picture are a Pentium II host PC with LCD touch screen, two CCD cameras and a GSM modem handset. The person is manoeuvring the car with the joystick in his right hand.

6.3 Services & traffic information

Table 6. Subsystems for services & traffic information.

1. Service Information: A database of amenities and services in the operating neighbourhood; in other words, a multimedia shopping guide.
2. Traffic Sign Information/Notification: Traffic signs are identified by means of embedded emitters or image recognition.

The human interface issues here relate to the GUI and are covered together in a later section.
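The four-speaker hazard localisation described in section 6.2 can be sketched with simple amplitude panning. This is an assumption-laden sketch, not our actual algorithm: the speaker layout and the cosine gain law are illustrative choices, and a real 3-D audio system would also exploit phase and spectral cues.

```python
import math

# Hypothetical cabin layout: four speakers at the corners, angles in
# degrees measured clockwise from straight ahead (illustration only).
SPEAKERS = {"front_left": -45.0, "front_right": 45.0,
            "rear_right": 135.0, "rear_left": -135.0}

def speaker_gains(obstacle_bearing_deg):
    """Crude amplitude panning: a speaker's gain falls off with the
    angular distance between that speaker and the obstacle bearing."""
    gains = {}
    for name, angle in SPEAKERS.items():
        # Wrap the angular difference into [-180, 180], then take magnitude.
        diff = abs((obstacle_bearing_deg - angle + 180.0) % 360.0 - 180.0)
        gains[name] = max(0.0, math.cos(math.radians(diff / 2.0)))
    # Normalise so the loudest speaker plays at full volume.
    peak = max(gains.values())
    return {n: g / peak for n, g in gains.items()}

# Child detected directly behind the reversing car (bearing 180 degrees):
g = speaker_gains(180.0)
# The two rear speakers dominate, so the horn appears to come from behind.
```

Feeding the ultrasonic sensor's bearing into such a panning function is what makes the warning horn appear to come from the direction of the hazard.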
6.4 Autonomous operation

When a user or driver activates the autonomous operation, the vehicle takes over control completely. Only the emergency brake button and the GUI can be operated by the user. All interactions are through the GUI, so the related issues are covered in the following section.

6.5 Graphical User Interface (GUI)

The design of a graphical user interface for the interactive console is the first research area covered in the HVI research. From the above description, about 80% of the tasks are performed through the GUI, making it the main component of our HVI. We are currently using Visual C++ for GUI programming on a Pentium II PC running Windows NT.

Figure 4. Close-up of the touch screen. At the top right corner is the GSM modem handset.

Many software developers and researchers have studied the human computer interface issues related to GUIs. We are not going to study these issues in depth; we concentrate on ease of operation while following the GUI design guidelines put forward by human computer interface researchers. The guidelines describe how to optimise the GUI layout, which colours and fonts to use, and so on.

Object-Orientated Programming (OOP) is the basis of the GUI and of the software for all vehicle subsystems. This allows all researchers to share common code and achieve code modularity. We also make sure that all researchers use, for their development work, the same GUI that will finally be used by users, so that they remain aware of the user's perspective.

The command and menu structures follow the Task - Action approach. Pressing a task button on the main menu bar brings up the sub-menu bar (action), giving the user choices to perform his required function. For example, pressing the navigation mode button shows the navigation functions in the sub-menu.

Figure 5. Example of command and menu structure.

Figure 6. A view of the GUI showing the navigation menu and digital map.
(This is not the finalised look.)

At the beginning of the paper, we set out our objective for the human vehicle interface: to cater to all kinds of users, even those with no computer experience. Thus the language used in the GUI must be in non-technical terms, and prompts and questions are structured in a natural conversational style. The speech generation and speech recognition module is very useful in making the interface more human.
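The Task - Action menu structure of section 6.5 can be sketched as a simple two-level mapping. The task and action names below are drawn from Table 1 for illustration; they are not our actual menu contents.

```python
# Two-level Task -> Action menu, following the Task - Action approach:
# pressing a task button on the main menu bar reveals a sub-menu of actions.
# Names are illustrative, taken from the need groups of Table 1.
MENU = {
    "Navigation": ["Route Planning", "Route Guidance", "Route Changing"],
    "Safety":     ["Obstacle Warning", "Collision Prevention"],
    "Autonomous": ["Automatic Parking", "Fully Autonomous Driving"],
}

def sub_menu(task):
    """Return the action buttons shown when a task button is pressed."""
    return MENU.get(task, [])

actions = sub_menu("Navigation")
```

Keeping the structure to two levels is what bounds the number of steps (and the memory load) per function, in line with the usability criterion of section 3.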
7. Conclusion

We have completed defining the Human-Task-Vehicle structure and have partially integrated the vehicle subsystem hardware with the GUI. Many of the issues presented are ongoing research work, and some have yet to be started. When the vehicle prototype is completed, experiments are required to study the effect of each human interface component on the driver's workload and safety. This paper proposes a human-down approach to designing the human vehicle interface, avoiding the complexity of the intelligent system by looking at human needs. It also lists four design criteria, functionality, usability, ergonomics and safety, described in section 3, against which the human machine interface and its components are constantly assessed.

References

[1] D. Sperling, Future Drive: Electric Vehicles and Sustainable Transportation, Island Press, 1995.
[2] B. Peacock, W. Karwowski, Automotive Ergonomics, Taylor & Francis, London, 1993.
[3] J.D. Lee, "A Functional Description of ATIS/CVO Systems to Accommodate Driver Needs and Limits", in Ergonomics and Safety of Intelligent Driver Interfaces, Lawrence Erlbaum Associates, Mahwah, New Jersey, 1997, pp. 63-84.
[4] J. Malec, M. Mori, U. Palmquist, "Driver Support in Intelligent Autonomous Cruise Control", Proc. Intelligent Vehicles '94 Symposium, Oct 1994, pp. 160-164.
[5] P. Pleczan, S. Chalard, "The Human-Machine Interface of Prolab 2 CoPilot", Proc. Intelligent Vehicles '94 Symposium, Oct 1994, pp. 461-466.
[6] R. Whelan, Smart Highways, Smart Cars, Artech House, London, 1995.
[7] Working Conference on Engineering for Human-Computer Interaction, Engineering for Human-Computer Interaction, Chapman & Hall, London, 1996.
[8] D.B. Roe, J.G. Wilpon, Voice Communication Between Humans and Machines, National Academy Press, Washington D.C., 1994.
[9] J.A. Michon, Generic Intelligent Driver Support, Taylor & Francis, London, 1993.
[10] K. Chen, Advanced Technology for Road Transport: IVHS and ATT, Artech House, London, 1994.